An explanation of the JavaScript Event Loop

In JavaScript, the event loop is the mechanism that allows asynchronous code to run without blocking the main thread. It continuously watches the call stack and the task queues, and whenever the call stack is empty it picks up the next queued task and runs it.
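
Conceptually, the scheduling rule can be pictured with a toy simulation like the one below. This is only an illustrative sketch: real engines implement the loop natively, and the two arrays here are stand-ins for the real task and microtask queues, not actual APIs.

// Toy simulation of the scheduling rule -- NOT how a real engine works.
// The two arrays below are illustrative placeholders for the real queues.
const taskQueue = [() => console.log("task A"), () => console.log("task B")];
const microtaskQueue = [() => console.log("microtask 1"), () => console.log("microtask 2")];

while (taskQueue.length > 0) {
  // Run one (macro)task to completion...
  const task = taskQueue.shift();
  task();

  // ...then drain the entire microtask queue before the next task.
  while (microtaskQueue.length > 0) {
    microtaskQueue.shift()();
  }
}
// Prints: task A, microtask 1, microtask 2, task B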

What is a call stack?

The call stack is a data structure used by the JavaScript runtime to keep track of the sequence of function calls in the current execution context. Every time a function is called, a new frame is pushed onto the top of the stack, and when a function completes its execution, its frame is popped off the top of the stack.

When a JavaScript program starts executing, a global execution context is created, and its corresponding frame is pushed onto the call stack. As functions are called, new frames are pushed onto the stack, and when a function returns, its frame is popped off the stack.

For example, consider the following code:

function greet(name) {
  console.log(`Hello, ${name}!`);
}

function sayHello() {
  console.log("Calling greet function...");
  greet("John");
}

console.log("Starting program...");
sayHello();
console.log("Program completed.");

Here's what happens when this code is executed:

  1. The global execution context is created, and its corresponding frame is pushed onto the call stack.
  2. The console.log statement is executed, printing "Starting program..." to the console.
  3. The sayHello function is called, and its corresponding frame is pushed onto the stack.
  4. The console.log statement inside sayHello is executed, printing "Calling greet function..." to the console.
  5. The greet function is called from within sayHello, and its corresponding frame is pushed onto the stack.
  6. The console.log statement inside greet is executed, printing "Hello, John!" to the console.
  7. The greet function completes its execution and its frame is popped off the stack.
  8. Execution returns to sayHello, and its frame is still on top of the stack.
  9. The sayHello function completes its execution and its frame is popped off the stack.
  10. Execution returns to the global context, and its frame is now on top of the stack.
  11. The console.log statement is executed, printing "Program completed." to the console.
  12. The global context completes its execution and its frame is popped off the stack.
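
One way to see the call stack from inside a running program is to inspect the stack property of a freshly created Error, as in the sketch below. The exact format of the trace is engine-specific and not standardized, so treat the output as illustrative only.

function greet(name) {
  // Creating an Error captures a snapshot of the current call stack.
  // The formatting varies by engine, but it should show frames for
  // greet, sayHello, and the top-level (global) code.
  console.log(new Error("stack snapshot").stack);
  console.log(`Hello, ${name}!`);
}

function sayHello() {
  greet("John");
}

sayHello();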

What is the task queue?

The task queue is a data structure that holds tasks waiting to be executed. A task is a unit of work such as an event handler, a timer callback, or the execution of a script. When a task reaches the front of the queue, the event loop picks it up and the JavaScript engine runs it to completion.
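
For example, in a browser a click handler does not interrupt code that is already running; it is queued as a task when the click event fires and runs once the call stack is empty. A minimal sketch, assuming a page that contains a <button id="btn"> element (the id is just an illustrative choice):

// Assumes a browser page containing <button id="btn">Click me</button>.
const button = document.getElementById("btn");

button.addEventListener("click", () => {
  // Queued as a task when the click event fires; run by the event loop
  // once the call stack is empty.
  console.log("click handled");
});

console.log("listener registered"); // runs immediately, synchronously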

Tasks fall into two categories, each with its own queue: macrotasks and microtasks. Macrotasks are queued by the host environment or the engine, such as user input events, network request callbacks, and timer callbacks. Microtasks are queued by JavaScript code itself, such as Promise callbacks, MutationObserver callbacks, and callbacks passed to queueMicrotask.
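
The difference is easy to observe by queuing one of each; in this sketch the labels in the log messages are just descriptive strings:

// setTimeout queues a macrotask; queueMicrotask queues a microtask.
setTimeout(() => console.log("macrotask: setTimeout"), 0);
queueMicrotask(() => console.log("microtask: queueMicrotask"));

console.log("synchronous code");

// Prints:
// synchronous code
// microtask: queueMicrotask
// macrotask: setTimeout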

Microtasks take priority over macrotasks. After the currently running code finishes, the event loop drains the entire microtask queue before it picks up the next macrotask. If there are multiple microtasks in the queue, they are executed one after another, before any pending macrotask is processed.
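
This draining even applies to microtasks that are scheduled from inside other microtasks, as the following sketch shows:

setTimeout(() => console.log("macrotask"), 0);

Promise.resolve()
  .then(() => {
    console.log("microtask 1");
    // The next .then callback is a new microtask scheduled from inside
    // a microtask; it still runs before the pending setTimeout callback.
  })
  .then(() => console.log("microtask 2"));

// Prints: microtask 1, microtask 2, macrotask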

Here's an example of how the event loop works in JavaScript:

console.log('start');

setTimeout(() => {
  console.log('setTimeout 1');
}, 0);

Promise.resolve().then(() => {
  console.log('Promise 1');
});

setTimeout(() => {
  console.log('setTimeout 2');
}, 0);

Promise.resolve().then(() => {
  console.log('Promise 2');
});

console.log('end');

In this example, setTimeout and Promise callbacks are used to schedule asynchronous work. Here's how the event loop processes this code:

  1. The first console.log statement is executed synchronously, printing "start" to the console.
  2. The first setTimeout callback is added to the task queue with a delay of 0 milliseconds.
  3. The first Promise is resolved, and its callback is added to the microtask queue.
  4. The second setTimeout callback is added to the task queue with a delay of 0 milliseconds.
  5. The second Promise is resolved, and its callback is added to the microtask queue.
  6. The last console.log statement is executed synchronously, printing "end" to the console.
  7. The event loop begins processing the microtask queue, starting with the first Promise callback. It prints "Promise 1" to the console.
  8. The event loop continues processing the microtask queue, running the second Promise callback. It prints "Promise 2" to the console.
  9. The event loop begins processing the task queue, starting with the first setTimeout callback. It prints "setTimeout 1" to the console.
  10. The event loop continues processing the task queue, running the second setTimeout callback. It prints "setTimeout 2" to the console.
Note that both queues are processed in first-in, first-out order, so "setTimeout 1" runs before "setTimeout 2" and "Promise 1" runs before "Promise 2". The key point is that every queued microtask runs before either setTimeout callback, even though the first setTimeout was registered before either Promise callback and had a delay of 0 milliseconds.

This example demonstrates how the event loop allows asynchronous code to run without blocking the main thread, and how it schedules tasks in the appropriate order.
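
The flip side is that the event loop can only pick up queued callbacks when the call stack is empty, so long-running synchronous code delays every timer and promise callback. Here is a small sketch of that effect; the 500 millisecond busy-wait is an arbitrary illustrative value:

setTimeout(() => console.log("timer fired"), 0);
Promise.resolve().then(() => console.log("promise resolved"));

// Busy-wait for roughly 500 ms of synchronous work (arbitrary value).
// While this loop runs, the call stack is never empty, so neither
// callback above can run, despite the 0 ms timer delay.
const start = Date.now();
while (Date.now() - start < 500) {
  // blocking the main thread
}

console.log("synchronous work done");
// Prints: synchronous work done, promise resolved, timer fired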
