With the release of Python 3.2, the concurrent.futures module was introduced. In this video, you'll see how to use this module.
- [Instructor] In the previous section, we saw process-based parallelism. Welcome to the fourth section, asynchronous programming. Alongside the sequential and parallel execution models, there is a third model, the asynchronous model, which is of fundamental importance to us along with the concept of event programming. In this section, we'll take a look at asynchronous programming through the following videos. One, how to use the concurrent.futures Python module. Two, event loop management with Asyncio. Three, handling coroutines with Asyncio.
Four, task manipulation with Asyncio. Five, dealing with Asyncio and Futures. Let's begin with the first video of this section, titled using the concurrent.futures Python module. In this video, we will manage concurrent programming tasks, such as process and thread pooling, with the help of the concurrent.futures module. Introduced with the release of Python 3.2, the concurrent.futures module allows us to manage concurrent programming tasks such as process and thread pooling, nondeterministic execution flows, and process and thread synchronization.
This package is built around the following classes and methods. One, concurrent.futures.Executor. This is an abstract class that provides methods to execute calls asynchronously. Two, submit(function, argument). This schedules the execution of the callable function on the arguments. Three, map(function, argument). This executes the function on the arguments in an asynchronous mode. Four, shutdown(wait=True). This signals the executor to free any resources. Five, concurrent.futures.Future.
This encapsulates the asynchronous execution of a callable function. Future objects are instantiated by submitting tasks, functions with optional parameters, to executors. Executors are abstractions that are accessed through their subclasses, ThreadPoolExecutor or ProcessPoolExecutor. In fact, instantiation of threads and processes is a resource-demanding task, so it's better to pool these resources and use them as repeatable launchers, or executors; hence the executor concept for parallel or concurrent tasks. Let's look at how to deal with process and thread pools.
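As a minimal sketch of this API (using a small squaring function as a stand-in task, not code from the video), submit() returns a Future whose result() blocks until the call completes, while map() yields results in input order:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

with ThreadPoolExecutor(max_workers=2) as executor:
    # submit() schedules the callable and immediately returns a Future.
    future = executor.submit(square, 7)
    print(future.result())                        # blocks until ready: 49
    # map() runs the function over the arguments, yielding results in order.
    print(list(executor.map(square, [1, 2, 3])))  # [1, 4, 9]
```

Using the executor as a context manager calls shutdown(wait=True) automatically when the block exits.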
A thread or process pool, also called pooling, indicates a software manager that is used to optimize and simplify the use of threads and/or processes within a program. Through pooling, you submit the task or tasks that are to be executed to the pooler. The pool is equipped with an internal queue of pending tasks and a number of threads or processes that execute them. A recurring concept in pooling is reuse: a thread or process is used several times for different tasks during its lifecycle. This decreases the overhead of creation and increases the performance of the program that takes advantage of pooling.
Reuse is not a rule, but it is one of the main reasons that lead a programmer to use pooling in his or her application. The concurrent.futures module provides two subclasses of the Executor class, which manipulate a pool of threads and a pool of processes, respectively, asynchronously. The two subclasses are as shown on the screen. One, concurrent.futures.ThreadPoolExecutor(max_workers). Two, concurrent.futures.ProcessPoolExecutor(max_workers). The max_workers parameter identifies the maximum number of workers that execute calls asynchronously.
This example shows you the functionality of process and thread pooling. The task to be performed is this: we have a list of numbers from one to 10, number_list. For each element of the list, a count up to 100 million is made, just to waste time, and then the resulting number is multiplied by that element of the list. By doing this, the cases listed on the screen are evaluated. One, sequential execution. Two, a thread pool with five workers. Consider the code shown. We build a list of numbers stored in number_list, and for each element in the list, we run the counting procedure up to 100 million iterations.
Then we multiply the resulting value by the list element. In the main program, we first execute the task in a sequential mode. Then, in parallel mode, we use the concurrent.futures module's pooling capability for a thread pool. The ThreadPoolExecutor executes the given tasks using its internally pooled threads; it manages five threads working in its pool. Each thread takes a job out of the pool and executes it. When the job is done, it takes the next job to be processed from the thread pool. When all the jobs are processed, the execution time is printed.
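The exact code is shown on screen rather than in this transcript, so the following is a reconstruction of the task just described, under the assumption that the function and list are named count and number_list as in the narration. The iteration count is reduced from the video's 100 million so the sketch finishes quickly:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# The video counts to 100,000,000; reduced here for a quick run.
ITERATIONS = 1_000_000

def count(number):
    total = 0
    for _ in range(ITERATIONS):  # busy loop, just to waste time
        total += 1
    return total * number        # multiply the count by the list element

number_list = list(range(1, 11))

# Case one: sequential execution.
start = time.time()
for item in number_list:
    count(item)
print("Sequential execution in", time.time() - start, "seconds")

# Case two: a thread pool with five workers.
start = time.time()
with ThreadPoolExecutor(max_workers=5) as executor:
    for item in number_list:
        executor.submit(count, item)
print("Thread pool execution in", time.time() - start, "seconds")
```

Because this task is pure CPU-bound Python code, the thread pool version may not be faster than the sequential one: the global interpreter lock allows only one thread to execute Python bytecode at a time.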
For the process pooling implemented by the ProcessPoolExecutor class, we have this code. Like ThreadPoolExecutor, the ProcessPoolExecutor class is an Executor subclass that uses a pool of processes to execute calls asynchronously. However, unlike ThreadPoolExecutor, the ProcessPoolExecutor uses the multiprocessing module, which allows us to sidestep the global interpreter lock and obtain a shorter execution time. Pooling is used in almost all server applications where there is a need to handle multiple simultaneous requests from any number of clients.
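A process-pool version of the same task can be sketched as follows; again the names and the reduced iteration count are assumptions, since the on-screen code is not in the transcript:

```python
import time
from concurrent.futures import ProcessPoolExecutor

ITERATIONS = 1_000_000  # the video uses 100,000,000; reduced for a quick run

def count(number):
    total = 0
    for _ in range(ITERATIONS):
        total += 1
    return total * number

# The __main__ guard is required because worker processes re-import this module.
if __name__ == "__main__":
    number_list = list(range(1, 11))
    start = time.time()
    # Each call runs in a separate worker process, so the CPU-bound loops
    # are not serialized by the global interpreter lock.
    with ProcessPoolExecutor(max_workers=5) as executor:
        results = list(executor.map(count, number_list))
    print("Process pool execution in", time.time() - start, "seconds")
```

On a multi-core machine this version can genuinely run the counts in parallel, at the cost of process startup and of pickling arguments and results between processes.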
Many applications, however, require that each task be performed instantly, or that you have more control over the thread that executes it. In this case, pooling is not the best choice. Let's run the code and see the output now. Open the command prompt and navigate to your working directory, then run the Python file. After running the code, we have these results with the execution times. Great! So in this video, we saw the use of the concurrent.futures Python module. In our next video, we'll take a look at event loop management with Asyncio.
Note: This course was created by Packt Publishing. We are pleased to host this training in our library.