May 12, 2022

Understanding Project Loom Concurrency Models


Project Loom, which is under active development and has recently been targeted for JDK 19 as a preview feature, has the goal of making it easier to write, debug, and maintain concurrent Java applications. Learn more about Project Loom’s concurrency model and virtual threads.


What Are Threads in Java? What Are Virtual Threads?

In Java, as in computing in general, a thread is an independent flow of execution within a program. With multiple threads, you can have several things happening at the same time.

Traditional threads in Java (platform threads) are heavyweight and bound one-to-one to an OS thread, making it the OS's job to schedule them; how much execution time a thread gets therefore depends on the OS scheduler and the available CPUs. Virtual threads, also referred to as green threads or user threads, move the responsibility for scheduling from the OS to the application, in this case the JVM. This lets the JVM take advantage of its knowledge of what is happening inside the virtual threads when deciding which thread to schedule next.
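As a minimal sketch of the difference (assuming a JDK 19 build with preview features enabled), the same task can be run on a platform thread, which the OS schedules, or on a virtual thread, which the JVM schedules:

public class ThreadKinds {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> System.out.println("running in " + Thread.currentThread());

        // Traditional platform thread: backed one-to-one by an OS thread,
        // so the operating system decides when it runs.
        Thread platform = new Thread(task);
        platform.start();

        // Virtual thread: scheduled by the JVM and only mounted on an OS
        // (carrier) thread while it is actually executing.
        Thread virtual = Thread.ofVirtual().start(task);

        platform.join();
        virtual.join();
    }
}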

📚 Learn more about virtual threads in JDK 21. 


Benefits of Virtual Threads

Today Java is heavily used in backend web applications, serving concurrent requests from users and other applications. With traditional blocking I/O, a thread blocks while waiting for data to be read or written. Because threads are so heavyweight, there is a limit to how many an application can have, and thus a limit to how many concurrent connections it can handle. This constraint means that threads do not scale very well.

The answer to that has long been asynchronous, non-blocking I/O. With asynchronous I/O, a single thread can handle many concurrent connections, but at the cost of increased code complexity: a single execution flow handling a single connection is much simpler to understand and reason about. While many frameworks today, in particular reactive frameworks, hide a lot of this complexity from the developer, asynchronous I/O still requires a different mindset.
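To illustrate the difference in code shape, here is a sketch in which fetchUser and fetchOrders are hypothetical stand-ins for blocking I/O calls: the blocking version reads top to bottom, while the asynchronous version composes CompletableFuture stages.

import java.util.concurrent.CompletableFuture;

public class BlockingVsAsync {

    // Blocking style: one thread per request, trivially easy to follow.
    static String loadProfileBlocking(int userId) {
        String user = fetchUser(userId);     // the thread blocks here...
        String orders = fetchOrders(user);   // ...and again here
        return user + " / " + orders;
    }

    // The same flow written as a non-blocking pipeline of stages.
    static CompletableFuture<String> loadProfileAsync(int userId) {
        return CompletableFuture
                .supplyAsync(() -> fetchUser(userId))
                .thenCompose(user -> CompletableFuture
                        .supplyAsync(() -> fetchOrders(user))
                        .thenApply(orders -> user + " / " + orders));
    }

    // Hypothetical stand-ins for real I/O, included only so the sketch runs.
    static String fetchUser(int id)     { sleep(100); return "user-" + id; }
    static String fetchOrders(String u) { sleep(100); return "orders-of-" + u; }

    static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }

    public static void main(String[] args) {
        System.out.println(loadProfileBlocking(1));
        System.out.println(loadProfileAsync(2).join());
    }
}

Even in this tiny example the asynchronous variant fragments the logic into callbacks; in real request handlers with error handling and branching, the gap grows quickly.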


Why Use Project Loom?

This is where Project Loom comes in. Virtual threads let developers keep writing traditional blocking I/O code, because one of the big perks of virtual threads is that blocking a virtual thread does not block the underlying OS thread. This removes the scalability problem of blocking I/O without the added code complexity of asynchronous I/O, since we are back to one thread overseeing one connection.
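As a sketch of what that looks like in practice (again assuming a JDK 19 early access build with preview features enabled), a server can hand each accepted connection to its own virtual thread, with the handler written in plain blocking style:

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class BlockingEchoServer {
    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(8080)) {
            while (true) {
                Socket socket = server.accept();            // blocks until a client connects
                // One virtual thread per connection: the handler is ordinary
                // blocking code, but a blocked virtual thread does not tie up
                // the OS thread it runs on.
                Thread.ofVirtual().start(() -> handle(socket));
            }
        }
    }

    static void handle(Socket socket) {
        try (socket;
             var in = socket.getInputStream();
             var out = socket.getOutputStream()) {
            in.transferTo(out);                             // blocking echo of whatever the client sends
        } catch (IOException e) {
            // client disconnected; nothing more to do for this connection
        }
    }
}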



Scale Java Threading With Project Loom

Developing with virtual threads is nearly identical to developing with traditional threads. The enhancement proposal adds several API methods to support this.

The Thread class itself gains a few methods, such as Thread.ofVirtual(), which returns a builder that can start virtual threads or create a ThreadFactory. Likewise, Executors.newVirtualThreadPerTaskExecutor() has been added to create an ExecutorService that runs each submitted task in a new virtual thread. In many cases, switching to these methods to create the ExecutorService or the ThreadFactory is enough to take advantage of virtual threads!
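A rough sketch of those entry points under the JDK 19 preview API might look like this:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;
import java.util.stream.IntStream;

public class VirtualThreadApis {
    public static void main(String[] args) throws InterruptedException {
        // Builder: name and start a single virtual thread.
        Thread handler = Thread.ofVirtual().name("request-handler").start(
                () -> System.out.println("handled by " + Thread.currentThread()));
        handler.join();

        // The same builder can hand out a ThreadFactory, which can be passed
        // to existing code that already accepts one.
        ThreadFactory factory = Thread.ofVirtual().factory();
        Thread fromFactory = factory.newThread(() -> System.out.println("factory-made virtual thread"));
        fromFactory.start();
        fromFactory.join();

        // ExecutorService that starts a new virtual thread for every task.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 1_000).forEach(i ->
                    executor.submit(() -> {
                        Thread.sleep(1_000);   // blocking here parks only the virtual thread
                        return i;
                    }));
        }   // close() waits for submitted tasks to finish
    }
}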

Like any other preview feature, virtual threads require the --enable-preview flag when compiling and running.
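Assuming a single source file Main.java (the file name is just for illustration), compiling and running with preview features enabled on JDK 19 might look like this:

javac --enable-preview --release 19 Main.java
java --enable-preview Main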



Project Loom + Future of Java

Although asynchronous I/O is hard, many teams have adopted it successfully, yet the shift in mindset it requires has kept it from becoming the norm. Netflix, for example, is well known for using reactive programming and for being a major contributor to the reactive frameworks out there, but even they have recently scaled back their use of it.

While virtual threads won’t magically make everything run faster, benchmarks run against the current early access builds indicate that you can obtain scalability, throughput, and performance similar to asynchronous I/O.

In the current implementation, the virtual thread scheduler is a work-stealing ForkJoinPool. For most applications this is probably all that is needed, but there have been requests for the ability to supply your own scheduler instead. While this is not supported in the current preview, we might see it in a future enhancement proposal.
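You can observe the default scheduler at work by printing a virtual thread's string representation, which in the current builds includes the ForkJoinPool carrier thread it happens to be mounted on:

public class SchedulerPeek {
    public static void main(String[] args) throws InterruptedException {
        Thread t = Thread.ofVirtual().start(() ->
                // Typically prints something like
                // VirtualThread[#21]/runnable@ForkJoinPool-1-worker-1
                System.out.println(Thread.currentThread()));
        t.join();
    }
}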

Virtual threads are currently targeted for inclusion in JDK 19 as a preview feature. If everything goes well, virtual threads should be ready to exit their preview state by the time JDK 21 comes out, which is the next likely LTS version.

For early adopters, virtual threads are already included in the latest early access builds of JDK 19. So, if you’re so inclined, go try them out and provide feedback on your experience to the OpenJDK developers, so they can adapt and improve the implementation for future versions.

Prepare your teams for the future by ensuring they are faster today with JRebel.

See how much faster your team could code during your 14-day free trial. 

Try JRebel Free
