Scala 3 / Dotty – Facts and Opinions. What do we expect?

What’s Scala 3?

Scala 3 is the new major version of the Scala programming language. It’s the result of years of research, development, and collaboration between the companies and organizations coordinating Scala’s development, together with many individuals and organizations investing their free time to make it happen. This joint effort has brought us the most prominent changes in the language.

The motivations for this new version were: a connection to the essence of Scala (namely the DOT calculus, the reason why Scala 3 started as Dotty); an increase in productivity and predictability, making it easier, more fun and safer to code; an improvement in tooling and binary compatibility; and being even more friendly to newcomers.

In this article, we’ll highlight some of the changes that we think are of great value for day-to-day Scala programmers. We’ll also talk about the migration process and binary compatibility. Finally, we will conclude with our opinion on this new version.

Is Scala 3 a new language?

Yes, it is: there are a lot of language changes, including features that will be phased out; the learning experience is different from previous versions; and the knowledge base will need to be refreshed.

And no, it isn’t. Despite the changes that we’ll be revealing in this article, on top of many others not mentioned (to avoid making the article too long), Scala 3 is still Scala. The core concepts remain in place, and the support for cross-building reinforces that the feature set has a large intersection with the previous version.

Why so many changes at once?

By the end of this article, you may be wondering why so many changes are arriving at once. The answer is: the Scala 3 book, which documents the language itself. By releasing the changes all at once, there’s no need to keep rewriting the book, as more paced-out releases would require. The changes mainly affect the foundations of the language, either simplifying the life of its users or replacing existing features. Therefore, the coming changes were restricted and prioritized according to foundations, simplifications and restrictions. Anything left for possible later versions is more about adding power and expressiveness, especially for expert users, i.e., things that can be postponed and do not drastically affect the language.

Is Scala 3 the new Python 3?

There’s an unfounded belief that Scala 3 is the new Python 3 regarding compatibility with its previous version. However, there are some arguments against that opinion: you don’t need to migrate everything to Scala 3, since there’s binary compatibility with Scala 2.13 (more about that in the migration section); you can migrate with confidence thanks to Scala’s strong type system; and there are far more benefits to migrating from Scala 2 to 3 than there were from Python 2 to 3.

What are the key changes? 

We have chosen some key features that we consider most relevant to day-to-day Scala programmers; we’ll describe them and comment on how they might affect us. We won’t comment on all of the new features because the list is too long, and this article is not a tutorial about every feature. If you want to see a list of all the changes, references and more resources, take a look at

Optional Braces

One of the most revolutionary new features is optional braces and the use of indentation rules, as in Python. This is revolutionary because it visually changes the code and affects readability, enough for an inattentive reader to think it’s a new language. In addition to leading to cleaner and smaller code, optional braces/meaningful whitespace are a good thing because:

  1. We already try to drop braces wherever possible (as for methods/functions that consist of a single expression);
  2. Even with braces, almost every project applies indentation rules very rigorously (checked at code review or enforced by scalafmt), so braces are only an additional token to register while reading the code; they don’t add any information.
trait Printer:
  def print(msg: String): Unit

class ConsolePrinter extends Printer:
  def print(msg: String): Unit = println(msg)

class EmojiPrinter(underlying: Printer) extends Printer:
  def print(msg: String): Unit =
    val emoji = msg match
      case ":)"  => "😊"
      case ":D"  => "😁"
      case ":|"  => "😐"
      case other => other
    underlying.print(emoji)

One of the drawbacks of using indentation rules is discerning where a large indentation region ends. To tackle this problem, Scala 3 offers an end marker.

class EmojiPrinter(underlying: Printer) extends Printer:
  def print(msg: String): Unit =
    if msg != null then
      val emoji = msg match
        case ":)"  => "😊"
        case ":D"  => "😁"
        case ":|"  => "😐"
        case other => other
      underlying.print(emoji)
    end if
end EmojiPrinter

Note that we don’t put parentheses around the condition, and also note the presence of then. Both of these, among other changes, are part of the new control syntax, another change that visually affects the code.

Although end markers make your code longer than braces would, their advantage is that they are labeled: it becomes much easier to find which construct is being closed.

It’s not recommended to put an end marker everywhere. The general advice is to use one when the indentation region is too long. However, the definition of “too long” can vary from person to person. According to the official documentation, an end marker makes sense if:

  • The construct contains blank lines, or
  • The construct has 15 lines or more, or
  • The construct has 4 indentation levels or more

According to the official documentation, you can disable optional braces (via the -no-indent compiler flag). Apart from that, the compiler will warn you about badly indented code. Furthermore, the rule is that adding a pair of optional braces will not change the meaning of a well-indented program.


Enums

Almost every Scala programmer coming from Java misses the enum keyword and the concept it carries. Before Scala 3, you had to write some boilerplate to achieve something similar to an enumeration:

sealed trait Color
case object Red extends Color
case object Green extends Color
case object Blue extends Color

In Scala 3, we can use the built-in enum types.

enum Color:
  case Red, Blue, Green

Over the years, more and more code has been written with type safety in mind. Concepts such as Algebraic Data Types (ADTs) have become common in systems modeling, so it makes sense to offer programmers an easier mechanism to implement these data structures. Indeed, Scala 3 offers an easier way to implement ADTs through enums:

enum Option[+T]:
  case Some(x: T) // extends Option[T]       (omitted)
  case None       // extends Option[Nothing] (omitted)

If you want to make your Scala-defined enum compatible with Java’s enum, you need to explicitly extend java.lang.Enum, which is imported by default:

enum Color extends Enum[Color]:
  case Red, Blue, Green

println(Color.Green.compareTo(Color.Red)) // 2

If you need a more complex enumeration, such as cases with parameters or Generalized ADTs (GADTs), take a look at the enums reference.
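For a taste of what parameterized cases look like, here is a sketch in the spirit of the official enum examples (the Planet type and its constants are illustrative):

```scala
// An enum whose cases carry parameters; each case passes its own
// arguments to the enum's constructor.
enum Planet(mass: Double, radius: Double):
  private final val G = 6.67300e-11

  // A method available on every case, computed from its parameters.
  def surfaceGravity: Double = G * mass / (radius * radius)

  case Mercury extends Planet(3.303e+23, 2.4397e6)
  case Earth   extends Planet(5.976e+24, 6.37814e6)
```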

Implicit redesign 

Despite the criticism, implicit is one of the most distinguished features of Scala. However, it’s also one of the most controversial. The implicit keyword names a mechanism rather than the intent behind each use case, which is to solve concrete problems. Furthermore, the fact that implicit combines easily with many other constructs makes it hard to prevent abuse and misuse. Therefore, Scala 3 redesigns the implicit features, putting each use case in its place. The following are the changes we consider most relevant in the implicit redesign.

Implicit Definitions ➡️ Given Instances

Given instances are how Scala 3 synthesizes context parameters of a certain type, replacing the previous use of implicit for that purpose. In Scala 3, you can optionally name a given instance; if you omit the name, the compiler will infer one.

trait Ord[T]:
  def compare(a: T, b: T): Int

given intOrd: Ord[Int] with // named
  def compare(a: Int, b: Int): Int = a - b

given Ord[String] with // anonymous
  def compare(a: String, b: String): Int = a.compareTo(b)

Implicit parameters ➡️ Using Clauses

Context parameters (or implicit parameters) help you to avoid writing repetitive parameters over a chain of calls. In Scala 3, you make use of implicit parameters through the using keyword. For instance, from the given instances defined above, we can define a min function that works with them.

def min[T](a: T, b: T)(using ord: Ord[T]): T =
  if ord.compare(a, b) < 0 then a else b

min(4, 2)
min(1, 2)(using intOrd)
min("Foo", "Bar")

When you just need to forward the context parameters, you don’t need to name them.

def printMin[T](a: T, b: T)(using Ord[T]): Unit =
  println(min(a, b))
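Relatedly, when the context parameter is anonymous you can still retrieve the given instance in scope with summon, the Scala 3 replacement for Scala 2’s implicitly. The Show type class below is just an illustrative example:

```scala
trait Show[T]:
  def show(x: T): String

given Show[Int] with
  def show(x: Int): String = s"Int($x)"

// The using clause is anonymous; summon fetches the instance when needed.
def describe[T](x: T)(using Show[T]): String =
  summon[Show[T]].show(x)
```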

Implicit Imports ➡️ Given Imports

There are some cases where an improper import of implicits can cause problems. In addition, some tools such as IDEs and documentation generators fail to handle implicit imports. Scala 3 provides a new way of distinguishing given imports from normal ones.

object A:
  class TC
  given tc: TC = ???
  def f(using TC) = ???

object B:
  import A._
  import A.given

In the example above, we had to import the givens separately even after importing with the wildcard (_), because in Scala 3 given imports work differently from normal ones. You can merge both imports into a single one.

object C:
  import A.{given, _}

There are further rules for importing givens by type; take a look at the given imports documentation.
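As a minimal sketch of importing givens by type (the object and class names here are made up):

```scala
object Instances:
  class TC
  class Other
  given tc: TC = TC()
  given other: Other = Other()

object Client:
  import Instances.TC         // bring the type itself into scope
  import Instances.{given TC} // bring in only the givens of type TC
  def resolved: TC = summon[TC]
```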

Implicit Conversion ➡️ Given Conversion

Before Scala 3, if you wanted to define an implicit conversion, you just needed to write an implicit function that takes an instance of the source type and returns an instance of the target type. Now you define a given instance of the scala.Conversion class, which behaves like a function; indeed, instances of scala.Conversion are functions. Take a look at its definition:

abstract class Conversion[-T, +U] extends (T => U):
  def apply(x: T): U

For example, here is a conversion from Int to Double and its shorter version:

given int2double: Conversion[Int, Double] with
  def apply(a: Int): Double = a.toDouble

given Conversion[Int, Double] = _.toDouble

The main reason for given conversions is to have a dedicated mechanism for value conversion without any dubious conflicts with other language constructs. According to the given conversions documentation, all other forms of implicit conversions will be phased out.
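To illustrate a given conversion in action, here is a sketch with made-up names (Distance and render are not from the standard library); with the conversion in scope, an Int is accepted wherever a Distance is expected:

```scala
import scala.language.implicitConversions

// A made-up target type with no built-in conversion from Int.
case class Distance(meters: Double)

given Conversion[Int, Distance] = i => Distance(i.toDouble)

def render(d: Distance): String = s"${d.meters} m"

// The Int literal 100 is converted to Distance(100.0) by the given Conversion.
val rendered = render(100)
```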

Implicit classes ➡️ Extension methods

Extension methods are a more intuitive and less boilerplate-heavy way than implicit classes to add methods to already-defined types.

case class Image(width: Int, height: Int, data: Array[Byte])

extension (img: Image)
  def isSquare: Boolean = img.width == img.height

val image = Image(256, 256, readBytes("image.png"))

println(image.isSquare) // true

Extension methods can have type parameters, both on the extension definition and on its methods. A single extension definition can also contain multiple methods.

extension [T](list: List[T])
  def second: T = list.tail.head
  def heads: (T, T) = (list.head, second)

As you can see, extension methods are much cleaner than implicit classes. Note that, unlike implicit classes, you don’t need to name an extension definition.

Intersection and Union Types

Scala 3 brings new ways to combine types; two of these are Intersection and Union Types.

Intersection Types

An Intersection Type A & B has all the members of both of the types that compose it. It’s formed with the & operator over two types. & is commutative: A & B produces the same type as B & A. Intersections can also be chained, since they are types themselves.

trait Printable[T]:
  def print(x: T): Unit

trait Cleanable:
  def clean(): Unit

trait Flushable:
  def flush(): Unit

def f(x: Printable[String] & Cleanable & Flushable) =
  x.print("working on...")

You might be wondering how the compiler resolves conflicts between shared members. The answer is that it doesn’t need to: Intersection Types represent requirements on the values of the type. They work much like with for type composition. At the point where a value is constructed, one must simply make sure that all the inherited members are correctly defined.

trait A:
  def parent: Option[A]

trait B:
  def parent: Option[B]

class C extends A, B:
  def parent: Option[A & B] = None
  // or
  // def parent: Option[A] & Option[B] = None

def work(x: A & B) =
  val parent: Option[A & B] = x.parent
  // or
  // val parent: Option[A] & Option[B] = x.parent
  println(parent) // None

work(new C)

Note that in class C we need to resolve the conflict: the parent member appears in both A and B, so its type in C is the intersection of its type in A and its type in B, i.e., Option[A] & Option[B], which can be further simplified to Option[A & B] because Option is covariant.

Union Types

A Union Type A | B accepts all instances of type A and all instances of type B. Note that we’re talking about instances, not members as with Intersection Types. Therefore, if we want to access the members of a value, we need to pattern match over it.

def parseFloat(value: String | Int): Float = 
  value match 
    case str: String => str.toFloat
    case int: Int => int.floatValue

parseFloat("3.14") // 3.14
parseFloat(42) // 42.0

Union Types are not inferred automatically. If you want the type of a definition (val, var or def) to be a Union Type, you need to ascribe it explicitly; otherwise the compiler will infer the least upper bound.

val any = if (cond) 42 else "3.14" // Any
val union: String | Int = if (cond) 42 else "3.14" // String | Int
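Union Types also offer a lightweight way to model failure without exceptions or wrapper classes. The following sketch (parseNumber and NotANumber are made-up names) returns either the parsed value or an error description:

```scala
// A made-up error type carried in the union instead of an exception.
case class NotANumber(input: String)

def parseNumber(s: String): Int | NotANumber =
  s.toIntOption match
    case Some(n) => n
    case None    => NotANumber(s)
```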

Honorable mentions

Some other changes are pretty relevant and worth mentioning here:

Trait parameters

Scala 3 allows traits to have parameters. These parameters are evaluated immediately before trait initialization. Trait parameters are a replacement for early initializers, which have been dropped from Scala 3.
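A minimal sketch of a trait parameter (the names are illustrative):

```scala
// The trait takes a parameter, evaluated when the trait is initialized.
trait Greeting(val name: String):
  def msg: String = s"How are you, $name?"

class Hello extends Greeting("Bob"):
  override def msg: String = s"Hello, $name!"
```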

Universal apply methods

Case class constructors have become quite popular, and many developers write case classes just to avoid writing new to create objects. So in Scala 3, you no longer need to write new to construct class instances.
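A quick illustration with a plain (non-case) class:

```scala
// A regular class: no case modifier, no handwritten apply method.
class Person(val name: String)

val p1 = new Person("Ada") // still valid
val p2 = Person("Ada")     // also valid in Scala 3: `new` is inserted for you
```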

Opaque Types

Opaque Types provide type abstraction without any overhead. By marking a type alias with opaque, you hide the fact that it is just an alias to another type outside the scope where it’s defined. To clients of that scope, an opaque type behaves as a proper type, not as an alias: for instance, you can’t rely on the underlying type to create values and assign them to opaque type definitions.
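A small sketch (Measures, Meters, meters and value are made-up names): inside Measures the alias is visible, while outside it Meters is a distinct type.

```scala
object Measures:
  // Inside this object, Meters is known to be a Double.
  opaque type Meters = Double

  def meters(d: Double): Meters = d

  // An explicit accessor, since clients can't treat Meters as a Double.
  extension (m: Meters) def value: Double = m

import Measures.*

val height: Meters = meters(1.75)
// val wrong: Meters = 1.75  // would not compile: the alias is hidden here
val raw: Double = height.value
```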

Export clauses

Export clauses are an easy way to forward members from one type to another without any inheritance. By putting an export clause with a member selection from one class (including traits and objects) in the body of another class (also including traits and objects), you create aliases for those members and make them available through instances of the target class.
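A brief sketch with made-up types:

```scala
class Engine:
  def start(): String = "engine started"
  def stop(): String = "engine stopped"

class Car:
  private val engine = Engine()
  // Create forwarding aliases for Engine's members on Car.
  export engine.{start, stop}

val message = Car().start() // calls engine.start() under the hood
```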

Metaprogramming redesign

In Scala 2, macros remained an experimental feature, and because they depend strongly on Scala 2 compiler internals, it turned out to be impossible to migrate them to Scala 3. Scala 3 metaprogramming therefore introduces new constructs designed to be easier to use. Take a look at the overview of Scala 3 metaprogramming.

Restrictions and dropped features

In order to simplify the language and make it safer, Scala 3 restricts more options and phases out some features. The most remarkable are:

  • Type projections (C#P) are restricted to classes only, i.e., abstract types no longer support them;
  • To use infix notation, methods must be flagged with the infix modifier;
  • Multiversal Equality is an opt-in way to avoid unexpected equalities;
  • Implicit conversions and the given imports discussed above are also kinds of restrictions;
  • The special handling of the DelayedInit trait is no longer supported;
  • Procedure syntax (omitting the return type and = in a method definition) has been dropped;
  • XML Literals are still supported, but will be dropped in the near future;
  • Auto application, where an empty argument list () is implicitly inserted when calling a method without arguments, is no longer supported;
  • Symbol literals are no longer supported.
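As one example from the list above, the infix restriction can be sketched like this (Vec and add are made-up names):

```scala
case class Vec(x: Int, y: Int):
  // Flagged as infix, so `a add b` is accepted alphanumeric infix syntax.
  infix def add(other: Vec): Vec = Vec(x + other.x, y + other.y)

val sum = Vec(1, 2) add Vec(3, 4)
```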

A complete list of the dropped features is available in the official documentation.

Do I have to migrate to Scala 3?

First of all, there’s some very well-done documentation dedicated exclusively to Scala 3 migration. Here, we’ll just share some thoughts you might consider before starting to use Scala 3 on your current projects.

It’s generally recommended to keep your tech stack and dependencies up to date, since updates bring bug fixes and improvements in usability, performance and so on. This is also true for Scala. No matter how small the migration effort is, it sometimes needs to be agreed with stakeholders. However, Scala 3 migration has been designed to be as smooth as possible, meaning you can take advantage of the most prominent evolutions in the language, making it easier, more fun and safer to program in.

What’s the right time to migrate to Scala 3?

We would like to recommend migrating right now, but we know there are variables beyond even a great enthusiast’s control. If you need an argument to convince your boss: Scala 3 retains both backwards and forwards compatibility with Scala 2.13 (except for macros). Not everything is compatible, but for every incompatibility there’s a cross-compiling solution to work around it.

What’s the binary compatibility in Scala 3?

Scala 3 offers backwards binary compatibility with Scala 2, meaning you can still depend on libraries compiled with Scala 2.13. And since the Scala 2.13.4 release in Nov. 2020, Scala 2 can consume libraries written in Scala 3. Thus, between Scala 2.13 and Scala 3 you have binary compatibility in both directions.

Scala 3 supports backwards and forwards compatibility through a mechanism new to the Scala ecosystem: the Scala 3 compiler outputs TASTy files and can read the Pickle format from Scala 2.x. Scala 2.13.4, in turn, ships with a TASTy reader, so it supports all of the traditional features as well as some new ones, such as Enums, Intersection Types and others. See the compatibility guide for more details.
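In practice, a Scala 3 build can consume a Scala 2.13 artifact through sbt’s CrossVersion.for3Use2_13 mechanism; the library and version below are just an example:

```scala
// In build.sbt: depend on the Scala 2.13 artifact of a library from a Scala 3 project.
libraryDependencies +=
  ("org.typelevel" %% "cats-core" % "2.6.1").cross(CrossVersion.for3Use2_13)
```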


We’re very excited about this new version of our core language. It comes with so many cool changes, delivered in a well-conducted way and with such a careful migration process, that it’s impossible not to fit it into the next project or to start refactoring existing ones.

Broadly, Scala 3 seems to be a great refinement of Scala 2. We have learned to live without many of these things: for some, there were libraries solving the limitations to some extent; for others, it was either impossible or beyond our comfort zone. So once Scala 3 is widely adopted, we expect to see more well-typed code being written, mostly because it’s much simpler to do.


Here are some references so you can start learning Scala 3 today:




Emanuel Oliveira

I've been working with Information Technology for 10 years. I'm always looking to improve my skills and doing my best to boost teamwork. My expertise is software development, but my real motivation is to solve people's problems.
