Migration Guide from Slick 1.0 to 2.0

Slick 2.0 contains some improvements which are not source compatible with Slick 1.0. When migrating your application from 1.0 to 2.0, you will likely need to make changes in the following areas.

Code Generation

Instead of writing your table descriptions or plain SQL mappers by hand, in 2.0 you can now automatically generate them from your database schema. The code generator is flexible enough to customize its output to fit exactly what you need. More information can be found in the chapter on code generation.
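
For example, a minimal sketch of invoking the 2.0 code generator from a small main object might look like the following (the driver names, connection URL, output folder and package are placeholders you have to adapt):

import scala.slick.model.codegen.SourceCodeGenerator

object GenerateTables extends App {
  // Arguments: Slick driver, JDBC driver class, connection URL, output folder, package
  SourceCodeGenerator.main(Array(
    "scala.slick.driver.H2Driver",
    "org.h2.Driver",
    "jdbc:h2:mem:test",
    "src/main/scala",
    "myapp.tables"
  ))
}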

Table Descriptions

In Slick 1.0 tables were defined by a single val or object (called the table object) and the * projection was limited to a flat tuple of columns that had to be constructed with the special ~ operator:

// --------------------- Slick 1.0 code -- does not compile in 2.0 ---------------------

object Suppliers extends Table[(Int, String, String)]("SUPPLIERS") {
  def id = column[Int]("SUP_ID", O.PrimaryKey)
  def name = column[String]("SUP_NAME")
  def street = column[String]("STREET")
  def * = id ~ name ~ street
}

In Slick 2.0 you need to define your table as a class that takes an extra Tag argument (the table row class), plus an instance of a TableQuery of that class (representing the actual database table). Tuples for the * projection can now use the standard tuple syntax:

class Suppliers(tag: Tag) extends Table[(Int, String, String)](tag, "SUPPLIERS") {
  def id = column[Int]("SUP_ID", O.PrimaryKey)
  def name = column[String]("SUP_NAME")
  def street = column[String]("STREET")
  def * = (id, name, street)
}
val suppliers = TableQuery[Suppliers]

You can import TupleMethods._ to get support for the old ~ syntax. The simple TableQuery[T] syntax is a macro which expands to a proper TableQuery instance that calls the table’s constructor (new TableQuery(new T(_))). In Slick 1.0 it was common practice to place extra static methods associated with a table into that table’s object. You can do the same in 2.0 with a custom TableQuery object:

object suppliers extends TableQuery(new Suppliers(_)) {
  // put extra methods here, e.g.:
  val findByID = this.findBy(_.id)
}

Note that a TableQuery is a Query for the table. The implicit conversion from a table row object to a Query that could be applied in unexpected places is no longer needed or available. All the places where you had to use the raw table object in Slick 1.0 have been changed to use the table query instead, e.g. inserting (see below) or foreign key references.
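
For example, a hypothetical COFFEES table could reference the suppliers table query (rather than the former table object) in its foreign key definition:

class Coffees(tag: Tag) extends Table[(String, Int)](tag, "COFFEES") {
  def name = column[String]("COF_NAME", O.PrimaryKey)
  def supID = column[Int]("SUP_ID")
  def * = (name, supID)
  // The foreign key now points at the table query defined above:
  def supplier = foreignKey("SUP_FK", supID, suppliers)(_.id)
}
val coffees = TableQuery[Coffees]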

The method for creating simple finders has been renamed from createFinderBy to findBy. It is defined as an extension method for TableQuery, so you have to prefix the call with this. (see code snippet above).
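
Using the finder defined above could look like this (a sketch assuming an implicit Session is in scope; the applied compiled query is executed like any other query):

val supplierWithId1 = suppliers.findByID(1).run.headOption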

Mapped Tables

In 1.0 the <> method for bidirectional mappings was overloaded for different arities so you could directly pass a case class’s apply method to it:

// --------------------- Slick 1.0 code -- does not compile in 2.0 ---------------------

def * = id ~ name ~ street <> (Supplier _, Supplier.unapply)

This is no longer supported in 2.0. One of the reasons is that the overloading led to complicated error messages. You now have to use a function with an appropriate tuple type. If you map to a case class you can simply use .tupled on its companion object:

def * = (id, name, street) <> (Supplier.tupled, Supplier.unapply)

Note that .tupled is only available for proper Scala functions. In 1.0 it was sufficient to have a method like apply that could be converted to a function on demand (<> (Supplier.apply _, Supplier.unapply)).

When using a case class, the companion object extends the correct function type by default, but only if you do not define the object yourself. If you do define it, you have to provide the right supertype manually, e.g.:

case class Supplier(id: Int, name: String, street: String)

object Supplier // overriding the default companion object
  extends ((Int, String, String) => Supplier) { // manually extending the correct function type
  //...
}

Alternatively, you can have the Scala compiler first do the lifting to a function and then call .tupled:

def * = (id, name, street) <> ((Supplier.apply _).tupled, Supplier.unapply)

Profile Hierarchy

Slick 1.0 provided two profiles, BasicProfile and ExtendedProfile. These two have been unified in 2.0 as JdbcProfile. Slick now provides more abstract profiles, in particular RelationalProfile which does not have all the features of JdbcProfile but is supported by the new HeapDriver and DistributedDriver. When porting code from Slick 1.0, you generally want to switch to JdbcProfile when abstracting over drivers. In particular, pay attention to the fact that BasicProfile in 2.0 is very different from BasicProfile in 1.0.
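
A minimal sketch of abstracting over concrete drivers via JdbcProfile (the DAL class and its contents are illustrative):

import scala.slick.driver.JdbcProfile

class DAL(val profile: JdbcProfile) {
  import profile.simple._

  class Suppliers(tag: Tag) extends Table[(Int, String, String)](tag, "SUPPLIERS") {
    def id = column[Int]("SUP_ID", O.PrimaryKey)
    def name = column[String]("SUP_NAME")
    def street = column[String]("STREET")
    def * = (id, name, street)
  }
  val suppliers = new TableQuery(tag => new Suppliers(tag))
}

// Instantiated with a concrete driver, e.g.:
// val dal = new DAL(scala.slick.driver.H2Driver)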

Inserting

In Slick 1.0 you used to construct a projection for inserting from the table object:

// --------------------- Slick 1.0 code -- does not compile in 2.0 ---------------------

(Suppliers.name ~ Suppliers.street) insert ("foo", "bar")

Since there is no raw table object any more in 2.0 you have to use a projection from the table query:

suppliers.map(s => (s.name, s.street)) += ("foo", "bar")

Note the use of the new += operator for API compatibility with Scala collections. The old name insert is still available as an alias.
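
Batch inserts follow the same collection-like convention. A sketch, assuming an implicit Session is in scope (insertAll remains available as an alias for ++=):

suppliers.map(s => (s.name, s.street)) ++= Seq(
  ("Acme, Inc.", "99 Market Street"),
  ("Superior Coffee", "1 Party Place")
)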

Slick 2.0 now excludes AutoInc fields by default when inserting data. In 1.0 it was common to have a separate projection for inserts in order to exclude these fields manually:

// --------------------- Slick 1.0 code -- does not compile in 2.0 ---------------------

case class Supplier(id: Int, name: String, street: String)

object Suppliers extends Table[Supplier]("SUPPLIERS") {
  def id = column[Int]("SUP_ID", O.PrimaryKey, O.AutoInc)
  def name = column[String]("SUP_NAME")
  def street = column[String]("STREET")
  // Map a Supplier case class:
  def * = id ~ name ~ street <> (Supplier.tupled, Supplier.unapply)
  // Special mapping without the 'id' field:
  def forInsert = name ~ street <> (
    { case (name, street) => Supplier(-1, name, street) },
    { sup => Some((sup.name, sup.street)) }
  )
}

Suppliers.forInsert.insert(mySupplier)

This is no longer necessary in 2.0. You can simply insert using the default projection and Slick will skip the auto-incrementing id column:

case class Supplier(id: Int, name: String, street: String)

class Suppliers(tag: Tag) extends Table[Supplier](tag, "SUPPLIERS") {
  def id = column[Int]("SUP_ID", O.PrimaryKey, O.AutoInc)
  def name = column[String]("SUP_NAME")
  def street = column[String]("STREET")
  def * = (id, name, street) <> (Supplier.tupled, Supplier.unapply)
}
val suppliers = TableQuery[Suppliers]

suppliers.insert(mySupplier)

If you really want to insert into an AutoInc field, you can use the new methods forceInsert and forceInsertAll.
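
For example, to explicitly write a value into the AutoInc column (a sketch, assuming an implicit Session is in scope and the mapped Suppliers table defined above):

suppliers.forceInsert(Supplier(1000, "Acme, Inc.", "99 Market Street"))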

Pre-compiled Updates

Slick now supports pre-compilation of updates in the same manner as selects; see Compiled Queries.
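
A minimal sketch of a pre-compiled query reused for both a select and an update, based on the suppliers table defined above (assuming an implicit Session is in scope):

def supplierNameByID(id: Column[Int]) =
  suppliers.filter(_.id === id).map(_.name)
val supplierNameByIDCompiled = Compiled(supplierNameByID _)

supplierNameByIDCompiled(1).run              // pre-compiled select
supplierNameByIDCompiled(1).update("Acme")   // pre-compiled update (new in 2.0)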

Database and Session Handling

In Slick 1.0, the common JDBC-based Database and Session types, as well as the Database factory object, could be found in the package scala.slick.session. Since Slick 2.0 is no longer restricted to JDBC-based databases, this package has been replaced by the new DatabaseComponent (a.k.a. backend) hierarchy. If you work at the JdbcProfile abstraction level, you will always use a JdbcBackend from which you can import the types that were previously found in scala.slick.session. Note that importing simple._ from a driver will automatically bring these types into scope.
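
For example, the following imports sketch the two options (H2 is just used as a placeholder driver):

// Import the backend types directly:
import scala.slick.jdbc.JdbcBackend.{Database, Session}

// ...or let the driver's simple._ import bring them into scope:
// import scala.slick.driver.H2Driver.simple._

val myDB = Database.forURL("jdbc:h2:mem:test", driver = "org.h2.Driver")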

Dynamically and Statically Scoped Sessions

Slick 2.0 still supports both thread-local dynamic sessions and statically scoped sessions, but the syntax has changed to make the recommended way of using statically scoped sessions more concise. The old threadLocalSession is now called dynamicSession, and the overloads of the associated session handling methods withSession and withTransaction have been renamed to withDynSession and withDynTransaction respectively. If you used this pattern in Slick 1.0:

// --------------------- Slick 1.0 code -- does not compile in 2.0 ---------------------

import scala.slick.session.Database.threadLocalSession

myDB withSession {
  // use the implicit threadLocalSession here
}

You have to change it for Slick 2.0 to:

import scala.slick.jdbc.JdbcBackend.Database.dynamicSession

myDB withDynSession {
  // use the implicit dynamicSession here
}

On the other hand, due to the overloaded methods, Slick 1.0 required an explicit type annotation when using the statically scoped session:

myDB withSession { implicit session: Session =>
  // use the implicit session here
}

This is no longer necessary in 2.0:

myDB withSession { implicit session =>
  // use the implicit session here
}

Again, the recommended practice is NOT to use dynamic sessions. If you are unsure whether you need them, the answer is most probably no: statically scoped sessions are safer.

Mapped Column Types

Slick 1.0’s MappedTypeMapper has been renamed to MappedColumnType. Its basic form (using MappedColumnType.base) is now available at the RelationalProfile level (with more advanced uses still requiring JdbcProfile). The idiomatic use in Slick 1.0 was:

// --------------------- Slick 1.0 code -- does not compile in 2.0 ---------------------

case class MyID(value: Int)

implicit val myIDTypeMapper =
  MappedTypeMapper.base[MyID, Int](_.value, new MyID(_))

This has changed to:

case class MyID(value: Int)

implicit val myIDColumnType =
  MappedColumnType.base[MyID, Int](_.value, new MyID(_))

If you need to map a simple wrapper type (as shown in this example), you can now do so more easily by extending MappedTo:

case class MyID(value: Int) extends MappedTo[Int]

// No extra implicit required any more
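
A type extending MappedTo can then be used directly as a column type, e.g. in this illustrative table:

class Users(tag: Tag) extends Table[(MyID, String)](tag, "USERS") {
  def id = column[MyID]("ID", O.PrimaryKey)
  def name = column[String]("NAME")
  def * = (id, name)
}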