Scala @ Empathica

Technology Change: .NET to Scala


In our careers as software developers we’re frequently pigeonholed into one particular language and framework. Oftentimes these specializations go even further, and before you know it you’ll find yourself being the front-end guy, the data-layer girl, or, in my case, the person who gets saddled with anything that has to do with deployment and devops work in general :). This is a story about how Empathica as an organization transformed itself from a Microsoft .NET shop into a Scala one. It didn’t happen overnight, and this is definitely not a how-to guide for achieving similar change somewhere else. If you’re looking for a side-by-side comparison of Scala and C#/.NET, you won’t find it here. This is just an account of how the change itself unfolded. Since a transformation of this kind is thus far unique in my career, I wanted to share it with the community and hopefully provide some wisdom for other teams deciding to take a similar plunge into Scala or some other open source technology.

The stage

Empathica was almost exclusively a Microsoft shop as late as the end of summer 2012. There was some legacy infrastructure in Java, but it had remained largely untouched since the company formed in the early 2000s. For all intents and purposes, Empathica embraced Microsoft technologies throughout the entire stack: everything was written in C#, all servers ran Windows Server, and the OLTP and OLAP databases were SQL Server and Analysis Services. It wasn’t until Simon Palmer joined the company as CTO that anyone in the development department gave the technology stack a hard look. After all, once you’re fully invested in Microsoft it can be quite difficult to break away.

As an aside, I would like to mention that in recent years Microsoft has made great strides in supporting the OSS community. They’ve open sourced large parts of the CLR, open sourced big projects like Entity Framework and ASP.NET MVC, and have even provided a portable class library license that allows code to run on platforms other than Windows. There are lots of new Microsoft OSS initiatives, as indicated by their OpenTech division. However, I think it will still be some time (if ever) before we see widespread support and adoption of Microsoft works in the larger open source community.

When Simon came onboard it represented a sea change for Empathica’s development department. At a higher level, it changed the company from being services-oriented to product-based. The development process was switched almost immediately to XP. Consultants were brought in from Berteig Consulting to help with the transformation. Internal political red tape was cut, and development teams were empowered to do their own project planning, track their own progress, and self-organize with as little direct supervision from the management team as possible. Members of the development team were encouraged to challenge their existing beliefs about how to build software.

EDIT 25/10/13: I want to elaborate a bit on when our new CTO came onboard. At the time, Empathica’s products were stagnant, we were beginning to lose big clients, and something needed to be done. Simon was brought on to give the company more effective technical direction, because up until then a few people had been wearing CTO-type hats, but there was no one voice in particular. Simon brought a wealth of software development leadership to the organization. He’s incredibly tech savvy, a serial entrepreneur, and has a great deal of experience applying Agile practices and principles to large development teams. One of the first changes Simon made was to bring Agile to Empathica. I believe that change is what set the stage for developers to try new technologies like Scala.

As usually happens with change, it didn’t resonate with everybody. A lot of the existing development staff weren’t too keen on what was happening to the department.  There was a slow exodus of technical staff who left to seek new opportunities. This gave the management team the opportunity to hire a new team of smart and passionate developers, product managers, and user experience people that were onboard with the new direction. It was around this time that I was fortunate to be offered a developer position at Empathica in July 2011.

The new team was challenged to reimagine our space (Customer Experience Management) by developing a new line of products.  XP gave us lots of flexibility to operate as we saw fit.  Teams were flat; the most junior developer had as much of a voice during an iteration planning session as a veteran of the craft.  I had more input into the development process and product direction at Empathica than I did as a Team Lead and Development Manager in my past positions.  I credit this open and collaborative culture as the main reason that we were able to try new technologies like Scala in the first place.

Why Scala?

Now that the stage is set, I’ll elaborate on what exactly got the ball rolling with Scala. Before every development project we have a “technology discussion”. We’re always open to trying new things, but for the most part new projects chose a base technology stack of C#, ASP.NET MVC, etc. We kicked off a number of new projects around August 2012. One of these turned into a Scala project that I’ll affectionately refer to as BeetleJuice for the remainder of this post. The kickoff for BeetleJuice started off as a “Hey, wouldn’t it be cool if we had something like this to compete with competitor X?” In fact, BeetleJuice’s initial requirements were so scarce we didn’t even know if it would be something we would release to our clients. It was an experiment; a grandiose software spike into an area of the business we had never fully explored before. This gave us an enormous amount of liberty to try a new language and framework because:

  1. The project wasn’t overly complex
  2. As with most software spikes there was an implicit permission to fail and to try new things
  3. At the end of the day it didn’t really matter what language or framework we chose for our application server because of its simple nature

It was then that I challenged the rest of the team to honestly consider other technologies. “This is our chance to try something new. Let’s take it,” I said. Personally, I had some experience with Python and Django, but I was also interested in Ruby and the Rails framework, so I proposed both. Another colleague tossed Node.js into the ring. And finally, one of our newest hires at the time, Steven Skelton, proposed Scala.

I didn’t know much about Scala. I gave myself the weekend to read up on it and quickly came to appreciate its elegance and concise syntax. I recognized that since it is JVM-based, we would implicitly gain excellent platform interoperability and native OSS library support. Scala is statically typed, which seemed refreshing given the mainstream’s infatuation with dynamic languages like Ruby, Python, and JavaScript over the past 10 years. It also has a reputation as a great language for writing highly concurrent, performant systems. We were sold, and in the next iteration the BeetleJuice team decided to give Scala a try.
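To show the kind of conciseness that won me over that weekend, here is a small, self-contained sketch of my own (the `Survey` type is hypothetical, not anything from our codebase):

```scala
// A hedged, illustrative sketch -- not actual Empathica code.
// One line gives an immutable value type with equality and a readable toString.
case class Survey(id: Int, score: Int)

val surveys = List(Survey(1, 9), Survey(2, 4), Survey(3, 8))

// Types are inferred but still checked at compile time.
val promoters = surveys.filter(_.score >= 8).map(_.id)

println(promoters) // List(1, 3)
```

The equivalent C# of the day would need a class with properties, equality overrides, and LINQ boilerplate to say the same thing.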

The experiment

We created a Play! 2 web project. From a framework perspective, it was an easy transition from ASP.NET MVC. We started developing BeetleJuice, and it wasn’t long before we knew we could deliver a functioning Scala web application with minimal impact on productivity.
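For a sense of why the transition felt easy, a minimal Play! 2 controller maps almost one-to-one onto ASP.NET MVC concepts. This is a generic sketch against the Play 2.x Scala API of that era, not actual BeetleJuice code:

```scala
package controllers

// Play 2.x (Scala): a controller is an object, an action is a function.
// In ASP.NET MVC terms: Controller -> Controller, ActionResult -> Result.
import play.api.mvc._

object Application extends Controller {

  // Wired up in conf/routes with a line like:
  //   GET  /   controllers.Application.index
  def index = Action {
    Ok("Hello from Play!")
  }
}
```

Routing lives in a `conf/routes` file rather than in attributes, but the controller/action/view shape is immediately familiar.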

Once we began demonstrating our progress at weekly iteration demos, it became clear that BeetleJuice was evolving out of its experimentation phase and into something with real business value. When this became apparent, the management team took a closer look at our technology choice and started asking tough questions like “How do we deploy this in a production environment?”, “Will this framework and language be around next year?”, and inevitably, “Am I going to be able to find people to support this technology in 1, 3, and 5 years?” I started investigating the maturity of Play! and, to my delight, found that it had a great list of endorsements from tech companies around the world (LinkedIn, Klout, and theguardian, to name a few). The Play! 2 framework was still new when we first began using it, but version 1 had been available to the community for some time. From Play! we soon learned more about the Typesafe stack and services. We concluded that, given Typesafe’s inclusion of Play! in its stack and the successful history of the project itself, we had picked a winner.

It was also becoming clear that Scala was being heavily adopted in the industry, based on data from job sites, Typesafe’s impressive client list, the number of native Scala OSS projects available, and more anecdotal information like the number of recruiters contacting me on LinkedIn! One job-trends chart shows that in early 2013, for the first time, demand from employers for developers with Scala experience outpaced demand for those with Clojure experience or any other JVM language besides Java itself.

EDIT 26/10/13: I felt it important to only compare 2nd generation JVM languages.  IMO it’s not fair to include the Java language itself because it dwarfs adoption of 2nd gen languages due to its omnipresence in the industry for over 20 years.


Simon and some of our other technical management began dabbling in Scala themselves. I remember Simon telling me that the power of the language is obvious, but that it would be easy to misuse it and get yourself into trouble. Why, of course. With great power comes great responsibility. This reminded me of that old programmer humour post “How to Shoot Yourself in the Foot in Any Programming Language” by Mike Walker.

Scala: you stare at your foot for 3 days without any sleep, you then figrue [sic] out how to shoot yourself in the foot with one line of code… recursively.

Granted, this addition came from a commenter about 7 years after the original post, but I think the author described Scala perfectly within the context of the meme!

Legacy integration

Integration with the rest of our Microsoft platform and products was a concern from the start. It was time to start investing in backend APIs, not only to integrate with our .NET projects, but also because of a separate long-term goal to abstract away our persistence layer so we could change it in the future.

Our API initiative began by serving only the data needs of BeetleJuice. We started a spike on different API technologies. The first was a Tomcat-based web service that abstracted queries behind an ANTLR domain-specific language. You could craft HTTP requests containing projections, filters, and joins (like SQL), and it would return a JSON representation of the data. It worked, but it was quite complicated to use, and the queries looked a whole lot like our data access queries themselves.

Our second API implementation was written by Steven and based on Apache Thrift and Twitter’s Finagle. The API endpoints had methods you could call in much the same way as a traditional RPC service. Thrift/Finagle gave us the ability to generate native C# and Scala code, so we didn’t need to manually write our own data transfer objects, clients, and services. Finagle has a ton of functionality out of the box, such as easy horizontal scaling with ZooKeeper, excellent stats using Ostrich, concurrency made easy with Twitter’s library of threadpooling logic, and an easy way to instantiate servers and client connections. Thrift/Finagle is an excellent choice for writing APIs. Steven has blogged about Thrift and Finagle extensively and even gave a short talk at a local Toronto Scala Meetup. You can find his slides here.
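To give a flavour of that workflow: a service is described once in a Thrift IDL file, and the compiler generates the DTOs, clients, and service stubs for each target language (C# for the .NET side, Scala via Twitter’s scrooge generator). The definitions below are a hypothetical illustration, not Empathica’s actual API:

```thrift
// hypothetical.thrift -- an illustrative service definition only.
// `thrift --gen csharp hypothetical.thrift` produces C# types;
// scrooge produces the Scala/Finagle equivalents.

struct Feedback {
  1: i64    id
  2: string comment
  3: i32    score
}

service FeedbackService {
  Feedback getFeedback(1: i64 id)
  list<Feedback> listFeedback(1: i32 limit)
}
```

Because both sides are generated from the same IDL, the .NET clients and the Scala servers can never drift apart on the wire contract.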

Once we had a Thrift API ready and in production we were able to get BeetleJuice and our other .NET projects to connect to it and start consuming data.

A moment of doubt

BeetleJuice and our new API were proving to be a success. It wasn’t long before a large new project, Grail, was on the table and a new technology discussion began. This discussion was different from those past because we now had some actual Scala experience on the team. We had even hired top Scala talent such as Katrin Shechtman (who runs a cool startup on the side called Becipe, built on a Scala, Play!, Mongo, and Heroku stack). I’ll admit that I had my reservations about the language, and I thought it was my responsibility to counter some of the Scala evangelization that had been growing on the team and play devil’s advocate.

My main argument against Scala was a common complaint about the tooling: the compiler is slow, the IDEs are buggy and lacking in features, and the preferred build and dependency management tool, SBT, has a tremendous learning curve. I seriously questioned whether the rest of the developers on our team would get onboard with Scala knowing that using it would mean taking a productivity hit at first. I made my arguments, but ultimately it was decided that it was worth the risk. I was relieved that management and my developer colleagues were onboard despite the drawbacks. It was at this time that I decided to make it a priority to really embrace Scala and find new ways to cope with development practices and a toolchain different from what I was used to in .NET.

An endorsement of Open Source technology

After Grail had been humming along for a few weeks, Simon made an official announcement to the development department endorsing the use of Scala, Linux, and other open source tools and technologies. This was a critical moment for our organization, because up until then C#, .NET, and the Microsoft stack had been our default choices. This was Simon’s announcement outlining our long-term technical strategy.

To: Empathica Developers
Subject: Technical Strategy


I seem to have set the Scala cat among the C# pigeons yesterday, so I wanted to lay out my view of our long-term technical strategy in the hope that it will allay some fears and clear up any ambiguity that I may have caused.  And I’d like to apologise if anyone felt disenfranchised, or as though there were important decisions being made without their input.

First, some principles:

  1. We make a point of recruiting the brightest technical people we can find and our expectation is that you will be comfortable working in whatever technology is required for the job at hand.  I think you all fit into this category and I hope that you feel the same way, but if you are working for a long time in a single technology a rut can look a lot like a groove.
  2. The job at hand will require different technologies depending on the need, and we will not be bound to any single technology.  A technology choice will be made appropriately for the needs as we see them.  The choice of technology is one of the hardest because it happens early and is somewhat irreversible.  It also has many dimensions, some of which are hard to discern at the start of a project.
  3. We have to balance our agile rapidity against good architecture.  It is one of the core criticisms of agile, and particularly XP, that you too easily ignore architecture as you ricochet from one user driven feature to the next.  This point is particularly relevant to us now as we are embarking on two significant architectural changes, one being a core data model change, the other inserting an API layer.
  4. There’s never a good time to make a large architectural shift and you will always be balancing tactical commitments against long-term aspirations, as we are right now.  Furthermore you never get given the time to do it.  A data API is one of the key missing components in our platform.  What we have done with the [API] is an excellent example of how to start to insert a formal tier in the architecture, and what we are contemplating for [Grail] and [MultiPass] access is a good natural extension.  The success of any future revolution in our data tier will be contingent on us having done the [API] work well.
  5. A profusion of service endpoints has a somewhat bad architectural smell.  You should have good reasons to fragment your APIs into many pieces and expedience is rarely a good reason.  A single endpoint may not be the answer either, but there should be a good architectural reason to split it apart.

With these in mind here is what I would like to see us move towards in the medium to long term:

  1. Minimal dependence on Microsoft, both O/S and data stores
  2. Increased infrastructure in “the cloud”
  3. Horizontal scalability as a fundamental architectural principle

As a consequence this probably means:

  • Less C#/.Net, more Scala/Java/Open Source and
  • Less SQLServer/OLAP Services, more NoSQL/Columnar Storage/Mongo/Vertica
  • Less Windows, more Linux
  • Less cabinets/pizza boxes, more AWS/Private cloud

Taking a leaf from the agile manifesto wording, I mean: “while I see value in the things on the left, I see more long-term value in the things on the right”.

So, no hard decision has been made about our future languages or technologies, and there are no rules that all new development must happen in Scala.  Instead I want us all to align on the principles and then make the right decisions as we balance our short-term needs against our long-term aspirations.

If you have any further questions please feel free to come and find me.  I am stuck at home today because the four snowmen of the snowpocalypse arrived in Oakville this morning, but I’ll gladly take a phone call.

Simon Palmer
Chief Technology Officer
Empathica Inc


Dealing with technology change

Not everybody was pleased with the transition to Scala. Giving up years of experience with one particular language and framework just to be a “beginner” with a new one can be a tough pill for some people to swallow.  Most people inherently don’t like change. For a programmer, changing technologies can have a profound impact on your productivity and in some cases your mental stability!  In our industry you have to fight to overcome these innate reactions.

In the technology world there are no excuses for people who don’t adapt. Software development doesn’t have a deep body of knowledge like some engineering practices do (I know calling software development an engineering discipline is a slippery slope, but please indulge me). For example, the know-how and best practices for building a bridge have been known for thousands of years. What we consider software engineering has been around for perhaps 50 to 80 years, depending on how liberally you define its beginning. It could be decades or even centuries before we reach some kind of steady state of technological development. Until then, people in the software industry need to learn to adapt and embrace what’s new, or be left behind.

A good carpenter finds the right tool for the job. This analogy easily extends to software, especially when the tools and the jobs themselves change at such an astonishing pace! Compiled, scripted, imperative, or functional: pick something the community subscribes to and get the job done. Haven’t used it yet? Try it out! Those who don’t strive to learn new languages and technologies are in for a rude awakening when the years of experience they have with Blub don’t count for anything with employers.

A data renaissance through APIs

As Grail was being developed, a need arose to extend our Scala Thrift/Finagle API. It was decided that the data needs of new projects would be put behind our API. This began a data renaissance of sorts: not only did we put Grail’s needs behind an API, but as we identified overlapping needs in our existing products, we retrofitted them to use the API as well. This had a snowball effect of establishing APIs for the remainder of our legacy projects that were still under active development.

A significant portion of our development team was allocated to refactoring projects to use Thrift while simultaneously establishing data contracts for the APIs. For a number of months we checked off boxes on our API wishlist. An incredible amount of code was written, refactored, rewritten, or simply no longer needed. Our products’ middle tiers shrank to a fraction of their former sizes. Introducing an API layer forced us to establish clear lines between our apps and their data. This separation of concerns was a huge boon to the distributed architecture of our entire infrastructure, and in line with the sentiments Simon expressed in his technical strategy.

We started thinking a lot about concurrency and distributed systems. We began investigating new data access technologies to standardize on across our API layer, such as Slick for RDBMS access and Couchbase for caching. We had already been using Mongo for parts of our infrastructure, but we also started planning an overhaul of the rest of our RDBMS infrastructure from SQL Server to cheaper and, ironically, more scalable solutions like Vertica (a columnar relational database) and “NoSQL” databases. All of these decisions were made that much easier by the introduction of our API layer and the myriad library choices available in the Scala and Java OSS ecosystems.

Training and education

A lot of Empathica developers were interested in using Scala. Unfortunately, other than those who had been working on BeetleJuice, Grail, and our API, there weren’t a ton of people with experience in the language. We entertained the thought of bringing in instructors from Typesafe or their Canadian Scala training partner (more on them later), but in the beginning we looked to Coursera and its growing catalogue of Massive Open Online Courses (MOOCs).

Functional Programming Principles in Scala

Martin Odersky, the inventor of Scala and a key figure in the Java community, provided the video lectures for the Functional Programming Principles in Scala course on Coursera. The course has been wildly popular and is regarded as one of the most successful software programming MOOCs ever.

We had more than 50,000 registered students— an unfathomably large number in the context of traditional teaching. While large, that number doesn’t tell the whole story; as is typical for a MOOC, a statistical majority of those students participate no further beyond watching a couple of videos to find out what the course is about. Of the 50,000, about 21,000 students participated in the interactive in-video quizzes that are part of the lectures, and a remarkable 18,000 unique students attempted at least one programming assignment. A whopping 9,593 students successfully completed the course and earned a certificate of completion— that’s an incredible 20% of students, which blows the average 10% rate of completion for MOOCs out of the water.

- Functional Programming Principles in Scala: Impressions and Statistics, by Heather Miller and Martin Odersky

With the blessing of the management team, we decided to allocate one day a week, for the length of the course, to watching the video lectures and working on the weekly assignments. On Thursday mornings we would meet together in a room, discuss last week’s assignment, and watch the lectures for the week. We often paused to attempt the in-lecture exercises and discuss the concepts being taught.

Some of us had doubts about how useful a functional programming course would be in day-to-day work at Empathica, but with the explosion of LISP dialects in recent years and the need for side-effect-free code that can run concurrently, there’s been huge demand in the industry to apply these skills to new systems. The course exercises do involve a lot of algorithms and basic data structures, but it’s also an excellent introduction to Scala syntax: pattern matching, for comprehensions, Scala’s collection types, and much more. I would strongly recommend this course to anyone interested in learning Scala, functional programming, or both!
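For a taste of what the lectures drill into, here is the kind of pattern matching and for comprehension the course teaches. This is a toy example of my own, not actual course material:

```scala
// Pattern matching over an algebraic data type.
sealed trait Shape
case class Circle(r: Double)          extends Shape
case class Rect(w: Double, h: Double) extends Shape

def area(s: Shape): Double = s match {
  case Circle(r)  => math.Pi * r * r
  case Rect(w, h) => w * h
}

// A for comprehension desugars to map/flatMap/withFilter.
val pairs = for {
  x <- List(1, 2, 3)
  y <- List(10, 20)
  if x * y > 20
} yield x * y

println(area(Rect(3, 4))) // 12.0
println(pairs)            // List(40, 30, 60)
```

The compiler warns if a `match` over a sealed trait misses a case, which is exactly the kind of static safety net a C# developer will appreciate.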

As I mentioned earlier, many vendors offer a variety of Scala training options. Typesafe’s Canadian training partner not only runs training courses based on the Typesafe curriculum, but has also introduced a new program called the Scala Developer Factory: a rigorous training course that promises to deliver skilled and productive Scala developers! I’ve kept a line of communication open with Mike and Eric there to discuss possible future on-site training options for the developers at Empathica.

Embracing the Scala community

Toronto has an active Scala community. Several meetups are held each year, usually hosted at software shops that have adopted Scala into their organizations. The Toronto Scala Meetup has been running for a few years, with Chris Dinn often organizing the events. Speakers from the community are invited to present on any Scala-related subject they like, and representatives from Typesafe and its training partners often show up as well.

Scala @ Empathica

Katrin proposed that we host a meetup at Empathica’s downtown office. At the time, our office was a dingy top floor on Peter St., atop an infamous night club in Toronto’s entertainment district called “Time”. Plans to host the meetup were shelved until we set up shop in our new digs on Spadina Ave. With renewed vigor, Katrin began preparing a schedule. Our RSVP list grew. We ran the event with 3 speakers and representatives from both Typesafe and its training partner. Empathica provided the space plus refreshments, and the folks at the training partner were kind enough to buy pizzas for a group of roughly 50 people. It was a huge success, and the social gathering after the talks allowed for lots of networking and prompted many interesting discussions. I can safely say on behalf of the devs at Empathica that we’re psyched to continue to attend, contribute to, and host these events in the future.

Empathica Toronto Scala Meetup - Intro

Toronto Scala Meetup - Steven

Toronto Scala Meetup - Panorama


There are now several dev teams at Empathica working on Scala projects.  Tool chains are being developed.  Coding standards are being established and enforced.  Libraries are being standardized across our projects.  It’s starting to return to a comfortable pace of software development for everyone.

That’s it. I’m known for being verbose, so I apologize for the lengthy read, but I wanted to get this story out in its entirety, if for no other reason than for Empathica’s own posterity. It wasn’t an easy transition. Looking back, there are things I probably would have done differently, but I think with the information available we did a pretty good job. I hope that for those of you thinking about a technology change, our experiences help you come up with a plan to bring about that change. I’m sure some people reading this have gone through such a change themselves, and I welcome your input in the comments, good or bad.

If I were involved in a significant technology change again I would definitely take a different tack both in its proposition and implementation.  Some things I would do differently and some things I would not.  I’ve compiled a summary of what I think are the most important matters to address.

  • Consensus building.  Get the whole development team’s input and suggestions.  Be as accommodating as you can, but don’t expect unanimity.  It’s important to get input from everyone you work with on a daily basis because they will all be affected by the decision.
  • Start with something small that has the potential to grow into something big.
  • Budget for training in your iteration/sprint/project plan.  Some people prefer learning on their own by reading a book, some like the classroom environment, and some just want to start writing code.  See who falls into what group and try to accommodate them all.
  • Ask hiring prospects how they would feel if they worked with a different technology than one they have experience with.
  • Include OSS experience as part of your hiring criteria.  Either experience contributing to specific projects or experience working with open source languages and frameworks.
  • Be patient.  Significant technology changes don’t happen overnight.  But with the right people and the right attitude, amazing things can happen.

I’ll conclude this post with a link to a little bit of internet history. This video made the rounds in 2010 and, although it does not specifically endorse Scala, it’s still a harrowing story about leaving the Microsoft nest and embracing open source technologies. Plus, it co-stars the lovely Scala Johansson :)

(NSFW, depending on where you work!)

EDIT 08/09/14: If you would like to know more about our transition from .NET to Scala, check out my follow-up post: .NET to Scala developer Q&A with Typesafe.



  1. Martin

    Great reading, thanks!

    Could you tell us more about the change from official MS tooling (VS, MSSQL, IIS, etc.) to the rather heterogeneous Scala variety (Eclipse, SBT, 3rd party libraries)? I would imagine this is one of the greatest hurdles to get over when migrating a team.

    • Sean Glover

      Thanks Martin!

      Yes, tooling was one of the biggest pain points during our transition. There’s no simple answer to that question.

      I think tooling is the single most significant impact on productivity when switching to Scala from .NET. The tooling available in Scala is not so much lacking as it is different. As Microsoft developers we are pretty spoiled with the Microsoft toolchain. Visual Studio (with JetBrains’ ReSharper) is one of the most magnificent development environments I have ever used, and I think most developers at Empathica would agree. However, other than possibly IntelliJ for Java (another JetBrains creation), few IDEs out there really come close to matching the capabilities of Visual Studio. When using the Microsoft stack you get a set of tooling that works well together and great integration with other Microsoft technologies (most of the time). When using .NET, it rarely makes much sense to use non-Microsoft libraries and products because so few projects can really compete.

      In the non-.NET open source community it is rare to find developers completely enamored with a particular IDE (except maybe Java devs). In my experience you’re often encouraged to build your own toolchain by picking and choosing the tools that make you most productive. Two developers working in the same language, the same framework, and even on the same project will often have completely different toolchains. To be a well-rounded open source developer you have to develop different skills and practices for the tasks a typical IDE might handle.

      For example, you have to choose a text editor you’re comfortable with. It doesn’t have to be the same as your coworker’s; just pick something that interests you and try it out. Some common cross-platform options are Vim, Emacs, and Sublime.

      To debug, there are a host of techniques OSS developers generally employ. Your average Rubyist or Python developer knows their logger API inside and out. The main reason to write logs is to assist in troubleshooting: if your logs don’t have enough information for you to reproduce a problem, you need to add more. To begin troubleshooting, we start with the same information we do in .NET: a stack trace. When not using an IDE, your next step is often to enable tracing to find out more about the context in which the problem occurs.

      Next, we use a test (or write a new one) that recreates that context as accurately as possible, and bootstrap our way into the application without needing to run it in full integration. Tail your log or keep an eye on your test console and read your enhanced logging. If the problem is reproduced and you still can’t tell what’s going on, add some more logging. Rinse and repeat until you’ve identified the problem and fixed it. Keep or discard your test once you’re done. This should sound familiar, because it is essentially the “make a small change, run the test, refactor [or add more logging]” mantra defined in TDD.

      If these kinds of troubleshooting techniques don’t appeal to you, there’s always Scala-IDE. It supports the classic breakpoint and process-attachment techniques .NET developers are used to. With regard to SBT integration with Eclipse, I acknowledge that the one-way integration sucks. I don’t know of a fix, but there are alternatives worth investigating. Ditching SBT for Maven and installing the m2e-scala Eclipse plugin would provide two-way integration between a dependency management solution and the IDE. Another option is IntelliJ with its Scala support and alpha SBT support. I’ve heard mixed reviews of IntelliJ’s support so far, but it’s a promising start.

      There is certainly going to be a temporary productivity hit when switching to Scala, but you could say the same of a team taking on any new language. I recommend embracing OSS troubleshooting techniques and investigating the alternative tooling that’s available.

  7. Olly Wickham

    Having been through a similar experience, I really enjoyed this post. What struck me the most was how wonderfully well it was presented. Kudos!

      • Olly Wickham

        Your opening sentence “In our careers as software developers we’re frequently pigeonholed to one particular language & framework” struck a chord with me. All too often people with many skills and companies with existing products get channelled into a particular technology. The cost of moving becomes larger, the more you invest. I find this is particularly true of the Microsoft stack, as it tends to only have one IDE, one platform, etc.

        To make the most of the innovations in cloud computing, NoSQL and many other great tools, you really need to start putting a lot of open source into your solutions (which is a good thing!). I have had lots of small complications with Ruby libraries and various windows ports of open source tools, and feel that unix is a more natural fit for most open source. That is not to say that there are not some great open source libraries on the .Net stack, but it is just that the majority of significant open source projects seem to start on a unix variant.

        In my opinion, the JVM is a great starting point for portability to almost any platform. It has a great community, many implementations, and lots of languages sitting on top. The problem moving to the JVM, from my point of view, was that Java is nowhere near as productive as C# (IMHO), and is very verbose.

        At various points I have investigated other JVM languages and never really felt the benefits outweighed the loss of productivity compared to .Net. Things like JRuby are interesting, but by many accounts not as good as standard Ruby, which excludes the JVM. I also personally have never really enjoyed using dynamic languages for large code bases, and having a statically compiled language in addition to a bunch of automated tests gives me a warm fuzzy feeling. Refactoring tools and code discovery are much easier in a statically compiled language.

        So last year I started learning Scala and enjoyed it. I did some internal presentations about it, and an internal project or two. When there was enough interest among developers, it became a serious possibility that it would be used if the right projects came along. Eventually projects came along for which Scala was actually the best tool for the job – I was keen to make sure that we weren’t falling for the “if all you have is a hammer, everything looks like a nail” effect.

        The developers we have at our consultancy are keen and talented enough to pick things up, but we also do a lot of joint projects with client developers. With a few clients, I am not sure that all of their developers care enough about learning a new tool to go through the pain of a shift to a functional language. For these projects it is less suitable.

        I do hear from other companies that have adopted Scala that the best thing is recruitment – far fewer applicants, but a higher quality of applicant. Attracting good talent is one of the most important parts of putting a team together. A REALLY good team can make almost any technology work if required, but putting a tool like Scala in their hands makes for a great code-base.

        As for how our company finds Scala, I think in general it has been a unifying experience between Java and C# developers. Some of us have done the Coursera courses and found them useful. Some general notes: everyone who uses Play is productive pretty quickly; C# developers seem to grasp everything a bit faster than Java-only developers, simply because of lambdas, implicit typing, ASP.NET MVC, etc.; the IDE support was pretty bad when we started, and it is getting steadily better; the libraries before 2.10/spray.io were a PITA for compatibility too, but that has improved; the compiler is SLOW. We are upgrading laptops for devs doing Scala, and no doubt at some point in the future this will not be an issue, given Moore’s law, the parallel nature of the compiler, and compiler optimisations.

        With one of our clients, I did a case study for Typesafe about the experiences on that project. See the Valtech case study here: or the PDF version here

        My observations about F# are that there are some nice research projects done by Microsoft in it, and it has some great features. I do doubt, though, that it is far enough from the very capable C# for most people to justify using it (as opposed to, say, Java->Scala). The evaluation model is also eager, which, whilst a bit simpler, is not as powerful. I did a search on two job sites in London to gauge the level of serious interest in various functional languages, and there were about 2 Clojure jobs, no F# jobs and 80 Scala jobs. It is important to understand that my definition was using that language as the primary focus, not a footnote such as “also, experience in X,Y,Z skill is also good”

        A quick footnote about Mono: I have used it on and off since the early 2000′s and found it to be far behind C#’s rapid pace of development. I did once find a project that hit the sweet spot, and it used Mono to good effect. Mostly, I just feel it is a clone, as opposed to a vibrant, community-based language that is seeing real innovation, like Scala.

        Hope that was of interest. To cut a long story short, we are enjoying Scala, and using it on some nice projects. Of course we are still using C# on some nice projects, in cases where it is a better fit.

    • Sean Glover

      I was asked this question on /r/programming too and I hope you don’t mind me giving you a canned response.

      I haven’t used F# before, so I can’t contrast it with Scala directly. I assume you’re asking why, if we were looking at functional languages, we didn’t consider F#. That wasn’t the case: we weren’t looking at functional languages specifically, but at popular languages and frameworks in general. Scala lets you use functional programming constructs, but it’s not strictly a functional programming language; it has lots of progressive ideas related to OO programming too. Martin Odersky has said that most of the new concepts Scala has to offer relate to the imperative and OO paradigms.
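      For what it’s worth, that blend shows up even in a few contrived lines (an illustration only, not code from our systems): an ordinary OO construct consumed with functional combinators.

```scala
// A plain OO construct: an immutable case class with data and a method...
final case class Order(id: Long, total: Double) {
  def discounted(rate: Double): Order = copy(total = total * (1.0 - rate))
}

object OrderStats {
  // ...consumed functionally: no mutation, no explicit loops.
  def revenueOver(orders: List[Order], threshold: Double): Double =
    orders.filter(_.total > threshold).map(_.total).sum
}
```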

      Our decision to try out Scala was largely arbitrary. We wanted to try a new open source language and framework because we were given the flexibility to do so. At the time we didn’t anticipate the impact it would have on the organization. We didn’t even know if the project we were working on would make it to production (it did, eventually). However, we did constrain our choices to technologies we could easily run on Linux. At the time we had a pretty significant amount of Microsoft infrastructure that cost a lot in licensing fees (Windows Server, SQL Server, Analysis Services, tooling, etc), so anything but .NET was a fairly easy sell to the organization from that perspective.

  8. mavnn

    Out of curiosity, did you also consider Mono and things like F#? It seems like it would be a useful extra tool in the box, given that it offers some of the same advantages as JVM/Scala while still letting you leverage your knowledge of the .NET platform.

    • Sean Glover

      We didn’t really consider Mono because we wanted to learn something new for BeetleJuice.

      The Mono debate has come up for other projects (i.e. reuse some code we already had, throw it behind an API and host it on Linux). TBH I don’t know a heckuva lot about Mono, but there aren’t many proponents for it in our office.

      For more info on why we didn’t choose F# see my reply to vittore in the comments.

      • mavnn

        Thanks for the response (and apologies for the double post with vittore; his appeared while I was writing mine). I completely understand just wanting to learn something new, although I have to admit I’d have gone Clojure or Haskell :)

  9. AaronHeld

    Thank you for sharing. I’m moving my company in a similar direction as I pry business logic out of MSSQL procs and into maintainable Python, and I appreciate the time you took to write this up.

    Specifically, the letter you wrote is brilliant. In hindsight, had I sent out something so clear and supportive, I would have prevented much of the uncertainty the move generated.

  10. Tomek

    Great post.

    Do you think the .NET platform is some kind of danger for a company, such that you have to switch to Open Source? How would you compare your productivity before and after the transformation (in terms of speed of development and ease of product maintenance)? What was developers’ number one complaint before the process, and what is it now?

    (I’m from Poland so I’m sorry for my English)

  11. niko

    Very interesting article. Have you considered using F#? What portion of your costs goes to the database? Our company uses C# on top of an Oracle database. We think a far bigger win could come from not paying Oracle and using MongoDB.

  15. ulon

    Very interesting read. Thanks!

    I’m very surprised to see anybody moving away from Visual Studio to something like IDEA with sbt and the Play framework. I mean, I’m honestly shocked to see you liked the change!

    I’ve been working almost a year with IDEA 12 and Play 2.x. The tooling is so bad, the compiling so slow, and Play development so dreadful in comparison with .NET (I did some stuff there, but not much) that I cannot blame the people who left your company. I think I would’ve left too.

    I agree with you that the language is not really a reason to choose one tech over another (God, I love Objective-C and many people say it’s horrible), but for me the tooling is so important… it can make all the difference.

    Good luck and I hope you keep us posted. I’m eager to see how it goes.

    • Sean Glover

      Tooling is important. IMO IntelliJ is the only thing that approaches the capabilities VS has for .NET projects, but it’s still sorely lacking in features and quite buggy. I’ve mostly just gotten used to not having it around. I rely on the tooling that is available to me, the REPL, the Scala Worksheet, my logger, and testing, to make up the gaps. For more of my thoughts on the matter check out my response to Martin’s question about tooling.

      You’ve misunderstood the reasoning for people leaving Empathica. That happened during an earlier development process change that transformed the company, before we adopted Scala. I agree it may be a little confusing; I may revise the post to make it clearer.

      Thanks for the feedback.

  17. Nigel

    Oh, very interesting post. I’ve played around with Scala/Java a bit for various jobs, but have mostly worked with .NET technologies on Linux servers using Mono. The MonoDevelop tooling/IDEs so far seem better than what I have found for Scala, and the C#/F# vs. Scala/Java trade-off isn’t decisive in my mind relative to the dynamic vs. C/C++ trade-off, but I’m still getting the grasp of things. I gather you decided to ditch .NET without moving to Mono, though; can I ask how that decision went?

    • Sean Glover

      Hi Nigel. Thanks for reading. In the beginning it wasn’t a concerted push to move our infrastructure to Open Source; it was just an experiment to try something new (that wasn’t C#). I would agree that the tooling isn’t as mature as what’s available for .NET on Windows (I can’t really speak for MonoDevelop, not having any experience with it), but I found ways to overcome that disadvantage (read my reply to Martin’s comment for more). We didn’t move to Scala/Java; it was a straight launch into Scala, although we did use some Java libraries because of Scala’s native interoperability with the JVM. The decision went well and is still largely considered a success.

  18. Dylan

    Nice post Sean, thanks. I’m on the same transition from C# to Scala with my team (mainly for Scalding and Spark on Hadoop which is something we started using about 8 months ago.) I miss the tooling a great deal, but love the succinctness of Scala, the REPL and way functional programming lends itself to distributed computing. Glad to hear it was a success for you guys.

    • Sean Glover

      Thanks Dylan. Using Scala with FP patterns for distributed processing like you describe is definitely a good use case. I’ve never used that particular tech stack, but I took a peek at the Scalding library this morning and it sounds pretty cool. Yeah, tooling can be a pain, or you can look at it as a learning experience and adopt more “conventionally OSS” software development techniques. When I think about how I develop software relative to when I used the Microsoft tech stack, I’m satisfied that I’ve become productive with a wide array of technologies (not just Scala). I think of the .NET days as programming with training wheels ;)

      Best of luck with your project and thanks for reading!
