A quick and exciting update: After three years in the MacCoss Lab, over a year of hard work at North Seattle College, and lots of stress about my future, I have finally been accepted to my dream school and program: Computer Science at the University of Washington. I will begin in the Fall of 2017, and the stoke is high, my friends!

On a side note I will be off the grid from now until July 16th, 2017. To see what I will be up to for the next month click here!

I'm Yuval and I like to adventure.

I live in Seattle

and work as a programmer in a Lab.

I'm a student, studying Computer Science.

When I can, I travel.

Around here, I explore.


I have been working in the MacCoss Lab since October 2013. I work with a team of six developers on an open-source application called Skyline. Skyline is a freely available Windows application for building mass spectrometry methods and analyzing the resulting data. Being embedded in a mass spectrometry lab lets us collaborate directly with researchers to build the best possible application, one that is constantly updated to support a quickly growing and evolving field.

My job in the lab is broad. I work on everything from graphic design projects and full-stack web development to desktop C# software development.


I spend most of my days working on Skyline. It is a fairly large, mainly C# codebase that runs on top of ProteoWizard.
My responsibilities have included:

  • Designing a start page with quick access to commonly used features and recent files.
  • Adding an on-build code inspection to our build using ReSharper's code inspection SDK.
  • Adding better error-reporting features.
  • Unit testing and updating our internal testing and reporting framework.
  • Internationalization, including creating a ReSharper Visual Studio extension to warn us about unlocalized strings.
  • Creating charts and visualizations.
  • Updating our external tool framework and developing external tools (see below).

You can learn more about Skyline at skyline.ms.

Tool Store

My first big project on the Skyline team was to create a web database for our external tools. External tools are add-ons that anyone can create to extend Skyline's functionality. The tool framework gives each external tool access to Skyline's data, so tools can do things that are not implemented in Skyline's core features. The Tool Store gives tool creators a clean web interface to add, modify, and version their tools. It tracks downloads, keeps a version history, and, most importantly, connects seamlessly with Skyline on the desktop. Users can install tools straight from the store within Skyline, see when their tools have updates, and discuss any issues or questions with the tool owners.

Along with building the interface for adding external tools, I have also helped create external tools for many companies and labs that use Skyline. People often want a seamless way to integrate their own software with Skyline, and Skyline's tool framework makes that easy.

Tools I've worked on:

  • MPPReport (Built by me)
  • MSstats (Helped integrate existing tool with Skyline)
  • SProCoP (Helped integrate existing tool with Skyline)
  • QuaSAR (Helped integrate existing tool with Skyline)
  • 3 more tools close to being published as of April 2017, two of the three built by me

Web development & graphic design

I wear many hats in the lab. I'm a software developer, but I also spend a lot of time doing full-stack web development, as that was my main skill going into this job. Our website and our web suite of tools run on a framework called LabKey, which I had to learn when starting here. It runs on a Tomcat server and is geared towards academic research and data analysis in a web environment. Alongside back-end development I've done a ton of client-side work, and I would consider myself proficient with JavaScript and many common libraries. I absolutely love doing full-stack development and building something from the ground up. I've learned a lot about planning ahead and about how important choosing the right tools for a job is.
Major web dev projects:

  • User registration module for our website that allows users to create an account and then requires them to verify their email. Before this, users would fill out a form that emailed us, and we would manually create an account for them.
  • TestResults
  • Passport
  • PanoramaWeb - Updated and added to Skyline's web portal, which allows users to upload, share, run scripts on, and manipulate their data
  • Crux toolkit website upgrade for the Noble lab.
  • Skyline Tool Store
  • Countless minor updates to Skyline's website

Graphic design work:

  • Designed a PNAS issue cover
  • Many logos and icons for Skyline and colleagues in need
  • Figures for papers


JavaScript · HTML / CSS · Node.js · PostgreSQL · Jade

I was tasked with creating a web application that allows users to browse protein standards that have been analyzed by selected reaction monitoring (SRM). Not all peptides are stable while sitting in an autosampler waiting to be put through a mass spectrometer, so I created an application that enables researchers to see which peptides are the most stable to measure. Peptide stability is displayed by plotting the intensity of each peptide before and after three days in the autosampler. Users can browse over 150 proteins and thousands of peptides, and view or download data that can be opened and used in Skyline. Passport uses PanoramaWeb's API to fetch the data, UniProt for additional protein information, d3.js for graphing, and Node.js as the backend. Lots of updates are in the queue, just waiting for funding to continue work on this project.
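The before/after comparison above boils down to a simple ratio per peptide. Here is a minimal sketch of that idea; the field names (`sequence`, `intensityDay0`, `intensityDay3`) are illustrative and not Passport's actual data model:

```javascript
// Hypothetical sketch: rank peptides by stability, measured as the ratio of
// intensity after three days in the autosampler to the initial intensity.
// A ratio near 1.0 means the peptide's signal barely degraded.
function rankByStability(peptides) {
  return peptides
    .map(p => ({
      sequence: p.sequence,
      stability: p.intensityDay3 / p.intensityDay0,
    }))
    .sort((a, b) => b.stability - a.stability); // most stable first
}

const ranked = rankByStability([
  { sequence: 'PEPTIDER', intensityDay0: 1000, intensityDay3: 950 },
  { sequence: 'SAMPLEK',  intensityDay0: 800,  intensityDay3: 200 },
]);
console.log(ranked[0].sequence); // → 'PEPTIDER' (stability 0.95)
```

In the real app the intensities come from PanoramaWeb and the ranking is rendered as a d3.js plot rather than a sorted list.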

To learn more, see Passport.


JavaScript · D3.js · HTML / CSS · Java · Tomcat · PostgreSQL · LabKey Framework

TestResults is part of my work on the Skyline project. We have a tool that runs ~8k-12k unit tests per machine every night. I designed this web GUI to aggregate the test result data from all of the machines, letting us see how our codebase is doing. The tool has now been in regular use for almost two years, and our database contains over 100 million rows of test data. Every night roughly 200,000 tests are run and the results are sent to this application.

The home page displays a summary of the previous night's test runs, failures, and detected leaks. Based on statistics gathered over time for each machine, the tool compares each run against previous data showing what that run is supposed to look like. If anything seems off, it gets flagged so a developer knows to take a closer look.
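The flagging described above can be sketched as a simple statistical check. This is an illustrative example of the general technique, not TestResults' actual code; the threshold and field names are assumptions:

```javascript
// Hypothetical sketch: flag a machine's latest run if its test count drifts
// more than a few standard deviations from that machine's historical runs.
function isAbnormal(history, latestCount, maxSigma = 3) {
  const mean = history.reduce((sum, n) => sum + n, 0) / history.length;
  const variance =
    history.reduce((sum, n) => sum + (n - mean) ** 2, 0) / history.length;
  const sigma = Math.sqrt(variance);
  // Guard against a perfectly flat history (sigma of 0): require an exact match.
  if (sigma === 0) return latestCount !== mean;
  return Math.abs(latestCount - mean) > maxSigma * sigma;
}

// A machine that normally runs ~10,000 tests suddenly ran only 4,000:
console.log(isAbnormal([10000, 10100, 9900, 10050], 4000));  // → true
console.log(isAbnormal([10000, 10100, 9900, 10050], 10020)); // → false
```

The same comparison can be applied to other per-run metrics, like duration or memory use, to catch a sick machine before its results pollute the nightly summary.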

An interactive d3.js chart shows the results of each machine's test runs from the previous night. On the night shown, 15 machines ran roughly 150,000 tests in total. The page loads all of this data in under half a second, including server-side and front-end rendering.

The trends feature allows us to select runs that we consider normal for each user. Using the trained metrics, TestResults is able to warn us if a certain user's machine is performing abnormally.

This page allows us to view trends from any date range, including "The Beginning of Time," which in this case was around 2014 (when we started archiving test data). As this application's database grew and things got slower, I did a massive overhaul, bringing the page's load time down from 20 seconds to half a second.

Every morning the team receives an email with results from the previous night's test runs.

Fishing in Alaska

For five seasons, a month each, I've worked as a set-net fisherman in Egegik, Alaska. It is very physical work, and during the main salmon run the shifts are long. Here are some photos, courtesy of Stephen, that hopefully explain what it means to work as a set-net fisherman in the bay.

The only way on and off the beach.
This is home for a month out of the year. We have no electricity, internet, or running water. We power a stove with propane, charge car batteries with a wind turbine, and enjoy the peace of this beautiful place. It's hard to tell what time it is in this photo since the days are 22 hours long, but it's probably around 1 am, just after finishing a long shift.
Pulling along our net, Tom and I reel in today's bounty of beautiful Sockeye.
Raising the flag outside the cabin.
Big King in the net!
"Pitching" fish. One of these totes, when filled, holds about 1,000 lbs of fish.
Working fast, two on a net, so we can pull it out before the tide leaves us stranded in the mud. Mud fishing is the worst.
And... we got stuck in the mud.
Unable to pull the net in time because it was full of fish, we are left picking fish out of the mud. Chad doesn't look happy, and I'm sure I wasn't either.
Salty, muddy, and done with another tide. Time to go clean up.
The wash station.
At the beginning of the season we set out a line and anchor it in the mud using large screw anchors. Since we're not allowed to use a motorized boat in this type of fishing, we use the running line: we pull out on the line, set our net, and return. Pulling a boat with 3,000 lbs of fish in it, fighting heavy waves, is the most physically challenging part of any season.
Canning smoked Sockeye.
This is our bathroom for a month. Can't ask for a more scenic view while on the loo.
All of our drinking water comes either from a rainwater collection system or from this spring. The spring is far, so as long as we get enough rain to drink, we usually stick to that.