Tommy Lee | Software development for everyone else

Cook and Code 2: Ribs

Cook and Code 2

As we turn this into a monthly tradition, we expanded by one more member. This time around, Dmytro Malikov from Accenture came along.

Dish of the day: Apple-flavored baby back ribs

Part of Cook and Code is developing dishes that cook on their own while we spend our time on projects. Jimmy and Charlene brought a rack of baby back ribs and planned to cook it in a pressure cooker.

Recipe

https://recipes.instantpot.com/recipe/easy-bbq-instant-pot-ribs/

We used a modified version of the recipe above, replacing the water with apple juice and apple cider.

We did our best to keep everyone involved. Accompanying the ribs, Dmytro and I prepped some vegetable side dishes.

I love that cooking is an activity everyone can participate in. No matter the skill level, we all have some experience in different areas of cooking.

Jimmy and Charlene had some minor difficulties in cleaning the ribs, pulling away the membrane on the underside. Dmytro, our resident expert in butchery, stripped the membrane cleanly and quickly with his knife skills.

Ribs in Pot

So Tender

The ribs came out very tender. They fell right off the bone. 10/10, would 1000% recommend.

For vegetables, I cooked up some sweet potato fries in the oven and served a slaw from Costco.


Meetup Lessons Learned

At our second meetup, we picked up a few lessons to work on for future meetups.

  • Too many cooks
  • Too complicated to make food
  • Bring snacks
  • Time to talk vs. time to code

Issue: Too Many Cooks

We wanted to bring more focus to the coding and the side projects. Instead, we had issues with people butting heads and not having everything needed for the cooking on hand.

We're seeing that this could lead to scalability issues as the group grows.

Solution: Fewer Chefs

So instead of having everyone participate in the cook, we're going to try having a dedicated chef at the next Cook and Code.

We're calling this having a Lead Chef (or Senior Chef), along with an assistant chef or two.

Issue: Complex Prep Recipes

I'm used to spending a lot of time prepping and cooking, but that doesn't work well while we're trying to optimize for time. So from now on, we're trying to use simpler recipes.

Solution: Simplified Recipes

Simple enough: make easier dishes with fewer steps.

Issue: Time to Talk vs Time to Code

There were some awkward moments where people weren't sure when it was a good time to talk and when to focus on work. This happened mostly during the early period of coding, or during cooking.

Solution: Dedicated Scheduling

Jimmy brought this one up: we probably need dedicated time slots so people know when to do what.

We built a schedule template and will try to adhere to it.

  • Social Periods: Intro and eating. Talk as much as you'd like.
  • Work Periods: Keep chit-chat to relevant projects, or take conversations to IM instead. Follow the headset rules.

Selfie including Tommy

Cook and Code 1: Cooking and Coding Together

Cook and Code 1

So Jimmy Vo and I have a time management problem. We try our best to balance what time we have to:

  • Expand and maintain our social circle
  • Learn new tech stuff or work on side projects
  • Cook and learn more recipes

So he came up with a brilliant idea: a meetup dedicated to cooking and coding.

Introducing Orlando’s Cook and Code Together

The concept is simple. Developers meet up. We prep some food together, then code. While the food is either marinating or cooking, we work on some personal development, with fellow developers nearby to bounce ideas or get assistance.

Meeting #1: Beef Burgundy

We chose a simple recipe that takes a while to cook. I grew up eating my dad's beef burgundy: a sweet, rich, meaty stew that takes an entire day to cook.

We found a recipe that can be cooked within 3 hours. You can find it here:

Beef burgundy recipe

Cook and Code Raw Ingredients

We divided the labor so that everyone could get cooking ASAP. I chopped up the vegetables while Jimmy browned the steak chunks. Browning the meat is important in stews, since it adds flavor and preserves some of the texture.


So we threw all the vegetables, the water, and the bottle of wine into a large pot and left it to cook. I made sure it didn't boil over and stirred it occasionally.


Coding Goals

For our coding session, we made goals that could be achieved by the end of the session. I try to use S.M.A.R.T. goals as a framework for choosing what to complete.

Jimmy was playing with Azure, working on hosting his website.

I was playing with React, creating a form page for one of my side projects.

As with any learning experience, we ran into some issues. I hit problems with ES6 syntax and how CommonJS module loading works. Jimmy's Azure account was having issues, and he couldn't get access to the right areas for deployment.

Around the hour mark, I checked on the stew.


We decided to work on figuring out how to get Jimmy's project deployed. I remembered that .NET Core could be deployed on Ubuntu, so we went ahead and worked through a basic deployment there.

It was a great learning experience for Jimmy, since he's from a Windows shop. It involved some nginx configuration and installing the .NET Core CLI on the server.


We pick up a lot of small, unwritten things when we work with others. We learn habits from our friends and family, and lessons from our mistakes. I passed down some of my experience doing sysadmin work and deploying on Linux, and we learned from the repeated errors while trying to get the site to run.

In cooking, I made a big mistake with the initial prep for the stew. Instead of the meat slowly getting softer while staying in one piece, it started breaking apart. The sweet, winey liquid I expected was non-existent.

I forgot to coat the meat in flour.

So we had braised beef instead.


Lessons Learned for a Successful Cook and Code

So with the work completed and full stomachs, we call the inaugural Cook and Code Together a success! We learned some new recipes and food prep, expanded what we knew, and have something to work towards in the future.

Some recommendations we have for others:

  • Choose a recipe you haven’t tried yet
  • Divide the labor
  • Set measurable goals

Cook and Code Schedule Template

We follow this timeline for most Cook and Codes.

  • 11:00 AM - Event start, meal prep, and cooking. Social time while people get settled.
  • 11:30 AM - Start coding while waiting for the food to be ready.
  • 02:00 PM - Food is served.
  • 03:00 PM - Clean up.

We have another Cook and Code coming up soon. We usually keep it small so that the group can work together well. While we could hand the cooking off to just one person, working together on the recipe lets everyone share in the experience.

Fixing Hot Reloads on Vagrant

One of the common issues I've run into when virtualizing my development environments is having to restart my dev server manually.

Whenever I developed locally, everything was fast and changes appeared on the first refresh. But when I used a VM, the frustration of restarting the server killed a lot of my productivity.

The Issue

Most hot reload setups rely on detecting when a file in your project directory has changed. Usually, something watches the directory and listens for changes.

Whenever a file changes, the system kernel emits a notification letting a program know that something changed. On Linux systems this is inotify; on macOS it's FSEvents.

Since we're using a virtual machine, our program is only listening for notifications emitted inside the virtual machine. Files edited on the host and synced into the guest don't trigger those kernel events, so the watcher never fires.
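
If you want to see this for yourself, a minimal sketch with the listen gem (which wraps inotify and FSEvents) works as a quick probe; run it inside the VM against a synced folder, edit a file from the host, and the callback most likely never fires. The script name is just a hypothetical throwaway:

# watch_demo.rb - throwaway probe script, not part of any project
require 'listen'

# Print every change event the kernel reports for the current directory.
listener = Listen.to('.') do |modified, added, removed|
  puts "modified: #{modified.inspect}"
  puts "added:    #{added.inspect}"
  puts "removed:  #{removed.inspect}"
end

listener.start
sleep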

Solution 1: Forward File Change Events

A recommended solution is to add a vagrant plugin to take care of forwarding file events to the VM.

Currently we’re using vagrant-fsnotify to forward events.
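
If you don't have the plugin yet, it installs like any other Vagrant plugin:

    vagrant plugin install vagrant-fsnotify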

After installing the plugin, we:

  1. Add fsnotify: true to our synced folder configuration.

    config.vm.synced_folder ".", "/vagrant", fsnotify: true
    
  2. Restart the machine

  3. Start vagrant fsnotify in a separate tab.

    vagrant fsnotify
    

Solution 2: Polling

An older recommendation is to switch to polling, which periodically checks file modification times for changes. This was Ruby on Rails' default detection method until version 4.

This is a fallback for cases where file change events aren't available. Check your dev server's startup options to see if a polling flag exists.

I don't recommend this, since polling uses more memory and CPU, but it's an option if inotify forwarding is unavailable.

In this Stack Overflow post, they enabled polling via vagrant rsync-auto.
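
As one example of the polling route, here's a minimal sketch for a Rails app (Rails 5 or later, where the file_watcher setting exists): switching the file watcher back to the plain checker makes Rails compare modification times instead of waiting on inotify events that never arrive in the guest.

# config/environments/development.rb
Rails.application.configure do
  # Check file modification times on each request instead of relying on
  # inotify/FSEvents notifications, which don't fire for host-side edits.
  config.file_watcher = ActiveSupport::FileUpdateChecker
end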

Sorting between multiple Models in Ruby on Rails

Recently, I ran into a situation with two different models where I needed to display both types of records together in chronological order. We could grab all the records and apply a simple sort.

  comments = Comment.active.to_a
  posts = Post.active.to_a

  # Combine them together and sort chronologically.
  comments_and_posts = comments + posts
  comments_and_posts.sort_by!(&:created_at)

  # Use Kaminari's array pagination to paginate across the combined set
  results = Kaminari.paginate_array(comments_and_posts).page(params[:page])

While this is easy to read and understand, it becomes problematic as Comments and Posts grow. This method requires loading every record before applying a sort order. Pagination also suffers, since we still need to load both entire sets of records just to slice out a page.

We can leverage the database to handle the sorting for us, but Rails doesn't document a clean ORM way to perform this kind of operation. We can drop down to plain SQL.

Raw SQL Union

We use the UNION operator in SQL to combine both records together, then select from a subquery.

Since there is a possibility that ids are shared between the models, we need to generate a unique_id in these queries.


SELECT id, group_type, unique_id, created_at
FROM (
  SELECT id,
        'comments' group_type,
        CONCAT('comments', id) as unique_id,
        created_at
  FROM comments
  WHERE active = 1

  UNION

  SELECT id,
         'posts' group_type,
         CONCAT('posts', id) as unique_id,
         created_at
  FROM posts
  WHERE active = 1
) as results

ORDER BY created_at desc
--LIMIT 25 OFFSET 200

We can add a LIMIT and OFFSET to use as pagination. Using a raw SQL query, we can pull out the information easily.

# sql_query holds the UNION statement from above
@connection = ActiveRecord::Base.connection
result = @connection.exec_query(sql_query)
comments_and_posts = []

# exec_query returns an ActiveRecord::Result; each row is a hash of column => value
result.each do |row|
  comments_and_posts << row
end

While we could go through each row and look up the record by ID, that would cost us about 25 additional queries (one per row). With a little extra work, we can reduce the number of queries to two.

# Store their order in the array.
id_positions = comments_and_posts.map { |x| x['unique_id'] }

items = comments_and_posts.group_by { |x| x['group_type'] }
items.each do |group_type, item|

  # Get only the ids to reduce amount of queries needed
  ids = item.collect { |x| x['id'] }

  if group_type == 'posts'
    content_items = Post.where(id: ids).to_a
  elsif group_type == 'comments'
    content_items = Comment.where(id: ids).to_a
  end

  # Replace original search with new searched content
  content_items.each do |content|
    unique_id = group_type + content.id.to_s
    content_index = id_positions.index(unique_id)
    comments_and_posts[content_index] = content
  end
end

comments_and_posts

Drawbacks to Raw SQL

By diving into raw SQL, we have direct control over what to query. For smaller queries, this is manageable.

We lose some key features, though, like scopes. Scopes are composable and simplify our understanding of queries. We would have to maintain a separate set of queries if we decide to add filtering.

Raw SQL is also tied to a specific flavor of SQL. If we decide to switch from MySQL to Postgres, some queries would change and break.
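
For context, the active scope used throughout these examples is assumed to be something simple. Here's a hypothetical definition, and a reminder of the composability we give up once we drop to raw SQL:

class Comment < ActiveRecord::Base
  # Hypothetical scope; the real definition may filter differently.
  scope :active, -> { where(active: true) }
end

# Scopes chain with other relation methods without any hand-written SQL:
Comment.active.where('created_at > ?', 1.week.ago).order(created_at: :desc)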

Using Arel to access existing scopes

We're looking to use ActiveRecord's Arel to improve our codebase's readability. Note: since Arel is a private API, it may change without warning. It is expected that Arel becomes part of the public API in Rails 5.1.

We can replace the raw SQL by using Arel as a query builder. For any ActiveRecord::Relation, we can call .arel to drop down into Arel.

Rewriting SQL into Arel Components

Since Arel only builds queries, we'll end up transforming the resulting query builder into a SQL string with .to_sql.

So we’re going to create each part of our query piece by piece.


comments_query = Comment.active.arel
# Clear out the default SELECT columns
comments_query.projections = []

# Select only the columns we need
comments_query.project('comments.id as id',
                       "CONCAT('comments', comments.id) as unique_id",
                       'comments.created_at',
                       "'comments' as group_type")

# comments_query.to_sql generates

# SELECT
#   id,
#   'comments' group_type,
#   CONCAT('comments', id) as unique_id,
#   created_at
# FROM comments
# WHERE active = 1

posts_query = Post.active.arel
posts_query.projections = []
posts_query.project('posts.id as id',
                    "CONCAT('posts', posts.id) as unique_id",
                    'posts.created_at',
                    "'posts' as group_type")

# posts_query.to_sql generates

# SELECT
#   id,
#   'posts' group_type,
#   CONCAT('posts', id) as unique_id,
#   created_at
# FROM posts
# WHERE active = 1

We can then combine both of the queries together with a UNION.

combined_query = comments_query.union(:all, posts_query)

# (#{comments_query} UNION ALL #{posts_query})

query = Arel::SelectManager.new(Arel::Table.engine, Arel.sql("#{combined_query.to_sql} as results"))

# (#{comments_query} UNION ALL #{posts_query}) as results

Calling Arel::SelectManager.new(Arel::Table.engine, sql_string) gives us the equivalent of wrapping combined_query up in a subquery.

limit = 25
offset = 0
sql = query.project(Arel.star)
           .order("created_at DESC")
           .take(limit)
           .skip(offset).to_sql

# SELECT * FROM (#{comments_query} UNION #{posts_query}) as results
# ORDER BY created_at DESC
# LIMIT 25
# OFFSET 0

results = ActiveRecord::Base.connection.select_all(sql)

Standardized Front-end Tools to Increase Productivity

One of the challenges I ran into while learning modern front-end development and frameworks was putting all the ideas together. While I have a few years of back-end development and server-side languages, my experience with client-side work, JavaScript, and SPAs was limited.

Learning difficulties

When I played with Angular, React, or Meteor back in 2014-2015, I had a lot of questions. Some involved the tooling, but many were about how to set up a project.

  • How do I structure a project?
  • What are the recommended best practices?
  • How do I deploy?
  • How do I handle testing?
  • How do I manage importing libraries and other packages?

Lost in the forest

These modern frameworks required a lot more planning, thought, and research for me to get running. Having to develop my own project structure, and going through a few restructures, was frustrating.

I had a little assistance from a local JavaScript expert, Chris McClean, and having fellow developers around helped with the learning process. It would be easier for everyone, though, if there were a simplified process for getting set up and ready to go.

Starter Packs and Generators

One of the ways we used to start projects was with existing starter packs or Yeoman. These gave us a better experience by generating a working project skeleton, and allowed us to get up and running.

Set it up for me

Having sensible defaults set up allows us to move on to the more important areas of our app. Starters usually handle things like:

  • ES6 Transpiling
  • Minification and Compilation
  • Generating a Production Target
  • Wiring up the folders

Starter packs

While this made it a lot easier to build out projects, it had some issues. Some starters are specific to a certain configuration. I used an older Yeoman starter for Angular, which relied on less recent build tools like Bower and Grunt.

It was also laid out in an MVC project structure, while the community recommends a feature-based project structure.

  • Relied on the generator's documentation
  • Could use older, no-longer-current best practices
  • Hard to debug if you don't know what the different tools are doing

We love Official Command Line Tools

It's increasingly common for web developers to use some form of command line tooling to improve productivity. In the JavaScript world, it's almost required to use npm to manage packages.

Recently, the major JavaScript frameworks have released their own CLI tools to help create new projects and components, and to build and serve them. These were based on ember-cli, and allow us to start working without worrying about all the other intricacies.

I've been playing with Angular 2's CLI, and I feel it improves my quality of life for development. Other than generating default code, it offers the following:

  • Well-defined conventions
  • Easier support
  • Keeps my mental overhead small

Conventions reduce conflict

While people argue about different kinds of conventions, most will agree that having conventions is a good thing. With a defined set of conventions for a project, people generally know where to put things.

When the documentation is not a best practice

One of the issues I ran into while learning Angular 1 came from a lack of agreed-upon conventions. While the official documentation showed one way to create directives, the community documentation and best practices recommended a different way.

Have defaults cover the most common use cases

Angular 2 took recommendations and practices from the community and built tooling that supports those best practices. The question of how you should set up your components or services is already answered. Calling ng generate component my-component-name takes care of all the steps for you.
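
As a rough sketch of what that looks like (the exact file names follow the Angular CLI's defaults and may vary between versions):

ng generate component my-component-name

# creates something like:
#   src/app/my-component-name/my-component-name.component.ts
#   src/app/my-component-name/my-component-name.component.html
#   src/app/my-component-name/my-component-name.component.css
#   src/app/my-component-name/my-component-name.component.spec.ts
# and declares the component in the nearest NgModule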

Easier support

One of the benefits of having a standard structure in an application is that it's easier to support the application later.

Problems are Google-able

When we have to create our own structure while we're still learning, lots of minor breakage and mysterious problems appear. We Google the issues, and many of the results are close to our case, but their fixes don't quite work.

When standards exist, we have an understanding of what should happen, and it makes it easier to communicate.

Easier to onboard new developers

Newer people have an easier time looking up documentation and can use the built-in tools to accomplish tasks. When we work on a project with little documentation, or one built on a bespoke framework, it can be difficult to understand exactly what is going on.

I can focus on the big things.

There are times when I feel like I've written the same set of code over and over again, looking up the same documentation each time, because it wasn't obvious how I should structure an area.

Standard tools let me skip that thinking and make the common tasks readily available.