The InBetweeners’ Journey

By Amy Brandt, Sarah Breen, Kerri Gardner and Allison Liedtke
Mentors: Alan Chen and Dillon Eng

Our Product: Account Performance Monitor

Our product is a dashboard created specifically to help the Digital Advertising Advocates (Specialists) manage their accounts. When the page loads, a specialist can select their name, which loads their data from the database. The specialist can then filter by any combination (including none at all) of the following: customer, package, service type, and search engine. The main feature of the dashboard is a graph that tracks the average spend-to-target ratio for the last three months. The dashboard also tracks the current underspend across all of that specialist’s accounts and the current streak of days with no underspend.


Why It Is Useful

This dashboard is useful to the specialists because it is faster and more efficient than the Tableau dashboard they currently use; it is also more focused and more user-friendly. The graph shows the specialist’s average spend-to-target ratio, that is, the amount of money spent divided by the target budget. The closer the line is to one, the closer they are to spending their target amount for that month. The dashboard also shows the underspend amount, which helps the specialist know when they need to spend more on a certain account. The specialist can switch between filters without hassle, and the information updates almost immediately.
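The two metrics above are straightforward to compute. Here is a minimal sketch in plain JavaScript; the account records and field names are hypothetical, not the dashboard’s real schema:

```javascript
// Hypothetical account records; the field names are illustrative only.
const accounts = [
  { spend: 900, target: 1000 },
  { spend: 450, target: 500 },
  { spend: 300, target: 600 }
];

// Average spend-to-target ratio: spend divided by target budget, averaged
// across accounts. A value near 1 means the specialist is on budget.
function averageSpendToTarget(rows) {
  const total = rows.reduce((sum, r) => sum + r.spend / r.target, 0);
  return total / rows.length;
}

// Underspend: the total amount by which accounts are below their targets.
function totalUnderspend(rows) {
  return rows.reduce((sum, r) => sum + Math.max(0, r.target - r.spend), 0);
}
```

For the sample data above, the average ratio is about 0.77 (under target) and the underspend totals 450, which is the kind of signal that tells a specialist where more spend is needed.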


To create our product, we started by making the UI using a combination of HTML, CSS, JavaScript, jQuery, and D3 (a JavaScript graphing library that we used to build our graph). Most of us were familiar with these languages, so this part was not as challenging to complete. At first we used mock data to create our UI, but once we had the majority of the parts functioning, we switched to making a server and database connection. For this, we used Node.js, a server-side JavaScript runtime that allowed us to create a full-stack web application; using the mssql package on npm, we were able to get data from the database and then send it to our UI, all in JavaScript. Once the data was sent to the UI, we used a filter that we had created with Crossfilter (a JavaScript filtering library) to narrow down the information being shown based on what the specialist picked on the UI. These technologies were new for us, so implementing them in our project proved more challenging and took longer to complete; however, once we had them up and running, we were able to move on to the testing phase. To test our business logic we used a testing library called Jasmine to make sure all of our filters were working correctly and that the data being shown was accurate. Finally, after testing was complete, we worked on refactoring our code to make it cleaner and more efficient.
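The server-side data path can be sketched as follows, assuming the mssql package from npm; the connection settings, table, and column names here are placeholders, not the real schema:

```javascript
// A minimal sketch of a server-side query using the mssql npm package.
// The connection settings, table, and column names are placeholders.
async function getSpecialistData(specialistName) {
  const sql = require('mssql'); // required lazily so the sketch stands alone

  const pool = await sql.connect({
    server: 'db-host',        // placeholder host
    database: 'advertising',  // placeholder database name
    user: 'app',
    password: 'secret',
    options: { trustServerCertificate: true }
  });

  // A parameterized input inserts the specialist's name into the query safely.
  const result = await pool.request()
    .input('name', sql.NVarChar, specialistName)
    .query('SELECT * FROM AccountPerformance WHERE Specialist = @name');

  return result.recordset; // rows as JSON-ready objects, sent on to the UI
}
```

The parameterized `@name` input is the idiomatic way to pass user-entered text into a query with mssql, rather than concatenating the name into the SQL string.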

How It Works

Our application is composed of three main layers: the client, the server, and the database. The client and server are tied together with WebSockets, using event-based communication that enables our app to receive information and update on demand. The main interaction between layers occurs when a specialist’s name is selected from the search box. Once a name is selected, an event is triggered, running a check to make sure the text entered is a specialist’s name. If it is, the client-side socket emits the text entered in the search box. Our server-side socket receives this emitted message, inserts the specialist’s name into the query, and sends the query to the database to retrieve all of the information about that specialist. The server contains an event-based function that runs once the database has gathered the requested information and returned it to the server-side socket. This function emits another socket message, sending the specialist’s information as a list of JSON objects. The client-side socket receives this message and passes the data into the Crossfilter function, which parses the JSON objects into a series of JavaScript objects of arrays. These objects are received by the UI JavaScript files and used to populate the filter boxes, graph, underspend, and streak. Now that all of the specialist’s information is on the client side, no more communication between the layers is necessary until another specialist’s name is selected. Instead of querying the database for a new dataset each time different filters are selected, each select box notifies the Crossfilter whenever its state changes. The Crossfilter uses these selections to create a dataset specific to that unique combination and passes the new dataset to the dashboard.
Because all of this processing is done on the client side, the information shown for a given filter selection updates nearly instantaneously.

Our Experience


From our internship experience this summer, we have taken away many insights and a tremendous amount of useful knowledge. First of all, we were all able to gain experience in technology and programming. This experience will really benefit us as we head off to our first year at college and begin to consider where we want to work in the future. Something that was especially important in relation to technology and programming was our exposure to both the backend and frontend sections of our project. This enabled us to have a well-rounded knowledge of how our project was built. In addition, we learned a lot about teamwork and how a team runs at a software company. For example, we learned how to act within a team and that it is often necessary to split up tasks individually or between partners. We also learned about agile development and the benefit of daily standups and weekly sprints. While creating our project, we saw the advantage of breaking down a big project into small and achievable goals. This was valuable because it meant our project was always evolving, and it was helpful to complete one small, focused task, get it to work, and then move on to another. Our final takeaway is the idea that one learns from one’s mistakes. Throughout the internship we each had different struggles to work through. These struggles, although difficult at the time, taught us a lot about programming and ended up benefitting our project by making it stronger as we worked through them.

Besides the experience we gained in programming and software development, we gained insight into Cobalt as a whole and useful advice for the future. Leaving this internship, we have a good sense of what Cobalt does and how our project can affect the company. In addition, every Friday the girl interns would attend a lunch where we would meet with a Cobalt employee and discuss their career. This was very insightful because we received helpful advice on things like interviews, following your passion, and holding your own in a male-dominated field. In a nutshell, working at Cobalt was a very enjoyable experience for all four of us, and we gained an immense amount of knowledge.

About collectivegenius
Everyone has a voice and great ideas come from anyone. At Cobalt, we call it the collective genius. When technical depth and passion meet market opportunity, the collective genius brings its best to the table and our customers win.
