Why every developer needs to know Google Sheets & Excel programming

I've recently talked about different cloud documentation services in Smart(er) Documents – Quip, Notion, Airtable, Coda or Good Old GDocs & GSheets, and shared my take on smarting up Google Docs.

Let's dive right into a few key reasons why every developer should know Google Apps Script (GAS) and get comfortable working with it in Google Sheets and Docs.

1) Provide your technical output (data) on common ground – a tool that pretty much any computer user already knows, or can intuitively learn if they don't

I strongly believe in any tool that allows developers and non-developer roles on a team to communicate complicated information effectively. Usually that information is a data set that will eventually be used for some form of storytelling: website or blog content, an internal or external product performance report, a financial or behavioral analysis, and so on. Among developers only, we always find a hundred different ways to express what we want to show to the world, in the form of scripts using whatever libraries and tools come to hand. But when it comes to handing over our output (whether it's a SQL result in CSV format or a dynamic data set in a service), both we as developers and the party receiving our work start with a lot of constraints. They also have to learn whatever data format we give them. This makes the collaboration one-directional: it starts with the developer and ends with the non-developer role working with that data (marketing, product management, or executive roles).

Real trouble starts when we have to repeat the same work over and over, because the desired data model usually leaves whatever tool we're using as a static CSV/Excel export if it's tabular data. If you are doing the same or similar work multiple times, it pays to think about ways to automate the process. Consider simple family expense tracking in Excel or Google Sheets (because everybody knows spreadsheet tools, or can learn them easily). In this hypothetical scenario, let's say there is a database or an API that provides your credit card statements (there are actually services that extract this information from your bank, though for security reasons the authentication layer gets very complicated). As a developer, you can start the story with "extracting the data," and your output will almost always end up stored somewhere – most likely a database.

You may or may not be the "analyzing" person in the family, and even if you are, you may need to review your analysis with the rest of the family and get their own take on the data set for a truly collaborative understanding and iteration.

Google Apps Script has many ways to connect to third-party APIs to pull data and, if done well, to run these operations either automatically (refreshing the data set) or manually through UI interactions – essentially automating the developer's role entirely inside a "smart" Google Sheet or Doc.

Once this is set up right, it can be copied and extended easily. It goes back to the traditional "macro-enabled Excel template" approach, but it really works out well for everybody. And the Google Docs products are true real-time collaborative tools: you can literally open laptops on a conference call, edit together, and work the same information into an understandable shape with simple aggregate analyses, breakdowns, or even charts, without having to reinvent the wheel and code it all from scratch.

2) Mock up / bootstrap a lot of ideas in a simpler form without needing to code a lot – some of them may even serve long-term use cases

As developers, more than half of what we do is automation. We code so that we don't repeat ourselves. It's a basic human desire to invest our brainpower so we don't have to do time-consuming, less intellectual work. Our brains are wired that way from the moment we're born; they always look for shortcuts. Developers are the hackers who discover these shortcuts and implement them in the digital realm.

Whether you work alone or with a team, there will always be tasks that take multiple steps to produce an output. To avoid repeating yourself, you will want to build tools that get you the same output in fewer steps. We love the buzzword "one-click" to describe the magical, robotic feel of an "easy" tool. Whether the steps shrink to one or just to fewer, there is always room for improvement in the work we do every day.

Developers tend to jump into their own comfort zone and write scripts that do things for themselves. I don't know a single developer who doesn't have scripts for resizing images or orchestrating common tasks, so whenever we need to do the same operation, we just push the "one-click" button and see the output. We usually invest a lot of time building these scripts, regardless of how proficient we are in the languages and environments we know. And we always have issues sharing this work with others on the team, or we invest even more time converting these scripts so they can be used by teammates or by anyone on the internet.

Google's Docs products give us a great canvas, along with constraints that positively shape what we build and ensure that whatever we create in Google Docs, Sheets, or Slides stays in a familiar format that other developers and non-developer team members can use.

To be fair, Google Apps Script has real limitations, especially if you are doing resource-heavy work on Google's servers or in the browser. But it is also super easy to start with in a very basic form, so you don't have to worry much about cosmetics or look and feel.

3) Spreadsheet tools are essentially databases

Every developer has to know relational data modeling and how to work with data of any size and structure. At the core of relational databases, you work with tables and the relationships between them.

Google Sheets and Excel are great tools that developers and non-developers can use extensively together, because a spreadsheet is both a visual tool and structured data by its nature.

The reason both tools have programming interfaces is that programmability is one of their core use cases. I also suspect the companies behind them were pushed into these programming features because they are probably among the most requested.

All in all, using a spreadsheet file as a database – reading structured information from it, injecting new rows, querying and updating existing ones – is one of the most common use cases from a programming perspective. So whether you like it or not, you will eventually work on a product whose users already use (or will use) spreadsheet files alongside it. At a minimum, you will have to learn how to process these files (import, ingest) and how to expose your product's data in these formats (export features).
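To make it concrete, here is a minimal Google Apps Script sketch of the "sheet as a table" idea. The sheet name and column layout are made up for illustration:

```javascript
// Treat a sheet like a table (hypothetical "Orders" sheet with columns: id, customer, total)
function readAndAppendOrders() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Orders');

  // "SELECT": read every row; the first row is assumed to be headers
  var rows = sheet.getDataRange().getValues();
  for (var i = 1; i < rows.length; i++) {
    Logger.log('Order %s for %s: %s', rows[i][0], rows[i][1], rows[i][2]);
  }

  // "INSERT": append a new row at the bottom
  sheet.appendRow([rows.length, 'New Customer', 42.5]);
}
```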

The best way to see how common this need is, is to look at the most popular automation tools on the internet, like IFTTT and Zapier: one of the first integrations they built was Google Sheets.

I personally have many Google Sheets files that are fed by IFTTT or by workers running on my servers, exporting activities daily, weekly, monthly, or on a trigger. Any time I want to see what's going on and analyze my activities (personal ones like my driving history, entering and leaving the NYC area, or my credit card expenses), I can easily build a pivot table to see the trend, or group by categories and other angles to ask different questions of the same data. I can't even count how many variations of this scenario exist for business purposes. Analytics alone covers a long list of things you (or a non-developer team member) can and will want to do with data sets like these.

Wrapping Up

The last reason on the list is probably the strongest from a professional-skill perspective, so you kind of have to know your way around both static Excel/CSV files and the Google Sheets API – from its authentication steps to the actual endpoints and methods for working with spreadsheets.

But #1 and #2 are more important if you are (or want to be) a resourceful problem solver. Every developer is, at the end of the day, a non-stop problem solver, and having Google Sheets in your toolbox is a must.

Just launched my newest product on Product Hunt: Sheet2Cal – organize calendar events right from your Google Sheets doc.

Sheet2Cal helps you sync your event data from Google Sheets to your favorite calendar app. Plan and collaborate on your events (social media/editorial content, wedding and travel planning) freely in Google Sheets and export them easily to a calendar subscription using Sheet2Cal.

Sheet2Cal creates a complete calendar event from the information in whichever Google Sheets doc you choose. It helps you schedule and update your calendar events (such as social media, wedding and travel planning) right from your Google Sheets doc.

Back Story

At Nomad Interactive, we love creating tools that make our lives easier, and we share them with the world when we think they can have the same impact for others. Sheet2Cal was born out of such a need.

When uploading a shot to Dribbble, we did our content planning – title, description, and so on – in a Google Sheets document for team review and collaboration. On top of that, we also created a calendar event with similar information on our shared calendars as a reminder of when to publish each post.

During the review process, every change we made in the planning document had to be repeated in the corresponding calendar entry. To remove this unnecessary step, we built Sheet2Cal and started doing our entire calendar organization from our main Google Sheets document, from creating events to updating them. Then we realized this would work for many different subjects and decided to share it with everyone for free!

Now you can use Sheet2Cal however you wish for your own similar needs! If you have any problems or anything you want to ask, we would love to help!

To show your support, you can visit the Product Hunt page and give us a 💬 shout or an 👍 upvote!

You can also visit the project page here: https://sheet2cal.com/

Smarting up Google Docs and Sheets

In this post, I'd like to talk about my growing experience with Google Docs and Google Sheets, using them for more complex needs and functions.

Ways to use collaborative cloud services for documentation, whether for personal or business/team communication, have been a recent theme of mine. I've also talked about going beyond plain written documents toward smarter, more complex forms of information in the Smart(er) Documents – Quip, Notion, Airtable, Coda or Good Old GDocs & GSheets post.

I like Google Docs from an availability perspective: it doesn't require a complicated pricing structure, and Google keeps it open to personal Google accounts, so most people can access Google Docs even if they don't have a company account (i.e. aren't using G Suite). I also like that Google Docs is visually familiar – close to what all of us are used to from Microsoft Office or other office suites (OpenOffice, LibreOffice). That familiarity can also make the Google Docs products look outdated compared to more modern collaborative editing tools like Notion and Quip.

All of these services have powerful APIs, though arguably not as robust as Google Docs'. Aside from its powerful API, Google Docs also has a secret weapon that I'd like to introduce lightly in this post: Google Apps Script. It's a wide and under-discussed topic online that gives the Google Docs tools a huge edge. I may focus on sub-topics of Google Apps Script down the road.

Google Apps Script

Google Apps Script is a scripting/automation feature that spans the larger family of Google products, including Gmail, Google Calendar, Google Drive, and a few other important Google products you may already be using.

Google Apps Script runs within the Google tool you're using, in your browser (or on your mobile device). Scripts can register additional UI elements in the tools you use (such as a new menu item), watch or listen for changes in the documents you create (events like a row being updated in Google Sheets), or even map parts of your script to elements in the document content itself (buttons, dropdowns, checkboxes, and so on).
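As a tiny illustration of what that looks like (the menu label and function names are made up), a script bound to a spreadsheet can add its own menu and react to edits:

```javascript
// Runs automatically when the spreadsheet opens: registers a custom menu item
function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('My Tools')
    .addItem('Refresh data', 'refreshData') // clicking it calls refreshData()
    .addToUi();
}

// Simple trigger: fires whenever a cell is edited
function onEdit(e) {
  Logger.log('Edited %s, new value: %s', e.range.getA1Notation(), e.value);
}

function refreshData() {
  SpreadsheetApp.getActive().toast('Refreshing…');
}
```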

So far, nothing other tools can't do. Microsoft Excel and Word have had their famous macros for roughly 20 years or more, and almost every alternative suite offers some form of automation with similar capabilities. The real power of Google Apps Script comes when you combine the documents with Google's online/cloud tools like Google Drive, Google Maps, or Gmail. This makes documents interactive with other services, much like using those services' APIs. Another big thing that makes me feel warmer toward the Google Docs tools is real-time collaboration in your docs, which makes the collaborative writing/editing experience superb. My team and I often write content together in real time, and I find the conversational aspect of that collaboration priceless.

Google Apps Script is based on JavaScript, so you write JavaScript snippets to define your custom functionality, with a long list of Google service APIs ready to use.

My favorite Google Apps Script use cases

Pull a dataset from internal services or public sources dynamically into Google Sheets

We use this most often with the analytics services we run internally. These sheets are generally reports we create once but keep updating with the latest version of the data.

Google Sheets already has IMPORTDATA and IMPORTXML functions that pull CSV- or XML-formatted data easily. But often we use a service we didn't build that doesn't expose its data as CSV or XML – usually it's a REST API returning JSON. You can use a helper like https://github.com/bradjasper/ImportJSON, or create your own processor in Google Apps Script to pull the data and shape it the way you want. We usually do the latter.
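Here is a rough sketch of what such a custom processor can look like; the endpoint and field names are hypothetical:

```javascript
// Pull a JSON REST API response into a sheet (URL and fields are made up)
function importOrdersJson() {
  var response = UrlFetchApp.fetch('https://api.example.com/orders');
  var orders = JSON.parse(response.getContentText());

  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Orders');
  sheet.clearContents();
  sheet.appendRow(['id', 'customer', 'total']); // header row

  orders.forEach(function (order) {
    sheet.appendRow([order.id, order.customer, order.total]);
  });
}
```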

Add custom functions to Google Sheets

We use this a lot to create custom functions (generally ones that pull data from other cloud tools we use), like getting Trello card details (title, status, assignee), e.g. =TRELLO("eio3u48d"), or public services like the weather forecast for a zip code, e.g. =WEATHER(11222).
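A custom function is just a regular Apps Script function that you can then call from a cell. Here is what a hypothetical =WEATHER() could look like; the weather API URL and its response shape are made up:

```javascript
/**
 * Returns a short forecast for a zip code, usable from a cell as =WEATHER(11222).
 * The weather API and its response format here are hypothetical.
 *
 * @customfunction
 */
function WEATHER(zipCode) {
  var url = 'https://api.example-weather.com/forecast?zip=' + encodeURIComponent(zipCode);
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  return data.summary + ', ' + data.temperature + '°F';
}
```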

Send emails from Google Docs or Sheets using your Gmail account

This goes into automating your workflow. As I mentioned above, with Google Apps Script you can map UI elements (menus, or button-like elements in the document content) to a custom trigger in your script that does something for you. We sometimes create sheets or docs containing form-like formats or, in the Google Sheets scenario, an action to be taken by the user for each row of data. For this example's sake, think of a contact list with name, email, and thank-you-note columns. We use Google Apps Script to create a button-like action item in a column we define (say, the column next to the thank-you note) with a label like "Send Thank You Note." With Apps Script, we can register this column to accept clicks and trigger a function. The function can then pull the clicked row number and the values in that row for the name, email, and thank-you note. Then, with a few lines of code, we can use the Gmail service API (without complicated SDK installations and, more importantly, without dealing with authentication) to send an email to the recipient with the content we want (in this case, the thank-you-note column as the email body). This is a huge convenience compared to building out the same capability in a separate service or in custom code from scratch.
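A minimal sketch of that flow (the column positions and the "Sent" marker column are assumptions):

```javascript
// Send the thank-you note for the currently selected row.
// Assumes columns: A = name, B = email, C = thank-you note, D = status.
function sendThankYouNote() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var row = sheet.getActiveCell().getRow();
  var values = sheet.getRange(row, 1, 1, 3).getValues()[0];
  var name = values[0], email = values[1], note = values[2];

  // Gmail service: no SDK installation, no OAuth dance to set up yourself
  GmailApp.sendEmail(email, 'Thank you, ' + name + '!', note);

  // Mark the row so we don't send it twice
  sheet.getRange(row, 4).setValue('Sent ' + new Date());
}
```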

Push Google Sheets data to your calendar and keep it updated

Another great use case is pushing timeline-based planning to Google Calendar and keeping it updated. We do this in a similar fashion to the previous scenario, but using the Google Calendar service in Google Apps Script.
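A rough sketch of that, assuming a "Plan" sheet with title, start, and end columns:

```javascript
// Push rows from a "Plan" sheet into the default Google Calendar
function pushPlanToCalendar() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Plan');
  var rows = sheet.getDataRange().getValues();
  var calendar = CalendarApp.getDefaultCalendar();

  for (var i = 1; i < rows.length; i++) { // skip the header row
    var title = rows[i][0];
    var start = new Date(rows[i][1]);
    var end = new Date(rows[i][2]);
    calendar.createEvent(title, start, end);
  }
}
```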

Wrapping up

There are many more interesting use cases for Google Apps Script, and there are many community-created and maintained lists and directories of great Google Apps Script examples and resources.

https://github.com/contributorpw/google-apps-script-awesome-list is a good start. The GitHub topic search is also worth a look: https://github.com/search?q=topic%3Agoogle-apps-script

The official Google Apps Script documentation is here: https://developers.google.com/apps-script

Launch of my latest product Screenshot Tracker

🚨 I've just launched a new product (also on 😸 Product Hunt): 📸 Screenshot Tracker, a desktop app that helps you capture full web page screenshots in different resolutions at once.

Screenshot Tracker helps you to collect:

  • 🖼 Full-size screenshots for your web pages
  • 🎳 Multiple URLs at once
  • 📱 In multiple resolutions (desktop, tablet portrait, landscape, and mobile)
  • 🔁 Repeat/Retake screenshots

with one click!

Screenshot Tracker is great for designers and developers who want to check their work while iterating on a web page's design and development.

And it’s 100% free & open source. PRs welcome!

Project Homepage: https://screenshot-tracker.nomadinteractive.co/

Also, give me your support on the Product Hunt page

Working with Memcache on your NodeJS app on Heroku

I have written about Heroku in the past, showing my love for this amazing service. Heroku helps developers start writing and deploying apps much more easily than any other service, which lets them try things out and bootstrap their ideas in a matter of minutes. I talked more about Heroku in this article: https://mfyz.com/using-heroku-for-a-quick-development-environment/

Heroku Marketplace

I have also mentioned before that Heroku has a third-party add-on marketplace. It covers all kinds of application infrastructure and cloud services, from fully hosted databases and CDNs to transactional email delivery. The marketplace is filled with countless gems that are worth a weekend of exploring – great services that work with Heroku within a few clicks or commands.

Why cache? Why Memcache?

One need that almost every distributed cloud application shares is fast, memory-based caching. There are different ways and models to store and use data from memory-based caches, but the most common is key-value storage with a TTL (time to live), which keeps the stored data short-lived and regularly cleaned up.

The primary reason to use a memory-based cache is to optimize the performance of an application. In general terms, you cache frequently used data to cut down on database queries and disk reads, as long as you have reasonable confidence that the data you are caching doesn't change often within its lifetime in the cache. Even when the data can change, there are simple ways to refresh the cache on "update" events so the cached data is always valid while still cutting expensive operation time out of the code flow.

As I mentioned, there are many different memory-based storage engines, protocols, and services. One of the simplest yet most effective and widely used engines is Memcache. It is an open-source tool that can easily be installed and hosted on any server; it runs as a daemon and simply accepts storage requests and queries on an open port. Memcache is designed to store and serve all of its data from the server's physical memory, so it is very fast and responsive when combined with a high-throughput network between the Memcache and application servers.

Heroku also supports multiple providers for Memcache and other memory-based storage engines. One particular service, Memcachier, makes it very seamless to create an instance and set it up on a Heroku application.

Memcachier

The Memcachier instance is free and can be added without any further account details on an external site, but it comes with very limited resources: a 25 MB total cache size and a 10-concurrent-connection limit. Even though that number is small, connections from a Node.js application are mostly long-lived, so it will be more than enough if you are not planning to have many environments connecting at once. It is certainly enough to get a quick Memcache server up and running to build against. For more details about the Memcachier service and its pricing model, here is the Heroku Elements page explaining all that: https://elements.heroku.com/addons/memcachier

To add this add-on to your application, run the following command in your project folder:
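The command looks something like this (the free plan is called dev as of this writing; double-check the plan name on the add-on page):

```bash
heroku addons:create memcachier:dev
```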

This will add the Memcachier connection settings to your Heroku config as environment variables. Copy them into your local .env file to be able to connect from your local development environment.

To use Memcache in a Node.js app, we will use an npm package called "memjs". To install it, run:
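```bash
npm install memjs --save
```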

Using the package is pretty straightforward; here is a short example along the lines of the one in Memcachier's Heroku getting-started guide:
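A minimal sketch of that usage (not the guide's exact code), using the MEMCACHIER_* environment variables the add-on sets:

```javascript
const memjs = require('memjs');

// Memcachier sets these environment variables on Heroku;
// copy them into your local .env for development.
const mc = memjs.Client.create(process.env.MEMCACHIER_SERVERS, {
  username: process.env.MEMCACHIER_USERNAME,
  password: process.env.MEMCACHIER_PASSWORD,
});

// Store a value for 60 seconds, then read it back.
mc.set('greeting', 'hello from memcache', { expires: 60 }, (err) => {
  if (err) throw err;
  mc.get('greeting', (err, val) => {
    if (err) throw err;
    console.log(val.toString()); // "hello from memcache"
  });
});
```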

Usage is as simple as calling the get and set functions to store and retrieve a value by key.

I created a more visible version of this example with a relatively long, time-consuming mathematical calculation: an expensive computation (or data retrieval, such as a slow database query) is cached in Memcache, so it is calculated on the first request and then served from Memcache on consecutive requests instead of being recalculated. In my example, I used a prime-number calculation, which takes time.

https://github.com/mfyz/heroku-memcachier-example

Developing and Deploying Node.js (Express) apps on Heroku

Heroku is an amazing platform for getting a quick development environment up and running on smart virtual instances. There is no hassle in getting the additional services you may need for a quick-and-dirty app from the ground up. I've already written about how to use Heroku for a quick development environment before: https://mfyz.com/using-heroku-for-a-quick-development-environment/

This short article is specifically about developing and deploying Node.js and Express apps on Heroku. There is actually not much difference between deploying a Node.js app and a PHP application or one in another language. The Heroku CLI automatically detects that an application is a Node.js app from its package.json file, along with its entry point.
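For reference, a minimal package.json that Heroku can deploy might look like the sketch below; Heroku runs npm start by default, and the name and versions here are just placeholders:

```json
{
  "name": "my-express-app",
  "version": "1.0.0",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.17.1"
  }
}
```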

For the Express-related parts, just go ahead and look at the very simple example I put up on GitHub in the past:
http://github.com/mfyz/heroku-nodejs-express-helloworld

Another, more detailed Express example that uses the Pug template engine for its layouts and views:
https://github.com/mfyz/express-pug-boilerplate

Aside from the application itself, there are a few key points I have found helpful when creating and deploying Node.js apps.

Use environment variables

Using environment variables is the best way to provide configuration details to your application.
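For example (the variable name below is just an illustration), you set config vars on Heroku with heroku config:set and read them in your code through process.env; locally, a package like dotenv can load the same values from a .env file:

```javascript
// Read configuration from the environment instead of hardcoding it
require('dotenv').config(); // optional: loads .env in local development

const apiBaseUrl = process.env.API_BASE_URL; // set via: heroku config:set API_BASE_URL=...
const port = process.env.PORT || 3000; // Heroku injects PORT for your web dyno

console.log('Using API at', apiBaseUrl, 'listening on port', port);
```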

Setting which Node.js version your app will use

It's as simple as adding an "engines" object to package.json and defining your Node.js version in the "node" property inside it, like this:
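For example (the version here is just a placeholder):

```json
{
  "engines": {
    "node": "14.x"
  }
}
```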

The same applies to npm and yarn versions, which can also be defined within the engines object.

Use prebuild and postbuild steps for any additional work your application build needs

By default, Heroku builds your application on every deployment. This doesn't mean much for pure Node.js applications, but your app may need a build step – gulp, grunt, or webpack builds, for example. For this, Heroku will run the "build" npm script if it exists in package.json. Aside from this, Heroku always installs dependencies with npm install as the minimum build step. If you need additional steps before or after the build, you can define them as npm scripts named heroku-prebuild and heroku-postbuild.
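Here is a sketch of how those scripts sit in package.json (the commands themselves are placeholders):

```json
{
  "scripts": {
    "heroku-prebuild": "echo 'Runs before dependencies and build'",
    "build": "webpack --mode production",
    "heroku-postbuild": "echo 'Runs after the build step'",
    "start": "node index.js"
  }
}
```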

Utilize Heroku add-ons

Remember, Heroku comes with tons of third-party services, many of which have free tiers that are enough to try things out and start coding quickly. One of my favorites is Heroku's own database service, which provides a PostgreSQL database with a single command:
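The command is along these lines (plan names change over time, so check the current free tier; hobby-dev was the free plan back then):

```bash
heroku addons:create heroku-postgresql:hobby-dev
```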

Wrapping up

All in all, Heroku is a great cloud platform that allows developers to kick off ideas, starting with simple code and growing into complex distributed applications very easily. In my opinion, it should be among the go-to tools in every engineer's arsenal.

Importance of changelogs

It's very important to keep a changelog and to be disciplined about maintaining it as your product progresses. This applies to both product management and software development and management processes.

I want to mention a great resource that describes changelogs and is almost accepted as a standard in the open source community:

https://keepachangelog.com

What is a changelog?

A changelog is a file that contains a curated, chronologically ordered list of notable changes for each version of a project.
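A minimal sketch of the format keepachangelog.com recommends (the versions and entries below are made up):

```markdown
# Changelog

## [1.1.0] - 2020-03-01
### Added
- Export reports as CSV.

### Fixed
- Crash when the settings file is missing.

## [1.0.0] - 2020-01-15
### Added
- Initial release.
```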

Why keep a changelog?

To make it easier for users and contributors to see precisely what notable changes have been made between each release (or version) of the project.

Who needs a changelog?

People do. Whether consumers or developers, the end-users of software are human beings who care about what’s in the software. When the software changes, people want to know why and how.

An example Changelog (from Stretchly)

Github renders this markdown changelog beautifully: https://github.com/hovancik/stretchly/blob/master/CHANGELOG.md

Single JavaScript file Node/Express Instagram authentication (OAuth) and fetching user photos

Get Ready

Register your Instagram app in their developer portal and obtain your client ID and client secret keys. To do that, follow the steps below:

  • Go to the IG Developer page: https://www.instagram.com/developer/
  • Click the "Register your application" button (if you are not logged in, you will be asked to log in at this step)
  • Fill in all the fields. The only field you need to pay attention to is "valid redirect URLs". This is where your app is publicly hosted. Below, we will create a URL in the application to capture the Instagram authentication callback after the user goes through the Instagram permission dialog and comes back to this URL. It'll be something like https://yoursite.com/instagram/callback
  • Once you register the app, the page will display the client ID and secret. Keep these handy for the next steps.

Code it up

Let's set up a plain Node.js and Express application.

First, install the required packages:
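Assuming the Passport-based approach used in the repo linked at the end of this article, something along these lines should cover it:

```bash
npm install express express-session passport passport-instagram request --save
```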

index.js (or server.js)
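Here is a single-file sketch of that flow. It is not the exact code from the repo below, and it targets the legacy Instagram API endpoints this article was written against, so treat it as an outline:

```javascript
// Minimal single-file sketch. Fill in your own client id/secret and public callback URL.
const express = require('express');
const session = require('express-session');
const passport = require('passport');
const InstagramStrategy = require('passport-instagram').Strategy;
const request = require('request');

passport.use(new InstagramStrategy(
  {
    clientID: process.env.INSTAGRAM_CLIENT_ID,
    clientSecret: process.env.INSTAGRAM_CLIENT_SECRET,
    callbackURL: 'https://yoursite.com/instagram/callback',
  },
  (accessToken, refreshToken, profile, done) => {
    // Keep the access token so we can call the API later.
    done(null, { id: profile.id, username: profile.username, accessToken });
  }
));

passport.serializeUser((user, done) => done(null, user));
passport.deserializeUser((user, done) => done(null, user));

const app = express();
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));
app.use(passport.initialize());
app.use(passport.session());

// Kick off the OAuth flow
app.get('/instagram', passport.authenticate('instagram'));

// Instagram redirects back here after the permission dialog
app.get('/instagram/callback',
  passport.authenticate('instagram', { failureRedirect: '/' }),
  (req, res) => res.redirect('/photos')
);

// List the user's recent photo URLs (legacy Instagram API endpoint)
app.get('/photos', (req, res) => {
  if (!req.user) return res.redirect('/instagram');
  const url = 'https://api.instagram.com/v1/users/self/media/recent/?access_token='
    + req.user.accessToken;
  request(url, { json: true }, (err, response, body) => {
    if (err) return res.status(500).send(err.message);
    res.json(body.data.map((m) => m.images.standard_resolution.url));
  });
});

app.listen(process.env.PORT || 3000);
```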

Deploy and run

Either locally or after you place it on your server, run:
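For example:

```bash
node index.js
```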

Tip 1: Use "forever" on your server to keep this application running permanently.

Tip 2: For experimental purposes, you can run this app locally and use a tunneling tool like "ngrok" to expose your local port to the public with a quick domain assignment. Ngrok will provide a URL (a random subdomain on their domain); you have to update the IG developer app's settings to add this domain as a valid redirect URL, otherwise, after this app redirects the user to Instagram for authentication, it will return an error.

Get the real thing

The code above was a quick-and-dirty version. I put a somewhat more proper Express application version on GitHub. It uses Pug for its views and has proper layout/content block separation as well.

https://github.com/mfyz/express-passport-instagram-example

Make your car smarter with Automatic

After I got my last car about 8 months ago, I invested some money in making it smarter, like getting Apple CarPlay installed. One of the things I did was install a mini device that makes my car a little bit smarter. The device is called Automatic.

This tiny gadget connects to the car's diagnostic port, from which it receives its power, and it reads and monitors the car's data, like engine lights and such. The device contains a GPS tracker and a SIM card for continuous low-speed connectivity (think GPRS), which is more than enough to store its monitoring data in the cloud.

Here is the cool stuff you can do with Automatic in your car:

  • Parking tracking – probably the best feature. I hate trying to remember where I parked on the street. This removes the need for that: open the app and check where the car is 🙂
  • Track when you start and finish driving. It works like an anti-theft alarm: you get a notification if someone starts the engine – sweet.
  • Track the whole trip you drove. It draws the driving path on the map, and you can browse your driving history like Uber receipts.
  • These driving records also contain average speed, gas consumption, and more.
  • Driving style – how often you stop, how aggressively you accelerate, and so on.
  • IFTTT integration – this multiplies the possibilities. You can set triggers like driving away from a geofence or arriving at one, which allows smart behaviors like turning on the garage lights when I arrive home.

These are somewhat novelty features, but if you are a data nerd like me, the automatic tracking of your driving data alone is enough value.

https://automatic.com