ONLINE EXCLUSIVE

Programming in a Pandemic

Meet Avi Schiffmann, a teen who created a Covid-19 tracking website that has been viewed by millions of people

COURTESY OF AVI SCHIFFMANN 

TEENAGE TECH GENIUS: Avi Schiffmann, a 17-year-old web developer from Washington state, is responsible for creating and maintaining one of the web’s most popular Covid-19 trackers.

When Avi Schiffmann was 7 years old, he started teaching himself about computers. Now, at age 17, he runs one of the world’s most visited websites from his home in Washington state. The site, nCoV2019.live, provides up-to-the-minute information about Covid-19 cases and deaths around the world. At first, Avi thought the website would just be an interesting project. He had no idea it would take the world by storm.

Science World recently spoke with Avi about why he built his Covid-19 tracker. He shares how he learned to program, or write instructions that tell a computer what to do, and what it’s like to maintain one of the world’s most popular sites—all from his bedroom.

How did you learn how to make websites?

For me, the best way to figure out new things is to start a project and learn as I go. One example is learning how to make a website load quickly on someone’s phone or computer so the site is interesting and easy to use.

My advice for anyone who wants to learn new computer programming skills is to start small and learn by doing. You don’t have to know everything about HTML—the computer language used to display things like text and images on websites—before you start making a site. Just learn one little thing, like how to write HTML code to change a website’s color, and go from there.
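To see how small that first step can be, here is a minimal sketch in TypeScript, the kind of code that can run inside a web page. In plain HTML, changing a color can be a single attribute; the function below does the same job from script. The names and colors are illustrative, not taken from Avi’s sites.

```typescript
// A tiny "learn one little thing" example: changing a page's colors.
// In plain HTML this could be as simple as:
//   <body style="background-color: navy">
// The TypeScript version below does the same thing from script.
function setPageColors(background: string, text: string): void {
  document.body.style.backgroundColor = background; // page background
  document.body.style.color = text;                 // default text color
}

// Try it once the page has loaded:
setPageColors("navy", "white");
```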

Did you work on any other websites before building the Covid-19 website?

Courtesy of Avi Schiffmann

EARLY WORK: Before building the nCoV2019 hub, Avi had created a website called School Sports using similar computer code.

Before making this site, I made one for my school that pulled together the sports scores for all the schools in my county. I called it School Sports, and it looked a lot like ESPN’s website, which was pretty cool. It used a technology called web scraping. It’s a way to gather lots of data from many websites quickly, using computer code that I wrote. I used the same idea for my Covid-19 site. Health departments all around the world are always updating and posting their statistics online, and web scraping is the best way to get that information as soon as it is available.
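For readers curious what a scraper actually looks like, here is a hedged sketch in TypeScript that uses only the standard fetch API. The web address and the text pattern are made up for illustration; each real health-department page would need its own pattern.

```typescript
// Illustrative web scraper: download a page and pull out one number.
// The URL and the regular expression below are hypothetical.
async function scrapeCaseCount(url: string): Promise<number | null> {
  const response = await fetch(url);         // download the page's HTML
  const html = await response.text();
  // Look for text like "Total cases: 1,234" somewhere in the page.
  const match = html.match(/Total cases:\s*([\d,]+)/i);
  if (!match) return null;                   // the page layout changed
  return Number(match[1].replace(/,/g, "")); // "1,234" -> 1234
}

// Usage, with a made-up address:
scrapeCaseCount("https://example-health-department.gov/covid")
  .then(count => console.log("Cases found:", count));
```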

Why did you decide to build your Covid-19 website?

In late December or early January, I began hearing about coronavirus, which causes Covid-19, in the news. But it was basically impossible to find accurate information about how many cases there were and what was going on. I would read articles, but the information was out of date by the time I saw it. I wanted to make it easier for people to get the latest data on cases. So I created my website. The data collection was easier back then because not every country was affected yet. I never expected this to become a global pandemic. At the time, it was just a project I was interested in.

As Covid-19 became more widespread, what challenges did you face in gathering data from around the world?

By February or March it was so hard! Health departments kept changing the formats for their Covid-19 case numbers. One day they might display data in a table, the next day in a PDF document. So I had to build web scrapers to get data from every kind of format. Now, thankfully, the formatting of more recent data has become standardized.
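One common way to cope with shifting formats is to give each kind of source its own small parser and pick the right one for each site. This TypeScript sketch assumes the raw text has already been downloaded (and, for a PDF, already converted to plain text); the patterns are invented for illustration.

```typescript
// One parser per publishing format; the patterns here are invented.
type Parser = (raw: string) => number | null;

const parsers: Record<string, Parser> = {
  // A number sitting in an HTML table cell.
  htmlTable: raw => {
    const m = raw.match(/<td>([\d,]+)<\/td>/);
    return m ? Number(m[1].replace(/,/g, "")) : null;
  },
  // A labeled line in text extracted from a PDF.
  pdfText: raw => {
    const m = raw.match(/Confirmed cases:\s*([\d,]+)/i);
    return m ? Number(m[1].replace(/,/g, "")) : null;
  },
};

function parseCases(format: string, raw: string): number | null {
  const parser = parsers[format];
  return parser ? parser(raw) : null; // unknown format: fail gracefully
}
```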

Are you planning to make additions to the site?

I do still want to add new features that display data in different ways that might be useful to people visiting the site. For example, I want to make a simulator of disease spread. It could show how many fewer cases there might be in an area if more people there wore masks.
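A toy version of that simulator could treat spread as generations of new infections, with masks shrinking how many people each case infects. Every number in this TypeScript sketch is made up for illustration; a real epidemiological model would be far more careful.

```typescript
// Toy spread model: each case infects r new people per generation,
// and mask wearing reduces r. All values are illustrative.
function simulateTotalCases(
  initialCases: number,
  baseR: number,      // new infections per case with no masks
  maskUsage: number,  // fraction of people wearing masks (0 to 1)
  maskEffect: number, // fraction of spread a mask blocks (0 to 1)
  generations: number
): number {
  const r = baseR * (1 - maskUsage * maskEffect);
  let current = initialCases;
  let total = initialCases;
  for (let g = 0; g < generations; g++) {
    current *= r;     // new cases in this generation
    total += current;
  }
  return Math.round(total);
}

// More mask wearing means fewer total cases:
console.log(simulateTotalCases(100, 2.5, 0.2, 0.5, 5)); // few masks
console.log(simulateTotalCases(100, 2.5, 0.8, 0.5, 5)); // many masks
```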

The information on your website loads very quickly. How do you make that happen?

This happens through something called server-side programming. There are 247 different web scrapers at work to make my site possible, pulling information from around 250 websites. Can you imagine if every one of those scrapers had to run every single time somebody went to the site? That would make the load time much slower. Instead, I run the scrapers in advance, every few minutes, and store the latest results on my web server—a dedicated platform to run computer software. When you go to my site, it’s just pulling up those stored results, which happens very quickly.
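In code, that pattern (scrape on a timer, serve the stored copy) might look like this TypeScript sketch built on Node’s built-in http module. The refreshCache function stands in for running the real scrapers; nothing here is Avi’s actual code.

```typescript
import * as http from "http";

// The latest scraped data, kept in memory as a ready-to-send string.
let cachedResults = "{}";

async function refreshCache(): Promise<void> {
  // Stand-in for running all the real scrapers and merging their output.
  cachedResults = JSON.stringify({ updatedAt: new Date().toISOString() });
}

// Run the scrapers in advance, every five minutes, not once per visitor.
refreshCache();
setInterval(refreshCache, 5 * 60 * 1000);

// Each visitor simply receives the stored results, which is fast.
http.createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(cachedResults);
}).listen(8080);
```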

How has working on this site changed your life?

To be honest, the project has been a lot of work. I’d love to just lie on a beach for a while and not have to think about it. But with so many people relying on the site right now, I can’t just stop.
