Jan 31, 2020 newsletter

GitHub gets serious about deep learning

GitHub announced a new feature to better match open source contributors with issues that are likely to fit their interests and skill level. GitHub now recommends "good first issues" in open source projects for developers who are hoping to get more involved but are not sure what tasks to complete first.

This is the first deep-learning-enabled product to launch on the GitHub platform. It marks a significant step for GitHub into uncharted territory.

Take one: GitHub first announced an early version of its "good first issues" feature in early 2019, but it placed the burden of triaging and labeling issues—with tags like "good first issue" and "beginner friendly"—on project maintainers. As a result, only 40% of repositories had issues that could be recommended to developers.

Take two: GitHub’s updated recommendation system now works automatically and learns from existing issues. GitHub can recognize patterns in issues that were closed by developers who had never contributed to a repository before. That helps it identify similar issues that could be worthwhile for other beginners.
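GitHub hasn't published how its model works, but the pattern-matching idea above can be sketched with a toy recommender: treat issues that were closed by first-time contributors as a "beginner friendly" signal, then rank open issues by how much their labels overlap with that signal. Every function name and data shape below is an illustrative assumption, not GitHub's actual system.

```python
# Toy sketch (not GitHub's implementation): learn which labels appear on
# issues closed by first-time contributors, then recommend open issues
# whose labels carry the most of that signal.

def beginner_label_counts(closed_issues, prior_contributions):
    """Count labels on issues closed by developers with no prior commits."""
    counts = {}
    for issue in closed_issues:
        if prior_contributions.get(issue["closed_by"], 0) == 0:
            for label in issue["labels"]:
                counts[label] = counts.get(label, 0) + 1
    return counts

def recommend(open_issues, label_counts, top_n=3):
    """Rank open issues by how strongly their labels suggest 'good first issue'."""
    scored = [
        (sum(label_counts.get(label, 0) for label in issue["labels"]), issue["id"])
        for issue in open_issues
    ]
    scored.sort(reverse=True)
    return [issue_id for score, issue_id in scored[:top_n] if score > 0]
```

A real system would learn from issue text and project context rather than labels alone, but the shape of the problem is the same: past first-time contributions become training signal for ranking open issues.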

Why it's a big deal: GitHub’s new feature combines the power of GitHub’s troves of developer data—more than 40 million registered users—with its growing automation capabilities. Add machine learning into the mix, and GitHub is uniquely positioned to make existing developer workflows smarter and create powerful new ones that were not possible before.

GitHub also plans to create a personalized set of recommended next issues for developers who have already contributed to a project, so they know what to work on next.

Now that GitHub has taken the first step in expanding its machine learning capabilities, it can begin to replace even more manual workflows.

Google pays big bucks for bug bounties

Like many big technology companies, Google pays developers for discovering security issues in its ecosystem of products, tools, and services. Its latest annual bug bounty report makes the trend clear: Google is paying more than ever, and it is investing huge amounts of money and resources into expanding its bounty programs.

Google pays big money. Since starting its bug bounty program in late 2010, Google has paid out $21M in rewards to developers and security researchers.

Google paid out $6.5M in just the last year. That’s up from $3.4M in 2018. More developers are participating, too: 461 bug hunters received a reward in 2019, up from 317 in 2018.

What products saw the most rewards? Google spent $1.9M for issues with its Android platform, $1M for Chrome bugs, and $800K for flaws in the Google Play store.

More money is on the way. Google is boosting its bug bounty program for the future, too.

Google’s top reward for hacking Android jumped to $1M. That’s in addition to a new $500K maximum reward for bugs detected in preview versions of Android. Google is even willing to pay researchers for finding security issues in third-party apps in the Play Store that have over 100 million installs.

It’s not just Android that's seen increased rewards: last year, rewards for bugs found in Chrome and Chrome OS doubled to $30K. Rewards for fuzz testing, which involves feeding a program random or malformed inputs to uncover crashes and other bugs, also doubled to $1K.
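As a rough illustration of the fuzz testing idea, assuming nothing about Google's actual harnesses, here is a minimal fuzzer that feeds random byte strings to a deliberately buggy parser and collects the inputs that make it crash:

```python
import random

def parse_record(data: bytes) -> int:
    """Deliberately buggy parser: treats the first byte as an offset
    into the payload, and never checks that the offset is in bounds."""
    return data[data[0]]  # raises IndexError when the offset points past the end

def fuzz(target, trials=1000, seed=0):
    """Feed random inputs to target; return the inputs that raised an exception."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(trials):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 8)))
        try:
            target(data)
        except Exception:
            crashes.append(data)
    return crashes
```

Production fuzzers like libFuzzer or AFL add coverage feedback and input mutation on top of this basic loop, which is what makes them effective enough to earn the rewards described above.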

One developer walked away with a $201K reward—the largest single bug payout to date. With so much money pouring into the bounty program, that record will likely fall.

What’s the takeaway? Security is an increasingly lucrative focus for the developer community. Bug bounty platforms like HackerOne—where hackers made $19M in 2018 and $11.7M in 2017—are rapidly growing.

And for all the effort toolmakers have put into building security controls into development processes, crowdsourced solutions are only becoming more important. Companies increasingly rely on ethical hackers from the community to provide a final, yet vital, layer of security.

Zombie JavaScript and the web of undying libraries

Cloudflare, a web infrastructure company known for its content delivery network services, recently analyzed the typical lifespans of JavaScript resources across the web. According to its analysis, most websites never even bother to update their JavaScript libraries.

Cloudflare collects valuable data. Cloudflare helps run CDNJS, a popular tool that lets developers easily include web development resources—like JavaScript libraries—in their web apps. Cloudflare then collects anonymized data to better understand what libraries are frequently requested and from what sites.

What does the data reveal? JavaScript libraries are almost never updated once installed. Even more worrisome, the release of a new library version often does little to accelerate the decline in use of older versions, a decline known as the deprecation rate.

Cloudflare focused its analysis on jQuery, the most popular JavaScript library in the world, used by more than 70% of the 10 million most popular websites. Cloudflare found that despite the release and rapid adoption of new jQuery versions in early 2019, older versions did not show an accelerated decline in usage throughout the rest of the year.

Instead, usage of older jQuery versions consistently saw deprecation rates of just 20% per year.

Why? If the average website lasts between two and four years, then most deprecation is being caused by major website changes or overhauls rather than by incremental upgrades.

In other words, most websites don’t upgrade their JavaScript dependencies until they’re completely rebuilt.
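The arithmetic behind that conclusion can be checked with a quick back-of-the-envelope model (ours, not Cloudflare's): a constant 20% annual deprecation rate implies a half-life of roughly three years for an old version's usage share, right in the middle of the typical two-to-four-year website lifespan.

```python
import math

# The 20% annual deprecation rate comes from Cloudflare's data, as reported
# above; the constant-rate decay model is our simplifying assumption.
DEPRECATION_RATE = 0.20

def share_remaining(years: float) -> float:
    """Fraction of sites still on an old library version after `years`."""
    return (1 - DEPRECATION_RATE) ** years

# How long until half the sites have moved off an old version?
half_life = math.log(0.5) / math.log(1 - DEPRECATION_RATE)
# half_life comes out to roughly 3.1 years, squarely inside the typical
# 2-4 year website lifespan, consistent with deprecation being driven by
# full rebuilds rather than incremental upgrades.
```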

What can we do? Cloudflare’s research raises questions about how developers can keep the web updated and how much effort should be spent supporting so much legacy software baggage.

Cloudflare is considering ways to reconfigure CDNJS to encourage developers to update and maintain JavaScript libraries more frequently—either automatically or with stronger deprecation notifications. That could redefine how developers use web resources and how such tools evolve.

Go slow to go fast, securely

Puppet—a popular provider of infrastructure automation and delivery software—released its eighth annual State of DevOps Report. Nearly 3,000 developers, ops, and security professionals from around the world took part in the survey and answered questions about development speed and security.

Many companies choose to go slow. While many companies are building faster product pipelines, there is still a noticeable gap between how frequently development teams are able to deploy and how frequently they actually deploy.

Across all industries, roughly 46% of companies are able to deploy on demand, yet only 24% actually opt to deploy on demand.

When do most companies prefer to deploy? About 33% of companies choose to deploy once per day or more. Nearly 43% of companies opt to deploy at most once per week—even though just 19% of companies feel they are constrained to doing so.

Security is important, but can slow down development. According to the survey, 78% of respondents agreed that security is a shared responsibility across both delivery and security teams.

Yet many developers think their companies could do a better job of implementing security practices into their development workflows. Roughly 41% of developers felt that security is a major constraint on their ability to deliver software quickly.

The main takeaways: Companies are set up for fast development speed, but intentionally choose to deploy less frequently. And as security-driven development becomes more widespread, teams will likely encounter even more speed bumps along the way.

Small bytes

  • JetBrains plans to add new machine learning capabilities to its Java IDE. Developers can expect to see better code completion and code suggestions included right inside their IDE. [DEVCLASS]
  • Google Cloud released Secret Manager, a new tool to help developers store API keys, credentials, certificates, and authentication. Google Cloud joins other cloud providers in helping developers cope with the increasingly complex world of secrets management. [GOOGLE]
  • RapidAPI, an API marketplace for developers to discover and implement new APIs, announced support for GraphQL APIs. With API-driven development becoming increasingly important in the developer world, GraphQL is quickly growing in popularity. [RAPIDAPI]
  • Nuweba, a serverless computing provider, raised $10.2M in funding. After emerging from stealth last year, the startup is working to build a faster and more secure serverless computing platform. [SILICON ANGLE]


  • Playwright is a Node library to automate the Chromium, WebKit, and Firefox browsers with a single API. [GITHUB]
  • MirageJS is a client-side server to develop, test, and prototype your JavaScript app. [MIRAGEJS]
  • Userbase lets you create secure and private web apps using only static JavaScript, HTML, and CSS. [USERBASE]
  • Hex Engine is a 2D game engine for the browser, written in TypeScript and designed to feel similar to React. [GITHUB]
  • DeepSource offers static code analysis for Go that uncovers 150+ bug risks, anti-patterns, and security vulnerabilities. [DEEPSOURCE]
Never miss the big news

Every week, our team will send you three of the most important stories for developers, including our analysis of why they matter. Software development changes fast, but src is your secret weapon to stay up to date in the developer world.
