The State of AI

Alejandro (Alex) Jaimes

Posted: May 31, 2017 · 5 min read

This post is the first in a three-part series we’re publishing this year on artificial intelligence, written by DigitalOcean’s Head of R&D, Alejandro (Alex) Jaimes.

In recent months, the amount of media coverage on AI has increased so significantly that a day doesn’t go by without news about it. Whether it’s an acquisition, a funding round, a new application, a technical innovation, or an opinion piece on ethical and philosophical issues (“AI will replace humans, take over the world, eat software, eat the world”), the content just keeps coming.

The field is progressing at amazing speed and there’s a lot of experimentation. But with so much noise, it’s hard to distinguish hype from reality, and while everyone seems to be rushing into AI in one way or another, it’s fair to say there is a good amount of confusion about what AI really is, what sort of value it can bring, and where things will go next.

While the reality is that AI has the potential to impact just about everything and be embedded in just about anything—just like software already is—getting started can be daunting, depending on who you ask.

In this post, I will first explain why computing is now AI. Then, in future posts, I’ll describe the most significant trends, outline steps to be taken in actually implementing AI in practice, and say a few words about the future.

Computing Is Now AI

AI is already embedded, in some form, in most of the computing services we use on a daily basis: when we search the web, visit a webpage, read our email, use social media, use our phone, etc. Most of those applications use some form of machine learning to perform “basic” tasks, like spam detection, personalization, and advertising. But like computing itself, the penetration of AI doesn’t stop there. Our transportation systems, security, cargo shipping, banking, dating, and just about everything else are likely “touched” by algorithms that use machine learning.
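To make “basic” concrete, here is a toy sketch of the kind of spam filter embedded in every mail service, using scikit-learn. The messages and labels are invented for illustration; this is the shape of the pipeline, not any production system.

```python
# A toy spam classifier: the kind of "basic" machine learning task
# embedded in everyday services. Data here is invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "Win a free prize now",             # spam
    "Limited offer, click here",        # spam
    "Meeting moved to 3pm",             # ham
    "Can you review my pull request?",  # ham
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = ham

# Bag-of-words features plus Naive Bayes: a classic, simple pipeline.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["Free prize, click now"]))  # likely [1], i.e. spam
```

Real filters train on millions of messages, but the pattern (turn raw input into features, learn a model, predict) is the same.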

[Figure: AI as an umbrella term covering subfields such as machine learning and deep learning]

AI is really an umbrella term that encompasses many subfields. For the sake of simplicity, most of what people currently think of as AI involves machine learning and/or deep learning. The ideas behind the three concepts are rather straightforward: AI aims to “emulate or supersede human intelligence,” machine learning is concerned with algorithms that learn models from data, and deep learning is “simply” a subset of machine learning algorithms that learn from data with less human intervention. In building “traditional” machine learning algorithms, an engineer has to design features by hand; in a deep learning framework, the features themselves are learned by the algorithm, though such algorithms need significantly greater amounts of data.
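A minimal sketch of that difference, using scikit-learn’s small digit-image dataset. The hand-designed features below are deliberately crude, and a small neural network stands in for deep learning here:

```python
# Contrasting hand-designed features vs. learned features on 8x8 digit
# images. The feature choices are illustrative, not canonical.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)  # flattened 8x8 grayscale digits
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Traditional" ML: an engineer hand-designs features from each image.
def hand_designed_features(images):
    return np.column_stack([
        images.mean(axis=1),       # average brightness
        images.std(axis=1),        # contrast
        (images > 8).sum(axis=1),  # count of "ink" pixels
    ])

clf = LogisticRegression(max_iter=1000)
clf.fit(hand_designed_features(X_train), y_train)
print("hand-designed features:",
      clf.score(hand_designed_features(X_test), y_test))

# "Deep" learning (here, a tiny neural net): feed raw pixels and let the
# network learn its own intermediate features. In practice this approach
# needs far more data than this toy set to shine.
net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(X_train, y_train)
print("learned features:", net.score(X_test, y_test))
```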

Some industries use computing technology in more advanced ways than others. Tech companies, in particular, have taken the lead in developing products and services around data and AI (in various forms), and scaling to millions and billions of users. This has led to significant advances in some areas where having large, diverse datasets can improve performance to the point where problems that seemed out of reach now seem solvable. Other industries, such as healthcare and education, have been slower to adapt, but we’re beginning to see significant progress with very promising prospects.

If we look closely at trends and at the technical requirements for AI to deliver in products and services, it’s easy to see that AI can already be applied everywhere: more specifically, wherever repetitive patterns occur and can be recorded, whether the data is individual or aggregated. One could easily argue that everything in life, and in business, consists of cycles, and what’s changed significantly in recent years is our ability to record, store, and process behavioral patterns at every level. AI adds prediction, which is extremely valuable.
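As a toy illustration of that last point, suppose we’ve recorded a metric with a weekly cycle (the data below is invented). Once the pattern is captured as data, even a simple model can predict the next value:

```python
# Once a repetitive pattern is recorded, a model can predict it.
# The weekly-cycle data here is synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
days = np.arange(200)
# Daily traffic with a 7-day cycle plus noise.
traffic = 100 + 20 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 2, days.size)

# Encode the cycle explicitly so a linear model can learn it.
X = np.column_stack([np.sin(2 * np.pi * days / 7),
                     np.cos(2 * np.pi * days / 7)])
model = LinearRegression().fit(X, traffic)

# Predict tomorrow's value from where it falls in the cycle.
tomorrow = [[np.sin(2 * np.pi * 200 / 7), np.cos(2 * np.pi * 200 / 7)]]
print(model.predict(tomorrow))
```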

The power of AI comes at multiple granularities. A plethora of decisions are made every day based on simple, repetitive patterns, and those apply to businesses as much as to individuals. It’s no surprise, then, that most companies are using AI today to cut costs and improve efficiency. As more processes become digital, AI becomes not just a critical part of the ecosystem, but the driving force, in large part because its main benefit is efficiency. And if we look at things from this perspective, it’s easy to see why computing and AI are already converging to the point where there’s no distinction. In the very near future, it will be assumed that AI is part of computing, just as networking and other technical components are.

This is not a minor shift. It is massive, because it emphasizes processes that leverage data and evolving models (vs. “fixed” algorithms), changing how software is developed. This has several ripple effects that I’ll describe in future posts, including pushing the boundaries of hardware. I would argue that the companies and teams that understand this, and think and operate with this mindset, will have a significant advantage over those that try to “add” AI at a later stage.

For individuals and teams, this means constantly learning and growing, staying up to date, and relying on the larger community for the exchange of models, ideas, code, and knowledge. It also means that applications will increasingly be built by layering components and data: nothing will be built from scratch. For hobbyists, “professional” developers, engineering teams, the open source community, and companies, this translates into significant synergies: an ecosystem that relies on the cloud, which is the perfect platform to combine multiple resources and scale with a single click. Ultimately, this implies that AI skills will be as critical to individuals as they are to companies, and they will form the basis of economic progress for decades to come.
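As one concrete example of this layering, consider how a community-trained image model is commonly reused rather than rebuilt. This is a sketch assuming PyTorch and torchvision are available; the 5-class task is hypothetical:

```python
# "Layering components and data": start from a model the community has
# already trained on ImageNet and adapt only the final layer.
import torch.nn as nn
from torchvision import models

# Download a pretrained network instead of training from scratch.
net = models.resnet18(pretrained=True)

# Freeze the shared, community-trained layers...
for param in net.parameters():
    param.requires_grad = False

# ...and replace the final layer for our own (hypothetical) 5-class task.
net.fc = nn.Linear(net.fc.in_features, 5)
# Training now updates only net.fc, reusing everything else.
```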

We’d love to get your thoughts on AI. How has it impacted the way you build software? What do you think you need to make AI part of your workflow? What opportunities and barriers do you see? What are the topics you’d like to learn more about, or the tools you’d like to use? Let us know in the comments below!

*Alejandro (Alex) Jaimes is Head of R&D at DigitalOcean. Alex enjoys scuba diving and started coding in Assembly when he was 12. In spite of his fear of heights, he’s climbed a peak or two, gone paragliding, and ridden a bull in a rodeo. He’s been a startup CTO and advisor, and has held leadership positions at Yahoo, Telefonica, IDIAP, FujiXerox, and IBM TJ Watson, among others. He holds a Ph.D. from Columbia University.*

*Learn more by visiting his personal website or LinkedIn profile. Find him on Twitter: @tinybigdata.*
