Code Compiled: A Short History of Programming — Part III

Programming for Commerce, Banking, and FinTech

Look at how far we’ve come. Just seven decades ago, the word “computer” referred to someone in an office with a pile of charts and a slide rule. Then came ENIAC, the first general-purpose electronic computer, followed decades later by the Commodore 64 personal computer and, later still, the iPhone. Today, the Sunway TaihuLight supercomputer crunches data with 10,649,600 cores, performing 93 quadrillion floating-point operations every second. Where will we be in another seven decades?

In part one of this series, we covered how the evolution of hardware shaped the development of programming languages. We then followed up in part two with the impact of the software created by those languages. In this final blog in the series, we’ll look at how programming has taken the leap beyond computers into devices, with an emphasis on how programming is rewriting the rules of commerce, banking, and finance.

Technology in Society Over the Past Decade

Society as a whole has adopted new technology with great enthusiasm, and the pace of that adoption has accelerated thanks to a few key developments over the past ten years or so. The Pew Internet Project has been keeping a close watch on the demographics of the internet. At the turn of the millennium, it reported, only 70 percent of young adults and 14 percent of seniors were online. That’s still the general perception of internet users, but it’s no longer accurate. By 2016, nearly all young adults (96 percent) and a majority of those over 65 (58 percent) were online.

The single biggest driver of internet access growth has been mobile devices. Basic cell phone ownership among American adults went from around 70 percent in 2006 to 92 percent a decade later. Smartphone ownership, which brings the vast data-crunching resources of the app ecosystem along with it, climbed from 35 percent just five years ago to 68 percent today. Tablets have followed a similarly explosive trajectory, going from 3 percent ownership in 2010 to 45 percent today.

This growing hunger for mobile devices required exponentially more data-processing power and a vast leap in traffic across wireless networks. The growth can only be described in terms of zettabytes (a zettabyte is a trillion gigabytes). In 2009, the world had about three quarters of a zettabyte under management. One year later, data generated across networks had nearly doubled to 1.2 ZB, most of it enterprise traffic. By the end of last year, 7.9 ZB of data had been generated, with 6.3 ZB under management by enterprises. Within the next four years, expect roughly 35 ZB of data to be created by devices, with 28 ZB managed by enterprises.

Developers have had to work furiously to restructure their approach to software, databases, and network management tools just to avoid being swamped in all this data.

A Brief History of Financial Record Keeping

In the world of commerce, the data that matters most all relates to financial records. These define the health of the business today and the potential for growing the customer base in the future. That’s why financial data has become ground zero in the war between cybercriminals and data-security experts.

In the 1980s, the financial industry was still dominated by mainframes. The personal computer revolution sweeping the rest of the business world largely bypassed finance: mainframes and their terminals were the only practical way to manage data sets that were vast relative to the processor speeds of the time. To get the financial answers needed to make the right business decisions, you might have to write COBOL programs or run SQL queries against a DB2 database, a far cry from what is possible today. Mainframes were normally closed systems, with applications written specifically for them by outside consultants.

All that changed dramatically in the 1990s with the arrival of faster servers, open systems, and internet connectivity. Mid-sized business computers gained immense processing power at lower cost, and mainframes were increasingly repurposed for back-end processing of transaction data as the finance industry consolidated batch-processing workloads like billing.

Computers like the IBM AS/400, which had previously run only IBM’s proprietary software, gained the ability to run financial packages like SAP, PeopleSoft, and JD Edwards. By the late 1990s, the appearance of Linux and virtual machines running inside mainframes opened up the entire finance sector to a flurry of new open-source development projects.

Simultaneously, network connectivity to the internet, and then the web, exposed financial data providers to a new threat from outside: hackers. Previously, weak password management and inside jobs had been the biggest threats to financial data security. Connectivity opened a window to a new generation of cybercriminals.

Programming Challenges for Data Security

In the weeks after Black Friday in 2013, with the holiday shopping rush in full swing, a data-security specialist announced that a major retail chain had become a target. His report warned that the breach was serious, “potentially involving millions of customer credit and debit card records.” There had been attacks on companies before, but this one netted financial data on 40 million shoppers during the busiest retail period of the year.

The bigger problem is that attacks like these keep increasing in intensity and sophistication. In 2016 alone there was a 458 percent jump in attacks probing IoT connections for vulnerabilities. Meanwhile, another front has opened up on employees’ mobile devices. Last year there were over 8 billion malware attacks, twice the number of the year before, most of them targeting weaknesses in the Android ecosystem. And in terms of the data hackers want most, healthcare businesses registered slightly more attacks than even the financial industry.

Data-security experts have to stay ahead of risks from both the outside and the inside, whether they are malicious or accidental. Both can be equally devastating, regardless of intent.

Ian McGuinness recommends six steps to help security experts cover as many vulnerabilities as possible early on, before moving on to custom development; a short code sketch illustrating a couple of these checks follows the list:

  1. Protect the physical servers by tightly limiting access to a short list of only those employees who need them for their work. Make sure the list is updated regularly.
  2. Create a network vulnerability profile. Assess any weak points, update antivirus software, test firewalls, and change TCP/IP ports from default settings.
  3. Protect every file and folder that captures database information like log files. Maintain access levels and permissions dynamically.
  4. Review which server software and upgrades are actually necessary. Every feature or service that is not in regular use provides an attack vector for cybercriminals.
  5. Find and apply the latest patches, service packs, and security updates.
  6. Identify and encrypt your most sensitive data, even if it resides on the back end with no interface to end users.
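As an illustration of steps 2 and 6, here is a minimal Python sketch, assuming a hypothetical internal host name and a hypothetical export file: it probes whether a database is still listening on a well-known default port and encrypts a sensitive file at rest using the cryptography library’s Fernet recipe. It is only a starting point under those assumptions, not a substitute for a full vulnerability assessment.

```python
# Minimal sketch: probe default database ports and encrypt a sensitive export.
# The host name and file path below are hypothetical placeholders.
import socket
from pathlib import Path

from cryptography.fernet import Fernet  # pip install cryptography

HOST = "db.example.internal"  # hypothetical database server
DEFAULT_PORTS = {1433: "MSSQL", 3306: "MySQL", 5432: "PostgreSQL"}

def probe_default_ports(host: str) -> None:
    """Warn if a database is still reachable on a well-known default port."""
    for port, name in DEFAULT_PORTS.items():
        try:
            with socket.create_connection((host, port), timeout=2):
                print(f"WARNING: {name} reachable on default port {port}")
        except OSError:
            print(f"OK: default {name} port {port} is not reachable")

def encrypt_file(path: Path, key: bytes) -> Path:
    """Encrypt a sensitive file at rest and write it alongside the original."""
    token = Fernet(key).encrypt(path.read_bytes())
    out = path.with_name(path.name + ".enc")
    out.write_bytes(token)
    return out

if __name__ == "__main__":
    probe_default_ports(HOST)
    key = Fernet.generate_key()  # in practice, load the key from a secrets vault
    encrypt_file(Path("customer_export.csv"), key)
```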

These are really just the basics, though. Monitoring network traffic, recognizing malicious code, and responding in time to make a difference represent the biggest challenges for the future.

What’s Next for FinTech

Programming for financial technology (FinTech) is among the most exciting, fastest changing areas of IT right now. Startups involved in online or peer-to-peer payments, wealth management, equity crowdfunding, and related innovations were able to bring in $19 billion in investment in the past year alone. The White House signaled its support for the contribution of financial software to the greater economy in saying, “Technology has always been an integral part of financial services — from ATMs to securities trading platforms. But, increasingly, technology isn’t just changing the financial services industry, it’s changing the way consumers and business owners relate to their finances, and the way institutions function in our financial system.”

On the road ahead, among the top challenges for FinTech developers will be:

  • The creation of original processes for secure access to financial data through mobile platforms
  • The integration of blockchain capabilities into enterprise financial systems (a minimal sketch of the underlying idea follows this list)
  • A secure, real-time global transaction system that can automatically adjust to currency fluctuations
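To make that blockchain item a little more concrete, here is a minimal, hypothetical Python sketch of the core data structure: each record carries the hash of its predecessor, so tampering with any earlier entry invalidates everything that follows. Real enterprise platforms layer consensus, networking, and smart contracts on top of this; the account fields here are invented for illustration.

```python
# Minimal sketch of a hash-chained ledger; not a production blockchain.
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class Block:
    index: int
    payload: dict   # e.g. a payment record; field names are illustrative
    prev_hash: str

    def hash(self) -> str:
        raw = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(raw).hexdigest()

def append_block(chain: list, payload: dict) -> Block:
    prev_hash = chain[-1].hash() if chain else "0" * 64
    block = Block(index=len(chain), payload=payload, prev_hash=prev_hash)
    chain.append(block)
    return block

def is_valid(chain: list) -> bool:
    """Valid only if every block still points at its predecessor's current hash."""
    return all(chain[i].prev_hash == chain[i - 1].hash() for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"from": "acct-001", "to": "acct-002", "amount": "125.00"})
append_block(ledger, {"from": "acct-002", "to": "acct-003", "amount": "40.00"})
print(is_valid(ledger))                  # True
ledger[0].payload["amount"] = "999.00"   # tamper with history
print(is_valid(ledger))                  # False: the chain no longer verifies
```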

The FinTech panel at one recent AppDynamics event concluded that:

“All banks have the same problems, but the capabilities to solve these problems have changed. Banks are taking different approaches, but the endgame is the same, making sure the customers can access their money when they want and how they want.”

Imagining the Future

The technologies developed to handle these difficulties have much wider applications for programmers in 2020 and beyond. The bigger picture is that the coming age of ubiquitous connectivity will require a much greater enterprise commitment to infrastructure maintenance and software performance. As the IoT brings machine-to-machine (M2M) networking to our homes and cars, it will also raise the bar dramatically for the user experience. Fortunately, those embracing the latest DevOps best practices are uniquely qualified to approach these problems with one eye on what customers expect and the other on what will keep the business flourishing.

Code Compiled: A Short History of Programming – Part II

This is the story of software. The initial blog in this series was all about the structural formation of programming languages. We went all the way back to steampunk days to see how the framework for programming grew out of Charles Babbage’s Analytical Engine in the 1840s. We ended up with a list of the most active programming languages in use at the moment. Now we’ll take the next logical step to examine what programming has done for enterprises and SMBs. We’ll also trace the effects of shockwaves in the world of databases, communications, and mobility.

Technological Change Blindness

There’s a strange phenomenon known as change blindness that describes how normal people don’t notice massive, obvious changes in their environment. It can emerge from gradual shifts or very rapid transformations that are interrupted by a distraction. For example, a study by Cornell found that test subjects didn’t notice when a researcher, posing as a lost tourist, was replaced by someone else who looked completely different midway through the questioning.

Change blindness is happening right now on a societal level when you reflect on what programmable software has accomplished. Consider how radically our world has been transformed over the past two decades, partially due to hardware upgrades, but mostly due to programming.

In the last decade alone, we’ve seen society rebuilt by the popularity of a wave of new software-driven technologies.

For anyone too young to have seen it or too busy to remember, here’s a recap of how business records and communications operated in the pre-software era.

Life Before Software

How many times per day do you use your computer? That question doesn’t really make sense for most workers today, because they never stop using their computers. This goes beyond developers to every single person in the organization. Every time you check the time, write a note, or make a call, you probably do it on the web or on a mobile device. Here are just a few of the jobs that didn’t exist in the recent past:

10 years ago

Global total app developers = roughly 0. There were the basics of social media, but no social media managers. There were no departments devoted to cloud engineering. Big data analysis was primarily academic. Development and operations didn’t become DevOps until 2009. Even the title “web developer” didn’t get a Bureau of Labor Statistics (BLS) designation until 2010.

20 years ago

There was no such thing as an online marketer. PPC didn’t exist before 1996, and the first keyword auction kicked off in 1998. In 1995, there were only 16 million internet users on the entire planet. Wireless engineers were battery specialists, because the 802.11 WiFi protocol came out in 1997 and widespread adoption would take another decade.

40 years ago

The late 1970s introduced personal computers to the business world, and the modern digital world as we know it can be traced back to that moment. Before that, computers were room-sized monsters like the IBM S/360. In 1976, there were no Apple computers, no Tandy TRS-80s, no Commodore 64s, and no Texas Instruments 99/4s, and IBM PCs were still years away. If you were a programmer, you might be working in UNIX, Pascal, COBOL, C, or Prolog and carrying around a suitcase full of punch cards. You might have a job switching reels of the giant magnetic tapes that computers used for storage. There was no such thing as a quick reboot, and crashes were common. You might spend the day pulling up floor tiles and looking for twisted cables. Perhaps the most astonishing fact about this picture is that some of the people you work with right now probably remember those days.

When Windows Were Only Glass

Before computers, offices tended to be loud and smoke-filled. Typewriters rattled everywhere and you could tell who was at work by the cigarette smoke curling above the desk.

Customer data, billing, legal documents, and other important records were made of paper and stored in boxes. The boxes were usually kept in a giant file room that had to be kept updated daily. Security was often non-existent and a disaster like a fire could wipe out a business in minutes. Contacts were often kept on paper rolodex files and everyone had their own.

With the arrival of personal computers, software fundamentally changed all business processes, making them repeatable, transferable, and vastly more productive.

The Database That Changed the World

You can spend endless hours arguing about which software has had the biggest impact on history, but every story has to start with the relational database management system (RDBMS), which took shape in the mid-1970s. From the time electronic computers took off in the 1940s until the early 1970s, there was no systematic, general-purpose way to store and access data. To find and retrieve information, you had to know where it was stored and how the program that stored it worked.

When IBM’s Edgar “Ted” Codd published his relational model in 1970 (followed later by his famous twelve rules for relational databases), it became the universal blueprint for storing and structuring data. It led directly to Structured Query Language (SQL), Oracle, and the database wars of the 1980s, and relational systems from DB2 to MySQL and its descendants, such as MariaDB and Percona Server, still underpin the global web. Today, software that has to manage the sheer volume and velocity of big data often turns to non-relational databases, but even these have their origins in Codd’s model.
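To see what Codd’s declarative approach means in practice, here is a small, hypothetical Python sketch using the standard library’s sqlite3 module: you describe which rows you want, and the relational engine decides how to find them, with no knowledge of where or how the bytes are physically stored. The table and column names are invented for the example.

```python
# Minimal sketch of declarative, relational data access (table and columns are hypothetical).
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO accounts (owner, balance) VALUES (?, ?)",
    [("Ada", 1200.00), ("Grace", 87.50), ("Edgar", 4500.25)],
)

# Ask *what* you want; the engine works out *how* to retrieve it.
for owner, balance in conn.execute(
    "SELECT owner, balance FROM accounts WHERE balance > ? ORDER BY balance DESC", (100,)
):
    print(owner, balance)

conn.close()
```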

The Grid and Cloud-Based Software

The history and impact of the internet are too large a subject to cover here, but cloud-based software is its latest expression. Software as a Service (SaaS) grew out of “The Grid,” a concept developed by Ian Foster and Carl Kesselman in the mid-1990s, just as the World Wide Web was taking off.

They imagined that software should be a metered utility, like electricity, where people just plugged into a grid of resources. Doing that depended on the development of effective cluster management and data residency. Clustered and networked computers used the rapidly developing internet protocols to fetch, process, and deliver data.

That meant you might have plenty of CPU capacity, but the actual machine doing the work could be thousands of miles away. The speed of the communications channels hadn’t caught up with the computing power at either end, which delayed fetch and execute operations. I/O bottlenecks were common, and cloud-based software started to gain a reputation for unreliability.

In terms of cloud security, the earliest threats are still the most serious: data breaches by malicious actors, data leakage from developer errors, identity compromise from insecure credentials, and insecure APIs exposed to untrusted sources. Yet today whole industries rely entirely on cloud-based deployments despite the ongoing security challenges. SaaS was soon joined by Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), and the mobile workforce revolution would not have been possible without them.

Mobile Software for Working Remotely

Over the past 20 years, telecommuting has gone from a dream to a necessity. A Gallup poll showed that over a third (37 percent) of U.S. workers telecommute some of the time, compared with single digits before 1996. Of those who do telecommute, one in four work remotely more than ten days every month. In terms of effectiveness, 74 percent of those surveyed said that telecommuters are just as productive or much more productive than their co-workers.

The mobile workforce revolution is tied closely to the development of BYOD (“bring your own device”) and “workshifting,” the practice of moving work to non-traditional times and locations. The three software trends that made this possible were the business app ecosystem, tighter security management tools for remote logins, and data center control panels that could handle all that network traffic. Put them together and the traditional office starts to look like an unnecessary capital expense whose main function is to serve as a backdrop for press conferences. IDC now projects that 72 percent of the US workforce will be remote workers by 2020.

Industries Without Supply Chains

Arguably, the area that has seen the most dramatic changes from recent software advances is the finance industry. Finance has no logistics or production supply chain to worry about: information about money is what the industry sells, and companies differentiate themselves on how well they manage that information. That’s why expanding internet access and more robust data analysis have meant so much to the industry. Unlike other information-driven industries, finance concerns every single individual alive today, and each entity, whether a person or a corporation, can have an unlimited number of accounts.

No industry has been rocked by more software-driven disruption, much of it coming from SMBs rather than large enterprises. Finance has seen the introduction of new business models like crowdfunding, new forms of online currency like Bitcoin, new approaches to data integrity like blockchain, and new concepts in transactions like peer-to-peer lending.

We’ll go much deeper into these issues for the third and final blog in this series. We’ll look back at how programming changed banks and insurance companies with databases in the 1960s, then follow that through to the latest big data analytics driving capital markets today. You’ll see how programming and software advances have affected all business concerns, from precision marketing to risk management.

Learn More

In case you missed it, read about ‘Code Compiled: A Short History of Programming – Part I.’ Stay tuned for ‘Code Compiled: A Short History of Programming – Part III.’

Code Compiled: A Short History of Programming – Part I

There are more than 2,500 documented programming languages, with customizations, dialects, branches, and forks expanding that number by an order of magnitude. By comparison, Ethnologue: Languages of the World recognizes 7,097 living languages that humans use to communicate with each other around the world.

It can be hard to grasp what’s happening in the world of programming today without a solid grounding in how we got here. There are endless fascinating rabbit holes to disappear down when you look back over the past 173 years of programming. This overview can only give you a high-level tour, with strong encouragement to follow any thread that engages you.

The Prehistory of Programming

Ada Lovelace, daughter of the poet Lord Byron, is generally recognized as the world’s first programmer, though she never wrote a single line of code as we understand it today. What she did, in 1843, was carefully describe a step-by-step process for using Charles Babbage’s theoretical Analytical Engine to generate the Bernoulli numbers. Her insight was that a device designed to calculate numbers could be instructed, step by step, to produce entirely new results.

Take a moment to consider how monumental that was. Bernoulli numbers are essential to analysis, number theory, and differential topology, fields of knowledge that most people in the Victorian era couldn’t even comprehend. Babbage was never able to build his Analytical Engine, so Lovelace had to work the entire procedure out on paper. Nevertheless, her schematic for machine instructions became the default framework for programming when technology caught up with her a hundred years later.
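For a sense of the kind of calculation Lovelace specified in her famous Note G, here is a short Python sketch that generates Bernoulli numbers from the standard recurrence. Lovelace expressed the procedure as operations for the Analytical Engine rather than in anything like a modern language, so this is only meant to show that her program was a genuine iterative algorithm, not a reconstruction of her actual table of operations.

```python
# Bernoulli numbers via the classic recurrence:
#   B_0 = 1,  B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k   for m >= 1
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list:
    """Return the exact Bernoulli numbers B_0 .. B_n as fractions."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-1, m + 1) * acc)
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```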

ENIAC: The Digital Analytical Engine

After the Great War, as World War I was known before there was a second one, the U.S. military realized that its bullets and bombs had not been accurate enough. Inefficient ballistics had been a colossal waste of resources, and another war was looming. Generals agreed they needed a faster way to crunch vast tables of numbers and get the results to artillery gunners in the field.

As World War II began, six women known as “computers” sat in a room with artillery charts and numeric calculating machines, working out ideal trajectories for the Army. They were the world’s first programming team. The need for faster computing spurred the U.S. Army to fund the creation of the Electronic Numerical Integrator and Computer (ENIAC), a conceptual descendant of Babbage’s design. Instead of using mechanical cogs like Babbage’s device, ENIAC was electronic and digital, with accumulators that each held a ten-digit number, making it one of the earliest examples of digital transformation. Ironically, ENIAC wasn’t fully operational until the fall of 1945, just in time to see the end of World War II.

The biggest problem with ENIAC was that the team of human computers had to reset the machine’s switches and cabling for each new program. That failing was addressed by John von Neumann’s proposal for the Electronic Discrete Variable Automatic Computer (EDVAC), a stored-program design. Starting with the construction of EDVAC in 1949, programming languages began to proliferate.

From The Garden of Languages to The Apple

For the next three decades, electronic computers were monstrous machines. UNIVAC, the first commercially produced computer in the United States, was the size of a room and ran on banks of vacuum tubes. Programmers wrote commands in machine code and assembly language, which were fed to the machine on punched cards, much as in Babbage’s original design, or on paper tape. Higher-level languages soon followed: IBM’s FORTRAN, which shipped with its own compiler, arrived in 1957, and COBOL, which grew out of Grace Hopper’s FLOW-MATIC, followed in 1959 and is still used today in traditional industries like banking and insurance. The programming training publisher O’Reilly has created a language timeline showing how fifty of the most popular languages have grown from there.

The next big shockwave, still being felt today, was the introduction of personal computers in the late 1970s. The first wave of personal computers had a hobbyist, DIY aesthetic, exemplified by the Tandy TRS-80 and, a little later, the Commodore 64 (some of which are remarkably still in operation today). These machines ran simple programs written in BASIC. During this period the language wars really began to heat up, as a rising cohort of amateur programmers developed their own languages and logic systems. Some of the top languages developed or popularized during this time include Pascal, Scheme, Perl, and Ada (named for Lovelace).

Perhaps the most influential development at this time was a variation on C called C With Classes, by Bjarne Stroustrup. This would grow into C++ and anchor a growing catalog of object-oriented (OO) languages. The 1980s brought the rapid growth of two hardware groups that dominated the personal computer industry and virtually locked down the operating system (OS) market for many years: IBM and Apple.

Programming for Mac vs. PC

Apple made computing visual with the introduction of the Macintosh in 1984, and the IBM PC world soon followed suit with Microsoft Windows. The Mac popularized the mouse, the on-screen desktop, and program icons. The average user no longer associated computing with typing text at a command line on a black screen with a blinking cursor. This changed programming in two fundamental ways.

First, it led to visual development tools such as Visual Basic and Visual C++, in which developers could manipulate coding and interface elements graphically. Second, it forced developers to think seriously about the graphical user interface (GUI). In many ways, this was the beginning of the split, later central to DevOps, between concern for the user experience and concern for operational efficiency.

Although programming languages themselves were normally OS agnostic, the Mac vs. PC camps tended to support different types of software development. In the 1990s, the PC favored software for business, developed from languages like C++, Visual Basic, and R. Apple was better known as a home for graphics and communications software using new languages like Ruby, Python, and AppleScript. In the mid-1990s, the explosive popularity of the World Wide Web and gaming systems changed everything.

Gaming and The Web

The web moved HTML, Java, JavaScript, and PHP to the top of every developer’s list. ColdFusion was built expressly for the web, while GameMaker’s scripting language and UnrealScript were built expressly for games. More recently, game developers often rely on rich general-purpose ecosystems: JavaScript, C++, C# (which began life under the codename “Cool”), Ruby, and Python have been the workhorses for both web applications and game development. High-end graphics often call for supplemental support from specialized graphics APIs like OpenGL or DirectX.

Languages in Demand Now

Here’s an outline of the languages most in demand in 2016, according to the TIOBE index and Redmonk:

TIOBE (September 2016)

The TIOBE rankings are based on a combination of the number of skilled engineers working in each language, the courses teaching it, and the third-party vendors supporting it, measured largely through results from major search engines and sites such as Amazon, Baidu, Wikipedia, and YouTube. The goal is to provide a monthly indicator of each language’s popularity, not a count of where the most lines of code are being written.

  1. Java
  2. C
  3. C++
  4. C#
  5. Python
  6. JavaScript
  7. PHP
  8. Assembly language
  9. Visual Basic .NET
  10. Perl

Redmonk’s Top 10 (Mid-year 2016)

Redmonk’s methodology compares the popularity of specific languages against one another on GitHub and Stack Overflow. The volume of discussion on Stack Overflow and the number of repositories on GitHub give an indication of where development is trending.

  1. JavaScript
  2. Java
  3. PHP
  4. Python
  5. C#
  6. C++ (tie)
  7. Ruby (tie)
  8. CSS
  9. C
  10. Objective-C

It’s easy to see why many developers say that Java and C run the world. A foundation in these two languages and their related branches will prepare you for the widest range of coding work. Of all the languages on these lists, the one that stands out immediately is Assembly. Its presence is a sign that the IoT has arrived and that the need is intensifying for engineers who can code for small, resource-constrained devices.

The Next Wave

Looking to the future, programming for enterprise business and for individual apps both offer substantial financial possibilities. The Bureau of Labor Statistics (BLS) estimates the median pay for programmers at approximately $79,530 annually, but it projects an 8% decline in those jobs through 2024 due to growing competition from lower-priced coders around the world. However, the BLS also shows that software developers earn a median income of $100,690 annually, with a projected 17% growth rate, much faster than most industries.

The difference is that low-level programming will be increasingly outsourced and automated in the years ahead. On the other hand, there is already a shortage of people who know how to do the higher-level thinking of engineers and DevOps professionals.

In fact, there are many developers who are now at work trying to converge programming languages with natural spoken languages. That’s the goal of the Attempto Controlled English experiment at the University of Zurich. The hope is to open up the power of programming to as many people as possible before the IoT surrounds us with machines we don’t know how to control. We may all be programmers in the future, but DevOps skills will be critical to keep the business world running.

Learn More

Stay tuned for ‘Code Compiled: A Short History of Programming – Part II.’