
Flashback Friday: The Invisible Giant

This week at SlashDB we want to honor our computer science forebear, Dennis Ritchie, in our Flashback Friday blog series.

Ritchie is the creator of the programming language C, which basically means he is the creator of digital life as we know it. For those of you who don’t know, C and its successors are the languages that serve as the foundation for most browsers, including Firefox.[1] Your smartphone, too, is brought to you courtesy of Ritchie’s technology. With such tremendous contributions to computer science, Ritchie has clearly earned a spot in our Flashback Friday series.

So let’s flashback and remember the man who made Googling on your phone possible.

Dennis Ritchie, creator of C and Unix, seated.

Ritchie graduated from Harvard University with degrees in physics and applied mathematics, later adding a Ph.D. in mathematics.[2] He quickly found his niche not in mathematics or physics but in computer science, to which he was introduced while working at MIT’s computer center as a graduate student.[3] After finishing school, Ritchie joined Bell Labs, where he collaborated with Ken Thompson on C and Unix.[4]

Kernighan and Ritchie’s famous C Programming Language manual.

The C programming language (so named because it succeeded B, a language Ken Thompson created at Bell Labs[5]) is still widely used today, and its descendants (C++ and Java) expand on the ideas and grammar that Ritchie created.[6] The Unix operating system that Ritchie co-designed has had an equally tremendous impact on computer science: it underpins Mac OS X (Apple’s desktop system) and iOS, which runs the iPhone and iPad.[7] Furthermore, Unix’s open-source relative, Linux, powers major data centers, like those of Google and Amazon.[8] The Unix kernel (written in C, incidentally) is basically what the entire Internet runs on,[9] and almost all web servers and browsers are written in C or its descendants (C++ and Java), or in languages like Python and Ruby whose standard implementations are themselves written in C.[10]

It’s not an exaggeration to say that Ritchie’s major creations (C and Unix) and their descendants run almost everything we use in our digital lives today. Ritchie’s contributions to computer science are truly staggering.

Knowing all this, you might wonder why Ritchie’s name is not better known. Many compare his contributions to those of Steve Jobs, a media giant who, incidentally, died seven days before Ritchie. Jobs’ death was heavily covered in the media, while Ritchie’s received only minimal attention. There are two likely reasons for Ritchie’s lack of fame. The first is that he simply didn’t want it: Ritchie was reportedly an unusually private person. The second is that his contributions, vast and profound as they are, are essentially invisible. Ritchie created code, the fundamental yet unseen component of all our digital gadgets. The defining difference between Jobs and Ritchie is that you hold Jobs’ creations in your hands, while Ritchie’s are invisible to the eye, a ghostly power whispering commands inside your browser and your Apple products. Such intangibility makes it easy to remain ignorant of Ritchie’s contributions.

At SlashDB, we deeply admire Ritchie’s contributions as well as his modesty. Ritchie cared more about his work than he cared about fame. We wish to commemorate his life and achievements, especially his commitment to advancing computer technology and making computers instruments of knowledge, entertainment, and business. Ritchie made technology easier to use and adapt – two things that we constantly strive for at SlashDB.

You’ll be glad to know that Ritchie’s achievements have not gone completely unnoticed; after all, his work really did lay the foundation for the modern digital world. Alongside his colleague Ken Thompson, he received the Turing Award in 1983 (the highest honor in computer science), the National Medal of Technology in 1999, and, several months before his death, the Japan Prize in 2011.[11] Let’s hope this article helps him become more widely known and appreciated, even if only posthumously – which, given how private he was, is presumably how he would have preferred it.


  1. Steve Lohr, “Dennis Ritchie, Trailblazer in Digital Era, Dies at 70,” NY Times, accessed October 15, 2015. http://www.nytimes.com/2011/10/14/technology/dennis-ritchie-programming-trailblazer-dies-at-70.html?_r=0.
  2. “Dennis Ritchie, 1997 Fellow,” Computerhistory.org, accessed October 15, 2015. www.computerhistory.org/fellowawards/hall/bios/Dennis,Ritchie/.
  3. Steve Lohr, “Dennis Ritchie, Trailblazer in Digital Era, Dies at 70,” NY Times, accessed October 15, 2015. http://www.nytimes.com/2011/10/14/technology/dennis-ritchie-programming-trailblazer-dies-at-70.html?_r=0.
  4. Ibid.
  5. Cade Metz, “Dennis Ritchie: The Shoulders Steve Jobs Stood On,” Wired.com, accessed October 15, 2015. http://www.wired.com/2011/10/thedennisritchieeffect/.
  6. Steve Lohr, “Dennis Ritchie, Trailblazer in Digital Era, Dies at 70,” NY Times, accessed October 15, 2015. http://www.nytimes.com/2011/10/14/technology/dennis-ritchie-programming-trailblazer-dies-at-70.html?_r=0.
  7. Cade Metz, “Dennis Ritchie: The Shoulders Steve Jobs Stood On,” Wired.com, accessed October 15, 2015. http://www.wired.com/2011/10/thedennisritchieeffect/.
  8. Steve Lohr, “Dennis Ritchie, Trailblazer in Digital Era, Dies at 70,” NY Times, accessed October 15, 2015. http://www.nytimes.com/2011/10/14/technology/dennis-ritchie-programming-trailblazer-dies-at-70.html?_r=0.
  9. Cade Metz, “Dennis Ritchie: The Shoulders Steve Jobs Stood On,” Wired.com, accessed October 15, 2015. http://www.wired.com/2011/10/thedennisritchieeffect/.
  10. Ibid.
  11. “Dennis Ritchie, 1997 Fellow,” Computerhistory.org, accessed October 15, 2015. www.computerhistory.org/fellowawards/hall/bios/Dennis,Ritchie/.
Flashback Friday: Computer Scientist Who Invented Debugging

This week at SlashDB we honor Grace Hopper as our Flashback Friday forebear. Hopper is one of the most accomplished and well-known computer scientists in history, having famously popularized the terms “bug” and “debugging”[1] that we so often use today.

While many remember Hopper only for her association with these terms, her accomplishments in the field of computer science are equally memorable. Hopper was one of the first programmers in history – a singular distinction that makes her worthy of our attention.

So let’s flashback and remember the many amazing achievements of Hopper – sometimes known by the nickname “Amazing Grace”[2] for her remarkable contributions to computer science.

Hopper began her career as a mathematics professor at Vassar, having earned her Ph.D. in mathematics at Yale.[3] In 1943, during World War II, she joined the United States Naval Reserve and was assigned to the Bureau of Ordnance Computation Project at Harvard University. There she worked on one of the first computers, the Mark I, which computed mathematical tables used in the Manhattan Project.[4]

Hopper’s logbook with the moth (“bug”) displayed.

After WWII Hopper remained at Harvard as a research fellow and worked extensively on the Mark II and Mark III computers. It was while working on the Mark II that Hopper popularized the term “bug.” She reportedly loved recounting the story of the night the computer stopped working: after much troubleshooting, the cause of the problem turned out to be a moth caught in one of the relays[5] – an actual bug in the system – and presto, our favorite computer term was born. Had Hopper not felt a patriotic duty to serve her country, we might not be bandying about the term “bug” in reference to computer glitches – which, we can all agree, would diminish our lives.

Hopper’s most lasting contribution to computer science (beyond the anecdotal hilarity of the origin of “bug”) came later in her career at Remington Rand, where in 1952 she developed the first compiler.[6] Two years later her team delivered the first compiler-based programming languages, FLOW-MATIC and MATH-MATIC. Hopper’s FLOW-MATIC was later extended and redeveloped into COBOL (COmmon Business Oriented Language).[7] While many have never heard of COBOL, it is the forebear of English-like programming languages such as SQL.

Hopper strongly believed that programming languages should be as easy to read as English, and her efforts to make this a reality were truly outstanding. Her leadership in developing languages like COBOL is the reason programmers now write if/then statements in place of the 0s and 1s of binary code.[8] This influence paved the way for the highly readable languages, like Python and Ruby, that we use today.
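As a toy illustration (our own, invented for this post, not Hopper’s), here is the kind of English-like if/then logic that Hopper-style high-level languages make possible, written in Python:

```python
# A hypothetical example: business rules written as readable if/then
# statements rather than raw binary machine instructions. A compiler
# or interpreter still translates this into the 0s and 1s underneath.

def shipping_tier(order_total):
    """Pick a shipping tier from an order total (invented rules)."""
    if order_total >= 100:
        return "free shipping"
    elif order_total >= 50:
        return "discounted shipping"
    return "standard shipping"

print(shipping_tier(120))  # -> free shipping
```

Anyone can read that and follow the logic, which was precisely Hopper’s point.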

Hopper was all about simplicity and efficiency, qualities that we work to achieve at SlashDB. Like Hopper we want not only solutions and results, but the simplest solutions and the most dynamic results. That’s why we spend so much time listening to our customers’ views and ideas – tracking down and “debugging” any imperfections to meet the needs of our users. At SlashDB we deeply admire Hopper’s pioneering work and strive to emulate her visionary leadership.

The USS Hopper at sea.

The USS Hopper at sea.

We can’t claim to be the only ones to admire and applaud Hopper for her contributions. Hopper retired from the Navy for the final time in 1986, at the age of 79 (and, determined to make the most of her time, she went on to work at Digital Equipment Corporation until her death in 1992[9]). At her retirement ceremony she was awarded the Defense Distinguished Service Medal, the highest non-combat award given by the Department of Defense.[10] In addition, Hopper has a U.S. Navy vessel named after her, the USS Hopper,[11] a distinction held by very few women. She is also a recipient of the National Medal of Technology and Innovation (the second woman to be given this award) and the first-ever recipient of the Computer History Museum Fellow Award.[12]

Hopper’s contributions and memory remain very much alive today, despite her death more than 20 years ago. We continually invoke her lively spirit whenever we claim that there’s a “bug” in the system. Let’s hope that in another 20 years this small part of Hopper is still alive.


  1. “The Queen of Code,” NPR, accessed October 7, 2015. http://www.npr.org/sections/alltechconsidered/2015/03/07/390247203/grace-hopper-the-queen-of-code-would-have-hated-that-title.
  2. KeriLynn Engel, “Admiral ‘Amazing Grace’ Hopper, Pioneering Computer Programmer,” Amazing Women in History, accessed October 7, 2015. http://www.amazingwomeninhistory.com/amazing-grace-hopper-computer-programmer/.
  3. “Grace Hopper Biography,” Biography.com, accessed October 4, 2015. http://www.biography.com/people/grace-hopper-21406809#later-years-and-legacy.
  4. Cohen, Bernard (2000). Howard Aiken: Portrait of a Computer Pioneer. Massachusetts: The MIT Press.
  5. KeriLynn Engel, “Admiral ‘Amazing Grace’ Hopper, Pioneering Computer Programmer,” Amazing Women in History, accessed October 7, 2015. http://www.amazingwomeninhistory.com/amazing-grace-hopper-computer-programmer/.
  6. Ogilvie, Marilyn and Joy Harvey (2000). The Biographical Dictionary of Women in Science: Pioneering Lives from Ancient Times to the Mid-20th Century. New York: Routledge.
  7. Ibid.
  8. KeriLynn Engel, “Admiral ‘Amazing Grace’ Hopper, Pioneering Computer Programmer,” Amazing Women in History, accessed October 7, 2015. http://www.amazingwomeninhistory.com/amazing-grace-hopper-computer-programmer/.
  9. Ibid.
  10. Ibid.
  11. “Grace Hopper Biography,” Biography.com, accessed October 4, 2015. http://www.biography.com/people/grace-hopper-21406809#later-years-and-legacy.
  12. “Grace Hopper – Computer History Museum Fellow Award Recipient,” Computerhistory.org, accessed October 4, 2015.

Flashback Friday: Charles Bachman

We currently live in a world filled with technological possibilities. Computers and software like SlashDB help us in our daily lives by providing, tracking, and storing information – streamlining our lives and allowing us to work smarter, not harder. Flashback Friday is about acknowledging our computer science forebears, remembering their innovation and leadership, and honoring them for their accomplishments.

So let’s flashback and remember the contributions of Charles Bachman – inventor of the first database management system, an achievement that makes him uniquely qualified to be our first Flashback Friday forebear.

Charles Bachman, while not as well-known as Bill Gates or Steve Jobs (whose death has done nothing to diminish his media presence), is a true leader in the field of computer science. Bachman was developing software before “developer” was even a job title; in fact, he is credited with creating the very first database management system, in 1963, while working at General Electric.[1]

Bachman created the Integrated Data Store (IDS), a database management system that is still influential today. One of the most striking facts about Bachman is that his ideas about databases, now more than 50 years old, are conceptually similar to today’s APIs and linked data. In fact, SlashDB implements his concept, albeit with modern technology (a SQL database as the backend and HTTP as the transport protocol), as sketched below.
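As a rough sketch of that idea (not SlashDB’s actual API – the host, URL pattern, and response shape below are hypothetical), a record in a SQL table can be addressed and fetched like any other web resource:

```python
# A hedged sketch of a record-as-web-resource lookup: a database row
# exposed over HTTP as JSON. The endpoint below is hypothetical.
import json
import urllib.request

def get_record(base_url, table, key):
    """Fetch a single record as JSON from a REST-style database API."""
    url = f"{base_url}/{table}/{key}.json"
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

# Usage (against an imaginary endpoint):
# customer = get_record("https://example.com/db", "customers", 42)
# print(customer["name"])
```

The point is the concept, not the syntax: every piece of data gets an address, just as IDS gave programs a uniform way to reach records.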

Bachman’s visionary database management system allowed records to be located and modified without rewriting the programs that accessed them. IDS accomplished this feat with a separate data dictionary, which let users track data and study relationships between data in different records.[2] For example, data on clients and data on manufacturing orders could easily be compared and tracked. This was an innovative step toward integrating varied types of data, one that turned the computer into a tool for managing information.
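To make the clients-and-orders example concrete, here is a toy sketch (our invention, vastly simpler than IDS itself, with made-up field names) of records in two separate collections linked so that related data can be traversed without rewriting the program that reads them:

```python
# A toy model of linked record types, loosely in the spirit of IDS:
# each order carries a reference (client_id) to the client that owns it.

clients = {
    1: {"name": "Acme Corp"},
    2: {"name": "Globex"},
}

orders = [
    {"order_id": 100, "client_id": 1, "item": "widgets"},
    {"order_id": 101, "client_id": 2, "item": "gears"},
    {"order_id": 102, "client_id": 1, "item": "sprockets"},
]

def orders_for(client_id):
    """Follow the client_id links to collect one client's orders."""
    return [o for o in orders if o["client_id"] == client_id]

for order in orders_for(1):
    print(clients[1]["name"], "ordered", order["item"])
```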

Interestingly enough, SlashDB is not alone in making use of Bachman’s ideas today. Database designers still rely on graphical tools and data structure diagrams to illustrate the complex data structures they use.[3] These diagrams are called Bachman diagrams, as he was the first to use the method.

An example of a Bachman diagram.

So, after such an amazing contribution to the field of computer science, why hasn’t a film about Bachman been made – a tale of a strikingly innovative computer geek in the tradition of The Social Network and the soon-to-be-released Steve Jobs film? There’s no clear answer. We can only hope that Bachman has a sufficiently emotionally complicated backstory to warrant such a film – fingers crossed.

So let’s take a moment to examine the man behind this huge contribution to computer science.

Charles Bachman was an engineer rather than a computer scientist, although his greatest contribution is to computer science rather than engineering. That contribution hasn’t gone completely unnoticed (despite the absence of a film chronicling his invention). In 1973 Bachman became the 8th recipient of the A.M. Turing Award[4] – the highest honor in the field of computer science. The award is doubly appropriate here: it pays homage to Turing himself (an undoubtedly influential computer scientist who, incidentally, has three films, a documentary, a play, and a novel based on him), and its name is a near-homophone of “turning,” fitting since recipients’ work represents a turning point in the field.

While a film on Bachman has not yet been made, he is far from forgotten. In 2014 President Barack Obama awarded Bachman the National Medal of Technology and Innovation.[5] Let’s hope a movie soon follows this huge honor.


  1. Thomas Haigh, “Fifty Years of Databases,” ACM SIGMOD Blog, December 11, 2012. http://wp.sigmod.org/?p=688.
  2. Thomas Haigh, “A.M. Turing Award Winners,” ACM, accessed September 17, 2015. http://amturing.acm.org/award_winners/bachman_1896680.cfm.
  3. Ibid.
  4. Thomas Haigh, “Fifty Years of Databases,” ACM SIGMOD Blog, December 11, 2012. http://wp.sigmod.org/?p=688.
  5. “Charles W. Bachman,” Computer History Museum, accessed September 17, 2015. http://www.computerhistory.org/fellowawards/hall/bios/Charles,Bachman/.