History of Database Applications

One of the most important drivers of computer system development was the database applications that could run on them. In turn, the demand for data processing pushed the growth of processor speed.

In fact, data processing predates computers: punched cards were used in the US to tabulate census data around the turn of the 20th century, and the earliest data processing was carried out with punched cards on mechanical devices. The real advances in processing speed, data storage, and database applications began much later, from the 1950s onward.

Magnetic tape was used both to store data and to read it back. The database applications of this era were built on hierarchical and network models. They were extremely efficient for the specific queries they had been designed around, but they were not designed to handle new queries or transactions. Moreover, the data on tape had to be kept in the same sorted order for records to be retrieved correctly.

Later, in the 1960s, hard disks appeared; data retrieval became faster and data no longer had to be stored sequentially. This period also saw remarkable advances in database systems.

In 1970, Edgar Codd, the father of the relational database model, conceptualized a new structure for database construction in his groundbreaking paper 'A Relational Model of Data for Large Shared Data Banks'. He freed the database from procedural ways of querying and marked the beginning of data abstraction, i.e. hiding the details of how the database is implemented from application programmers and end users.

System R, based on Codd's concepts, was developed by IBM and was the first system to offer a query language, SQL (Structured Query Language). System R was later developed into the mainstream commercial DBMS product known as DB2.
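
To illustrate the declarative style of querying that SQL introduced, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names are invented purely for illustration and are not related to System R or DB2.

```python
import sqlite3

# In-memory database purely for illustration; the schema below is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
conn.executemany(
    "INSERT INTO employees (name, dept) VALUES (?, ?)",
    [("Alice", "Sales"), ("Bob", "Engineering"), ("Carol", "Sales")],
)

# Declarative query: we state *what* we want, not *how* to navigate the storage.
for row in conn.execute("SELECT name FROM employees WHERE dept = ?", ("Sales",)):
    print(row[0])  # prints Alice, then Carol

conn.close()
```

The point of the example is the WHERE clause: the programmer describes the desired result, and the database engine decides how to find it, which is exactly the abstraction Codd's model made possible.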

Object-oriented programming developed rapidly in the 1980s and helped give rise to what we now know as object-oriented databases. The idea was to treat data as objects, which made it easier to conceptualize and program against the database.

Other important developments were faster processors and the concept of indexing, which greatly reduced data access times and improved database performance.

The 1990s brought the World Wide Web, something unprecedented, unlike anything the world had seen before. Data was now on the internet.

The databases behind these links were varied and different, and a technique was needed to interchange data among them efficiently. Databases also had to offer very high availability, working 24x7.

XML (Extensible Markup Language) is a standard for exchanging data among different databases and web pages.
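
As a rough sketch of how XML allows two systems to exchange structured data, the following uses Python's standard xml.etree.ElementTree module. The element and attribute names are made up for the example.

```python
import xml.etree.ElementTree as ET

# A hypothetical XML document, as one system might export it.
payload = """
<customers>
    <customer id="101">
        <name>Alice</name>
        <city>Pune</city>
    </customer>
    <customer id="102">
        <name>Bob</name>
        <city>Mumbai</city>
    </customer>
</customers>
"""

# The receiving system parses the same text and recovers the structure,
# regardless of which DBMS or web page produced it.
root = ET.fromstring(payload)
for customer in root.findall("customer"):
    print(customer.get("id"), customer.findtext("name"), customer.findtext("city"))
```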

More recently, there has been a growing trend towards NoSQL databases. These differ from so-called classical databases and do not rely on the relational model for their structure. They do not query data using Structured Query Language but with alternatives such as UnQL (Unstructured Query Language), which is similar to XQuery and is still under development. These databases are generally used when working with huge quantities of data. Some examples are MongoDB, Couchbase, HBase (used by Facebook), Bigtable (used by Google), and DynamoDB (used by Amazon).
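
To give a feel for document-style querying in a NoSQL store, here is a minimal sketch using the pymongo driver for MongoDB. It assumes a MongoDB server is running locally, and the database, collection, and field names are purely illustrative.

```python
from pymongo import MongoClient  # third-party driver: pip install pymongo

# Assumes a MongoDB server is running on localhost; names below are hypothetical.
client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

# Documents are schema-less, JSON-like records rather than rows in a fixed table.
orders.insert_many([
    {"customer": "Alice", "total": 250, "items": ["laptop"]},
    {"customer": "Bob", "total": 40, "items": ["mouse", "cable"]},
])

# Querying uses a document-shaped filter instead of an SQL statement.
for order in orders.find({"total": {"$gt": 100}}):
    print(order["customer"], order["total"])

client.close()
```

Note the contrast with the earlier SQL sketch: there is no fixed schema, and the query itself is expressed as a nested document rather than a declarative SQL string.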



