FORTRAN: the primary development tool for supercomputers

by Marius Marinescu / 21 September

The list of high-tech tools in continuous use since the 1950s isn’t very long: the Fender Telecaster, the B-52, and Fortran.

Fortran (which started life as FORTRAN, or FORmula TRANslator) was conceived by IBM programmer John Backus in 1953, and the first compiler was delivered in 1957. By the time John F. Kennedy was inaugurated, FORTRAN II had been released and the language had the features with which it would become the predominant programming language for scientific and engineering applications. To a nontrivial extent, it still is.

Whereas COBOL was created to be a general-purpose language suited to business and government applications in which reports and human-readable output were key, FORTRAN was all about manipulating numbers and numeric data structures.
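
To make that numeric orientation concrete, here is a minimal sketch of the formula-style arithmetic the language was built around, written in modern free-form Fortran (the array size and coefficients are invented for illustration):

    ! Minimal sketch of Fortran's numeric orientation: whole-array
    ! "formula translation". The size and coefficients are invented.
    program numeric_demo
      implicit none
      integer, parameter :: n = 5
      real    :: x(n), y(n)
      integer :: i

      x = [ (real(i), i = 1, n) ]       ! fill x with 1.0, 2.0, ..., 5.0
      y = 3.0 * x**2 + 2.0 * x + 1.0    ! evaluate a formula over the whole array

      print *, 'dot product of x and y:', dot_product(x, y)
    end program numeric_demo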

Its numeric capabilities made Fortran the language of choice for the first generation of high-performance computers, and it has remained the primary development tool for supercomputers: platform-specific versions of the language power applications on supercomputers from Burroughs, Cray, IBM, and other vendors.

Of course, if the strength of Fortran was the power of its mathematical processing, its weakness was getting data into and out of the program. Many Fortran programmers have horror stories to tell, most centering on the “FORMAT” statement that serves as the basis of input and output.
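
For readers who never had the pleasure, here is a minimal sketch of FORMAT-driven I/O in modern free-form Fortran; the edit descriptors are standard, but the data layout and labels are invented for illustration. Because every value must line up with a fixed-width field, a single misplaced column silently shifts every value that follows it, which is where most of the horror stories begin.

    ! Minimal sketch of FORMAT-driven input and output.
    ! Three temperatures are read from fixed-width 8-column fields;
    ! the data layout and labels are invented for illustration.
    program format_demo
      implicit none
      real    :: temps(3)
      integer :: i

      read (*, '(3F8.2)') temps         ! expects e.g. '  101.50   99.75  102.25'

      do i = 1, 3
         write (*, 100) i, temps(i)
      end do
    100 format ('Sample ', I2, ': ', F8.2)
    end program format_demo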

While many scientific applications have begun to move to C++, Java, and other modern languages because of the wide availability of both function libraries and programming talent, Fortran remains an active part of the engineering and scientific software development world.

If you’re looking for a programming language in use on everything from $25 computers that fit in the palm of your hand to the largest computers on earth, you have only a couple of choices. If you want that programming language to be the same one your grandparents might have used when they were beginning their careers, then there’s only one option. But that option is not necessarily the safest one.

Some professionals argue that legacy systems significantly increase security incidents in the organizations that run them. Others disagree and argue that legacy systems are “secure by antiquity”: because adequate documentation is scarce, it is difficult and costly for potential attackers to discover and exploit security vulnerabilities in those systems.

New research is turning on its head the idea that legacy systems written in languages such as COBOL and Fortran are more secure because hackers are unfamiliar with the technology.

Recent studies have found that these outdated systems, which may not be encrypted or even documented, are more susceptible to threats.

By analyzing publicly available federal spending and security breach data, the researchers found that a 1% increase in the share of new IT development spending is associated with a 5% decrease in security breaches.

In other words, federal agencies that spend more on maintaining legacy systems experience more frequent security incidents, a result that contradicts the widespread notion that legacy systems are more secure. That is because integrating legacy systems makes the whole enterprise architecture overly complex and messy.

A significant share of public IT budgets is spent maintaining legacy systems, even though these systems often pose serious security risks: many cannot support current best practices such as data encryption and multi-factor authentication, which leaves them particularly vulnerable to malicious cyber activity.

There is no simple solution for addressing these legacy systems, but one option is to move them to the cloud. Migrating legacy systems to the cloud offers some security advantages over running them on premises, because cloud vendors have more resources and capability to guard valuable information than their clients do. Cloud vendors use common IT platforms to achieve economies of scale and scope in producing and delivering IT services to a large number of client organizations.

Thanks to those economies of scale and scope, it is far more feasible for vendors to field dedicated information security teams to protect clients’ systems across those common IT platforms; a single client organization is unlikely to be able to afford even a fraction of such a team. Cloud vendors are also better able to attract, motivate, promote, and retain top security talent, which matters as the threat landscape continually evolves. By contrast, the legacy-system environment of a client organization is unlikely to offer attractive, sustainable career paths for security professionals who want to keep developing and advancing their skills and knowledge: in legacy environments, IT professionals spend most of their careers maintaining and operating specific legacy systems and have fewer opportunities to learn about emerging technologies.

Migrating legacy systems to the cloud also requires standardizing the IT interfaces in the client’s enterprise architecture: to connect to the cloud and use its common, standardized IT services, a client organization must adhere to the standards the vendor mandates. Those highly standardized interfaces in turn make it easier and less costly for the cloud vendor to apply common security governance and control mechanisms, guarding the sensitive information exchanged at the access and interaction points around the enterprise architecture.