It’s hard to believe that just 25 years ago seismic interpretation was a manual process. Rather than workstations, provisioning a geoscientist meant providing a solid drafting table, good light, plenty of colored pencils, and paperweights. The process of developing a prospect could take months or even years. The advent of computer-based systems was a boon to the industry and has drastically reduced workflow times and the risk of drilling dry holes. The infrastructure evolved from large, expensive turnkey systems shared by many geoscientists to large, isolated, individual workstations. For data backup, disk was expensive and tape ubiquitous.
The client/server model coupled with shared storage decreased costs, reduced the use of tape, and streamlined access to data. Thin client technology, which puts minimal hardware on the desktop and relies on the computing power of the server, was yet another innovation used to reduce costs. One could argue that today we have come back to thick clients, with “fat” (large-memory, multi-CPU) Windows- and Linux-based workstation applications.
A new IT revolution is upon us – the shift towards cloud computing. The private cloud, deployed on …