PC's David Battles UNIX Goliath

Hey LINUX, Why Can't We All Just Be Friends?

Who would dispute that 3-D seismic technology revolutionized the oil and gas industry, a feat made possible by the now-ubiquitous computer?

We have come a long way, baby -- and, all things considered, it's been a fairly quick trip.

Zoom back to 1982

That's when history was made in the petroleum industry, when the first 3-D seismic interpretation and visualization workstation debuted in the marketplace. The DOS-based "Landmark III" system was roughly the size of a sideways refrigerator, weighed more than 1,000 pounds and could hold 440 megabytes of data.

It sported a price tag of $250,000.

Fast forward to 1999

The contemporary office landscape includes workstations for less than half that price perched on desktops, beside personal computers that hum along with 15 or more gigabytes of available storage.

It's these personal computers that are creating big waves today in the information systems milieu, providing even the smallest player with inexpensive, yet powerful tools for state-of-the-art geoscience applications.

The momentum was kick-started with the advent of the 32-bit Windows NT operating system, which gave PC users a platform that now rivals UNIX in its ability to run large, mission-critical applications.

And -- noteworthy during these trying times in the oil and gas industry -- the price is right.

"I use both a UNIX (workstation) and a PC," says geophysicist Steve Anderson, president of Kinnickinnick Exploration, "but I paid about $120,000 for the hardware and software of the workstation, which is about eight times more costly than the PC."

Mixed Environments

Given the up-front cost advantage of the easy-to-use NT platform -- along with the fact that the high-maintenance UNIX is not standardized and has a steep learning curve, according to Anderson -- the growing popularity of the PC for geoscience applications, particularly among the smaller independents, comes as no surprise.

Cost aside, it's an impressive workhorse.

"I can do 80 percent of what's needed on a PC -- the bread and butter, the important things -- plus it's easier to load data and faster," Anderson said. "And there seems to be a greater acceptance from companies to see a prospect on a PC."

Not to mention that a laptop beats a cumbersome workstation any day when in-field processing and geometry QC are required during seismic data acquisition in the middle of the jungle.

Still, the 20-something percent of work that can't be done with a PC means that the powerful UNIX system is alive and well -- and suggests why Anderson's dual office setup is fast becoming more the rule than the exception.

"Mixed environments are more common than homogeneous environments, and we're seeing a lot of dual usage today," said Bill Curtis, director of computer science at Landmark. "UNIX is still the only way to do very large-scale processing.

"Very big servers consisting of 128 processors or even more are not unusual on UNIX, whereas NT servers are commonly available with up to four processors."

He noted that the scale of data, especially in seismic applications, can be so vast that it demands the 64-bit memory addressability available on UNIX.
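To put that in perspective: a 32-bit system can directly address only 2^32 bytes, about 4 gigabytes, while even a modest 3-D survey volume can run several times that size. A back-of-the-envelope sketch in C makes the arithmetic concrete (the survey dimensions below are hypothetical, chosen purely for illustration):

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* A 32-bit pointer can address at most 2^32 bytes (~4 GB). */
    uint64_t limit32 = (uint64_t)1 << 32;
    printf("32-bit address-space ceiling: %llu bytes (~4 GB)\n",
           (unsigned long long)limit32);

    /* Hypothetical 3-D survey: 2,000 inlines x 2,000 crosslines
       x 1,500 samples per trace, stored as 4-byte floats. */
    uint64_t inlines = 2000, crosslines = 2000, samples = 1500;
    uint64_t bytes = inlines * crosslines * samples * sizeof(float);
    printf("Survey volume: %llu bytes (~%.0f GB)\n",
           (unsigned long long)bytes, bytes / 1e9);

    /* The volume dwarfs the 32-bit ceiling, hence 64-bit UNIX. */
    return 0;
}
```

That notional survey works out to roughly 24 gigabytes, about six times what a 32-bit address space can hold in one piece.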

The coming debut of Windows 2000 looms as a defining moment in the PC world for myriad industries, including oil and gas.

"One thing certainly planned for Windows 2000 is enhanced support for using multiple displays on the same computer, which is really important in the business because you have massive amounts of highly technical data you want to display in three dimensions," Curtis said. "Having more pixels available allows the application developers more freedom to make customers' jobs easier."

While this can be done today, he emphasized, it will be better and more flexible with Windows 2000, which will be a more powerful platform than NT.
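As a rough illustration of what multi-display support means to an application developer, the sketch below uses the Win32 EnumDisplayMonitors call to walk the attached displays and report the pixel real estate each one offers. (This is a minimal sketch of the multiple-monitor API, not a description of any vendor's interpretation software.)

```c
#include <windows.h>
#include <stdio.h>

/* Callback invoked once per attached display. */
static BOOL CALLBACK ReportMonitor(HMONITOR mon, HDC hdc,
                                   LPRECT rc, LPARAM count)
{
    (void)mon; (void)hdc;      /* unused in this sketch */
    printf("Display at (%ld,%ld): %ld x %ld pixels\n",
           rc->left, rc->top,
           rc->right - rc->left, rc->bottom - rc->top);
    ++*(int *)count;
    return TRUE;               /* keep enumerating */
}

int main(void)
{
    int displays = 0;
    /* Walk every display attached to the desktop. */
    EnumDisplayMonitors(NULL, NULL, ReportMonitor, (LPARAM)&displays);
    printf("%d display(s) available for 3-D views\n", displays);
    return 0;
}
```

More attached displays simply mean more rectangles reported, and more pixels across which an interpretation package can spread its sections, maps and 3-D views.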

LINUX's Potential

There's a lot of proverbial breath holding among the tech-savvy of the world as they await this new operating system.

"Windows 2000 must come out stable and crash-proof," asserted Arthur Paradis, president of Dynamic Graphics, which has been quick to take advantage of the merits of the PC for its software products.

"Running large jobs with large amounts of data, you must have a certain level of reliability -- and this is a key issue that has all industries concerned."

There's an intriguing alternative operating system, though, which is attracting the attention of both the PC manufacturers and the business community.

Dubbed LINUX, it was spawned in 1991 when a Finnish college student, Linus Torvalds, began writing his own UNIX-like operating system. The ensuing software product, which can substitute for established UNIX systems, can be downloaded off the Internet free of charge.

It became a sort of overnight sensation among the software programming community and, in short order, the business world. Most of the leading hardware vendors have announced plans to pre-install it on at least some of their systems.

LINUX is known as freeware or open-source software, meaning it's not just free but is available complete with its source code. Because of this, programmers are continually improving and adding features to it, which are then made available to users via the Internet -- a far different scenario than the typical annual, or even less frequent, release of updated versions of commercial software.

The fledgling software's formidable potential was demonstrated when it was used to run 68 PCs as a single parallel processing machine at Los Alamos National Laboratory, where it reached a peak speed of 19 billion calculations per second.

After three months, it still did not require a re-boot.
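The technique behind such clusters is message passing: each PC grinds on its own slice of the job, and the partial results are combined at the end. A minimal sketch, assuming the widely used MPI library (the workload here is a toy sum standing in for real seismic number-crunching):

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, nodes;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which node am I? */
    MPI_Comm_size(MPI_COMM_WORLD, &nodes);  /* how many of us?  */

    /* Each node sums its own interleaved share of 1..1000. */
    long local = 0, total = 0;
    for (long i = rank + 1; i <= 1000; i += nodes)
        local += i;

    /* Combine every node's partial sum on node 0. */
    MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("%d node(s) combined: total = %ld\n", nodes, total);

    MPI_Finalize();
    return 0;
}
```

Launched across dozens of networked PCs under an MPI runtime, the same binary runs on every node; to the program, the cluster behaves like one large machine.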

There are still relatively few software applications designed to run on LINUX, but that is beginning to change -- and the shortage seems not to dampen users' enthusiasm for its flexibility and resistance to crashing.

Setting up a dual-boot PC is not a job for novices, according to Anderson, but the ability to boot either LINUX or NT brings added versatility to the machine.

Paradis noted that much of the UNIX software can be ported to the PC via LINUX. This offers the user a way to learn the UNIX operating system on a PC, meaning beginners can make the usual mistakes without the risk of destroying a very expensive workstation.

Given the current mix of UNIX and NT platforms in various far-flung locales, it is advantageous for the oil and gas explorationist to be fluent in both operating systems.

Help Is On the Way

Regardless of the quasi-revolutionary aspects of Windows 2000, LINUX and whatever as-yet-unknown operating systems may be waiting in the wings, the switchover from UNIX is slow-going and may never be complete. Nor does it necessarily have to be.

The maintenance and interface issues that must be addressed when large companies adopt new platforms are vast. A company with a strong investment in UNIX -- say, 200 machines -- may well opt for a few more of the same when more capacity is needed.

Even a seemingly simple, all-PC environment is no slam-dunk when it comes to installation and maintenance. Unless all of the machines are networked -- which costs the company time and money -- there is no automatic updating of data, and users must export and import pertinent files among themselves.

In the case of a combination UNIX/PC environment, the two systems don't "talk" to one another.

But help is on the way.

Curtis said his company is now providing enhanced UNIX/NT interoperability for its exploration, production and data management solutions, and that it will continue development and support for both platforms.

"For the foreseeable future," he noted, "it's UNIX and NT, not UNIX or NT."

But the current petroleum industry downturn will, if anything, promote a more pronounced shift toward the less expensive, increasingly powerful PC side of the business -- as Anderson succinctly put it:

"All these people being laid off will buy the PC stuff."
