Getting a new machine is always an exciting moment for customers and for the installation engineers, who often spend many hours testing and configuring the systems ready for acceptance.
Your install pictures are solicited for this entry. Email them over to the contact address.
Install timelapse videos on YouTube:
Jim reports: Here is a picture of me installing the J90 at the University of Tasmania sometime in 1995. The system has long since been scrapped.
This photo is from a Cray-1 install at USC, as printed in Channels.
These photos are from the final preparation of YMP SN1040 at Chippewa Falls in ’96 (after it was converted to a Model E IOS) but before it was shipped to Moscow for use in weather forecasting.
Photo submitted by yours truly, the software analyst in the last frame.
Vic kindly submitted these photos of the Cray XMP-EA being installed at ADNOC. Do you recognise the faces?
Tony forwarded these photos of Colin Broomham and Charlie from an XMP install at ADNOC in 1986.
Peter G. recalls: at that install Charlie did not want to pay overtime, so we only worked 8-hour shifts. But it meant that he had to send more engineers over for the install.
Tony H. adds: The thing I remember from it was that after we bolted the IOP power supplies on, we found it had moved a bit and the floor holes wouldn’t line up for the fridge hoses. We all gave it a good shove; I was surprised how easily it moved.
From Cray Channels V07_N2
Cray announces international orders In March, Cray announced that the Abu Dhabi National Oil Company (ADNOC) ordered a CRAY X-MP/14 computer system. The system, which will be purchased, is to be installed in the fourth quarter of 1985 at ADNOC’s headquarters in Abu Dhabi, subject to U.S. export license approval. The system will be used for oil reservoir engineering. ADNOC and its subsidiary companies operate large oil fields that require advanced management techniques supported by the Cray supercomputer.
EPCC are happy to welcome a Cray Shasta system, “Archer2”, in 2020/21.
Unboxing a supercomputer – “Magnus” XC30 at Pawsey Supercomputing Centre in Australia.
Install photos from DKRZ, the German Climate Computing Centre.
CERN XMP/48 January 1987
YMP cab being installed next to a red XMP. Site and personnel details lost; let me know if you know where this was being installed.
Cray-1 SN3 and an XMP going in at NCAR. Note that SN3 has now (2022) moved from its original delivery location at the Mesa Lab over to NCAR’s new data centre in Wyoming.
Cray-2 going in at the CMOA
From http://www.ucar.edu/communications/staffnotes/9612/C90.html [All photos by Carlye Calvin.]
UCAR > Communications > Staff Notes > December 1996
A new C-90, J90, and STK silo arrive for climate modeling at NCAR
NCAR’s climate system model (CSM) and other integrated models have a spacious new home. Over Thanksgiving week, a CRAY Y-MP8I supercomputer leased by NCAR since 1991 was replaced by a CRAY C-90. The new machine, dubbed antero like the one it replaced, has 256 million words of memory and 16 processors, twice as many as its predecessor. It can produce up to 5 billion floating-point operations per second.
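For readers more used to bytes than Cray words, a quick back-of-the-envelope conversion puts these figures in modern units. This is a hypothetical sketch, assuming the Cray's standard 64-bit (8-byte) word, which is not stated in the article itself:

```python
# Back-of-the-envelope conversion of the C-90 specs quoted above.
# Assumption: a Cray word is 64 bits (8 bytes).
WORD_BYTES = 8

memory_words = 256_000_000          # "256 million words of memory"
memory_bytes = memory_words * WORD_BYTES
print(f"Memory: {memory_bytes / 1e9:.1f} GB")    # roughly 2 GB

processors = 16                     # twice the Y-MP8I's 8
peak_flops = 5e9                    # "5 billion floating-point operations per second"
print(f"Peak per processor: {peak_flops / processors / 1e6} MFLOPS")
```

So the headline numbers work out to about 2 GB of memory and roughly 312 MFLOPS of peak per processor, modest by today's standards but formidable in 1996.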
“We believe it’ll give us a factor of three to four increase in speed over the previous antero,” says Bill Buzbee, SCD director. As the linchpin of the Climate System Laboratory (CSL), the new Cray will be dedicated to extensive climate simulations.
The C-90 has arrived at SCD through an 18-month extension of the previous lease arrangement for the Y-MP8I with Cray Research, Inc. The old lease had been due to expire in mid-1997.
The arrival of the C-90 comes after a stalled process for longer-term acquisition of a new supercomputer for climate modeling. In May, NCAR announced the selection of the Japan-based NEC Corp. to provide four large vector supercomputers over five years. However, the acquisition has been put on hold pending a formal complaint by Cray. Investigations are now under way by the International Trade Commission and the U.S. Commerce Department. Decisions are not expected until well into 1997.
“We spent the better part of two years developing requirements for supercomputing support for the CSM,” says Jeff Reaves, UCAR associate vice president for finance and administration. “The extension of our lease with Cray enables us to provide the additional computing power we need for the Climate Modeling, Analysis, and Prediction program and other modeling projects.”
The C-90 is in its testing-and-acceptance phase through December. Shortly after the two-day hardware installation and checking process, software was delivered and installed. The next step was to ask “friendly” users (primarily NCAR-based modelers) to begin running familiar programs and verify that the algorithms worked as expected. “Typically on a big mainframe, the acceptance tests take one or two weeks,” says Bo Connell, head of SCD’s Computer Production Group (formerly the Operations Group). •BH
A new C-90 arrives for climate modeling
Above and below: Workmen lower the C-90 into SCD’s staging area.
Before the C-90’s bright blue “skin” was attached, the computer’s innards were in full view.
The C-90 weighs in at 256 million words of memory with 5 gigaflops. On the left is the machine’s solid-state storage device; at right is its central processing unit.
Next step: a J90
Hot on the heels of the C-90, another powerful new computer is joining the ranks in SCD, this one for the overall user community. Just before press time, NSF gave its approval for acquisition of a CRAY J90 to be delivered this month. NCAR already has two other J90s, one with 20 processors devoted to the CSL and the other with 16 processors for users at large. The new machine also has 20 processors, but it boasts more random-access memory than any other NCAR machine to date: a billion words. “The large memory will offer community users the opportunity to run much larger jobs than they can now run on the [community] Y-MP,” says Bill Buzbee.
This is one of NCAR’s two current J90 models, soon to be joined by a third. The new J90 earned a strong recommendation from the SCD advisory panel. Purchased outright rather than leased, it should be on hand for two or three years, said Buzbee. “If we can keep the old community Y-MP running as well, then the net effect will be a 50 percent increase in community computing power.”
When SCD got its first robotic silo from StorageTek in 1989, the machine was dubbed “Fred,” as in Flintstone. (For better or worse, the name didn’t stick.) The silo’s job was to store up to 6000 data-stuffed cartridges and provide access to them for users of NCAR supercomputers. Visitors were soon able to watch Fred at work via an internal camera and a lobby-mounted monitor.
Now Fred has company. A second silo arrived at SCD in late August and is currently going through its final stages of testing. From the outside, it looks identical to its predecessor. Inside, though, it’s a silo on steroids. Its cartridges hold up to 50 gigabytes of data, compared to 800 megabytes for the older cartridges. Thus, the new machine eventually could hold up to 60 times more data than the old one does in its present configuration.
“The robotics are identical. The only difference is in the cartridges and in the tape drives attached to the silo,” says Gene Harano, head of the SCD Mass Storage Systems group. The old silo has 16 drives, while the new one has 8 drives configured to accept the new, higher-capacity cartridges. If budgets allow, 8 additional drives could be added to the new silo later on.