Thursday, March 22, 2007

Raytheon Develops World's First Polymorphic Computer

EL SEGUNDO, Calif., March 20, 2007 -- The world's first computers whose architecture can adopt different forms depending on their application have been developed by Raytheon Company (NYSE: RTN).

The architecture of the MONARCH processor with key elements identified

Dubbed MONARCH (Morphable Networked Micro-Architecture) and developed to address the large data volumes of sensor systems as well as their signal and data processing throughput requirements, it is the most adaptable processor ever built for the Department of Defense, reducing the number of processor types required. It performs as a single system on a chip, significantly reducing the number of processors required for computing systems, and it can be deployed as an array of chips for teraflop-class throughput.

"Typically, a chip is optimally designed either for front-end signal processing or back-end control and data processing," explained Nick Uros, vice president for the Advanced Concepts and Technology group of Raytheon Space and Airborne Systems. "The MONARCH micro-architecture is unique in its ability to reconfigure itself to optimize processing on the fly. MONARCH provides exceptional compute capacity and highly flexible data bandwidth capability with beyond state-of-the-art power efficiency, and it's fully programmable."

In addition to the ability to adapt its architecture for a particular objective, the MONARCH computer is also believed to be the most power-efficient processor available.

"In laboratory testing MONARCH outperformed the Intel quad-core Xeon chip by a factor of 10," said Michael Vahey, the principal investigator for the company's MONARCH technology.

MONARCH's polymorphic capability and super efficiency enable the development of DoD systems that need very small size, low power, and in some cases radiation tolerance for such purposes as global positioning systems, airborne and space radar and video processing systems.

The company has begun tests on prototypes of the polymorphic MONARCH processors to verify they'll function as designed and to establish their maximum throughput and power efficiency. MONARCH, containing six microprocessors and a highly interconnected reconfigurable computing array, provides 64 gigaflops (floating point operations per second) with more than 60 gigabytes per second of memory bandwidth and more than 43 gigabytes per second of off-chip data bandwidth.
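The per-chip figure above gives a sense of what the multi-chip configuration implies. As a rough sanity check (our own arithmetic, not from the press release), reaching teraflop throughput from 64-gigaflop chips would take an array of roughly this size:

```python
# Back-of-the-envelope check (not from the press release): how many
# 64-gigaflop MONARCH chips would an array need for teraflop throughput?
import math

GFLOPS_PER_CHIP = 64    # per-chip throughput quoted above
TARGET_GFLOPS = 1000    # one teraflop

chips_needed = math.ceil(TARGET_GFLOPS / GFLOPS_PER_CHIP)
print(chips_needed)  # 16 chips for just over a teraflop
```

This assumes perfect scaling across chips, which real interconnects never quite deliver, so treat 16 as a lower bound.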

The MONARCH processor was developed under a Defense Advanced Research Project Agency (DARPA) polymorphous computing architecture contract from the U.S. Air Force Research Laboratory. Raytheon Space and Airborne Systems led an industry team with the Information Sciences Institute of the University of Southern California to create the integrated large-scale system on a chip with a suite of software development tools for programs of high value to the Department of Defense and commercial applications. Besides USC major subcontractors included Georgia Institute of Technology, Mercury Computer Systems and IBM's Global Engineering Solutions division.

Raytheon Space and Airborne Systems is the leading provider of sensor systems giving military forces the most accurate and timely information available for the network-centric battlefield. With 2006 revenues of $4.3 billion and 12,000 employees, SAS is headquartered in El Segundo, Calif. Additional facilities are in Goleta, Calif.; Forest, Miss.; Dallas, McKinney and Plano, Texas; and several international locations.

Raytheon Company, with 2006 sales of $20.3 billion, is an industry leader in defense and government electronics, space, information technology, technical services, and business and special mission aircraft. With headquarters in Waltham, Mass., Raytheon employs 80,000 people worldwide.

(c) www.shoutwire.com

Tuesday, March 20, 2007

DRAM prices continue to plummet

Mark LaPedus

SAN JOSE, Calif. — Prices for DRAMs continue to plummet, as the tags for mainstream devices have fallen by a staggering 44 percent since the beginning of 2007, according to a report from Gartner Inc.

Average DRAM spot prices across all densities were down 6.5 percent for the seven-day period ended March 16, compared to the previous period, according to Gartner. Average spot prices stood at $3.67 on a 512-megabit basis for the period, down 39 percent since the beginning of 2007, according to the firm.
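Gartner's figures can be cross-checked with a quick calculation (our own arithmetic; it assumes the 39 percent decline is measured against the start-of-2007 spot price):

```python
# Rough check of Gartner's figures: what spot price at the start of 2007
# is implied by a 39 percent decline to $3.67?
current_price = 3.67   # USD per 512 Mbit, week ended March 16
drop = 0.39            # decline since the beginning of 2007

start_of_year_price = current_price / (1 - drop)
print(round(start_of_year_price, 2))  # roughly 6.02 USD
```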

Prices for mainstream 512-Mbit DDR2-based chips are down 44 percent since the beginning of this year. ''Ample supply in the market and little fear of a tightening of supply gave the overall market a negative outlook,'' said Andrew Norwood, an analyst with Gartner.

By the beginning of February the DRAM market had crashed: average selling prices (ASPs) had already fallen by 30 percent since the start of the year, matching the decline originally projected for all of 2007.

At that time, vendors insisted that the DRAM free-fall was temporary, claiming that a rebound was due in the second half of 2007, thanks in part to Microsoft's Vista operating system software.

In general, the memory market is lousy. The NAND flash market is also "brutal," according to Intel Corp. Some believe the ASPs on NAND chips will decline 65 percent this year.

(c) www.eetimes.com


Thursday, March 15, 2007

NVIDIA GeForce 8600-Series Details Unveiled

by Anh Huynh

NVIDIA prepares its next-generation mid-range and mainstream DirectX 10 GPUs

Earlier today DailyTech received its briefing on NVIDIA’s upcoming GeForce 8600GTS, 8600GT and 8500GT graphics processors. NVIDIA’s GeForce 8600GTS and 8600GT are G84-based GPUs and target the mid-range markets. The lower-positioned G86-based GeForce 8500GT serves as the flagship low- to mid-range graphics card.
The budget-priced trio features full support for DirectX 10, including pixel and vertex shader model 4.0. NVIDIA has yet to reveal the number of shaders or the shader clocks, though. Nevertheless, the trio supports NVIDIA SLI and PureVideo technologies.


NVIDIA GeForce 8600GTS

NVIDIA GeForce 8600GT

NVIDIA touts three dedicated video engines on the G84 and G86-based graphics cards for PureVideo processing. The video engines provide MPEG-2 high-definition and WMV HD video playback at resolutions up to 1080p. G84 and G86 support hardware-accelerated decoding of H.264 video as well; however, NVIDIA makes no mention of VC-1 decoding. G84 and G86 also feature advanced post-processing video algorithms. Supported algorithms include spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal / 5-tap vertical video scaling.
At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU. Add-in board partners will have to purchase separate EEPROMs with HDCP keys; however, all GeForce 8600GTS-based graphics cards feature support for HDCP.
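Those memory specifications pin down the card's peak bandwidth. A quick estimate (our own calculation; it assumes the quoted 1000 MHz is the physical GDDR3 clock, and GDDR3 transfers data on both clock edges):

```python
# Peak memory bandwidth estimate for the GeForce 8600GTS (our own
# arithmetic; assumes 1000 MHz is the physical GDDR3 clock and that
# GDDR3 moves data twice per clock cycle).
bus_width_bits = 128
memory_clock_hz = 1_000_000_000   # 1000 MHz
transfers_per_clock = 2           # double data rate

bytes_per_second = (bus_width_bits / 8) * memory_clock_hz * transfers_per_clock
print(bytes_per_second / 1e9)  # 32.0 GB/s
```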
GeForce 8600GTS-based graphics cards require an eight-layer PCB. Physically, the cards measure 7.2 x 4.376 inches and are available in full-height form only. NVIDIA GeForce 8600GTS graphics cards feature a PCIe x16 interface, unlike ATI’s upcoming RV630. GeForce 8600GTS-based cards still require external PCIe power. NVIDIA estimates total board power consumption at around 71 watts.
Supported video output connectors include dual dual-link DVI, VGA, SDTV and HDTV outputs, and analog video inputs. G84-based GPUs do not support a native HDMI output. Manufacturers can adapt one of the DVI-outputs for HDMI.
NVIDIA’s GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration has more flexibility, letting manufacturers decide between 256MB or 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional. The GPU and reference board design support the required HDCP-key EEPROM; however, the implementation is up to NVIDIA’s add-in board partners.
GeForce 8600GT-based graphics cards require only a six-layer PCB instead of the eight-layer PCB of the 8600GTS. The physical board is also smaller, measuring 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power. NVIDIA rates the maximum board power consumption at 43 watts, 28 watts less than the 8600GTS.
The GeForce 8600GT supports the same video outputs as the 8600GTS; however, the 8600GT does not support video input features.
NVIDIA has revealed very little information on the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well.
Expect NVIDIA to pull the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter in time to take on AMD’s upcoming RV630 and RV610.

(c)  www.dailytech.com

Thursday, February 22, 2007

Gates: Vista Has Been "Incredibly Well Received"

by Brandon Hill

Microsoft Chairman Bill Gates lays praise on Windows Vista

Yesterday, DailyTech reported that Microsoft CEO Steve Ballmer was cautious about "overly aggressive" forecasts for Windows Vista. Ballmer went on to say that Vista’s slow retail start can be attributed to piracy, which has become increasingly popular in emerging markets.

It appears that Microsoft Chairman Bill Gates and Ballmer haven't had much communication on the matter recently. Reuters asked Gates about any trepidation Microsoft might have about the outlook for Vista to which he responded "I don't know what you mean. Vista's had an incredible reception."

Gates deflected the questioning and instead decided to focus on what he sees as positive progress for Microsoft's newest consumer operating system. "The reviews have been fantastic. This is a big, big advance in the Windows platform. It's the world's most used piece of software... Overall, the reliability feedback has been well better than we expected," said Gates.

"People who sell PCs have seen a very nice lift in their sales. People have come in and wanted to buy Vista," Gates continued.

Gates is right about the lift in PC sales. According to NPD, PC unit shipments were up 67% the week Vista launched in comparison to the same period in 2006. That is a key measure for Microsoft as 80% of its OS revenue comes from PC OEMs. Vista's retail performance, however, was down 60% in comparison to Windows XP's opening week in 2001.

(c) www.dailytech.com

Intel Pulls 45nm Xeon Launch Into 2007

by Kristopher Kubicki

Intel promises 45nm server processors this year

Earlier today, Intel revealed to DailyTech more details regarding 45nm server products, including launch windows and compatibility.
Kirk Skaugen, general manager of Intel's Server Platform Group, opened his statements with "We were originally in the Q1'08 timeframe. Today I'm happy to announce to report for the first time that our server 45nm Xeon products based on the Penryn core will be available into production for the second half of 2007."
Intel's latest desktop guidance claims 45nm desktop SKUs will also launch in late 2007, with volume shipments occurring in 2008.  As it stands right now, only the mobile 45nm SKUs are expected to launch in 2008.
Skaugen also confirmed that Penryn-based Xeon processors will utilize the same server platform as Xeon 5000, 5100 and 5300.  Nehalem, Intel's next-generation micro architecture on the 45nm node slated for 2008, will require new platform technology and is not compatible with the Penryn platform. 
45nm quad-core Harpertown and dual-core Wolfdale were originally slated to spearhead the next-generation Xeon launch in Q1 2008. The existing Bensley platform, with the Intel 5000P chipset, will still provide the heavy lifting for volume dual-socket 45nm Xeon. A new platform, Cranberry Lake, will replace Bensley-VS for value dual-socket Intel platforms, and will support Harpertown and Wolfdale.
Intel hinted earlier this year it might pull some of its launches in after the Penryn tape-out proved slightly more successful than anticipated.

(c) www.dailytech.com

Sunday, January 7, 2007

Asus launches XG Station, the world’s first external graphics card for laptops

by Doug Berger

Asus XG Station

Today at CES Unveiled, we had a chance to look at Asus’ new XG Station - an external graphics card station for your laptop. The unit includes USB 2.0 ports and a Dolby headphone jack, and supports both HDCP and HDMI for all of your high-def enjoyment. You get that? You can plug your regular laptop into the XG Station, plug the XG Station into an HD monitor, then watch your screen in awe. According to Asus, “Lab experiments on a notebook based on Intel 945GM graphics connected to the XG station with an ASUS EN7900GS graphics card showed an astounding 9 times increase in acceleration.”

The XG Station not only adds extra graphics, but it’s easy on the eyes with its LED information display - showing volume, clock speed, GPU temperature, Dolby status, Frames Per Second (FPS) and more.

(c) www.gadgetell.com

Dell's Black Ice Finds Home in New Quad-Core XPS Desktop?

An anonymous tipster sent us this flyer of Dell's mysterious Black Ice technology. Apparently it's a two-stage thermoelectric liquid cooling solution that will be part of Dell's forthcoming XPS 710 H2C, which our tipster says will come with an Intel Core 2 Extreme QX6700 CPU and two GeForce 8800 GTX cards (in SLI configuration). We'll keep our eyes peeled for the full scoop as the show progresses. – Louis Ramirez

(c) www.gizmodo.com
