By Tony Smith
Intel is said to be preparing a pair of dual-core Core 2 Extreme processors specifically developed for gamer-friendly notebooks, the first coming in Q2/Q3, the second arriving as a follow-up in Q4.
So claim Taiwanese motherboard-maker moles cited by Chinese-language site HKEPC. Codenamed 'Merom XE', the two 65nm parts will contain the usual 4MB of shared L2 cache and operate over an 800MHz frontside bus. The first of the two, the Core 2 Extreme X7800, will be clocked at 2.6GHz, it's claimed; the second, the X7900, will run at 2.8GHz.
Both processors are said to support SpeedStep, Virtualisation Technology and 64-bit addressing. At this stage, it's not known how much power the chips consume.
The sources alleged the X7800 will be priced at $795 when it ships late Q2/early Q3. The X7900's price is unknown.
Monday, February 12, 2007
Intel shows off Penryn chips
By Tom Krazit, CNET News.com
Intel says its 45-nanometer chips are almost ready for prime time.
The company demonstrated PCs and servers running its upcoming Penryn family of chips this week during a briefing for the press and analysts on its new transistor design for the 45-nanometer generation. Penryn is the code name for a family of desktop, notebook and server chips based on Intel's Core microarchitecture, and systems with the chips will be available before the end of this year, said CEO Paul Otellini at the event.
Penryn chips will come with the SSE4 instructions Intel announced at the Intel Developer Forum in September, said Stephen Smith, vice president and director of desktop platform operations. Smith called the new instructions "the biggest change to our instruction set in about five years," and said they improve the performance of multimedia applications and technical computing.
The Penryn chips are the first iteration of the new manufacturing strategy outlined by Otellini earlier this year. Intel wants to introduce new chip microarchitectures and manufacturing technologies on a regular two-year cadence, which the company refers to as the "tick-tock" strategy.
Penryn is essentially a shrink of the Core 2 Duo chips, with a few extras like the SSE4 instructions. It's being introduced along with the new manufacturing technology, the "tick" of Intel's plans. Then next year, when the 45-nanometer manufacturing technology is mature, Intel will introduce a new chip microarchitecture code-named Nehalem--the "tock"--with more significant changes to the chip design.
The rapid cadence is designed to ensure Intel won't get fooled again. Advanced Micro Devices caught Intel off guard earlier this decade, introducing a new chip architecture that represented a significant improvement in performance and power efficiency over Intel's chips at the time. Intel would like to avoid having to scrap years of planning again, so it is making smaller changes to its chip blueprints on a more frequent basis to keep up with the times.
The tide has started to turn back in Intel's favor with the Core 2 Duo chips. But one area where Intel has never fallen behind AMD is chip manufacturing.
Intel has been shipping chips based on its 65-nanometer manufacturing technology since late 2005, while AMD just last month introduced its first 65-nanometer chips. If Intel successfully introduces the Penryn family, it will have 45-nanometer chips out well before AMD's planned 2008 rollout of similar chips.
Smaller transistors have lots of benefits. Chip makers can improve performance by putting more transistors on the same size chip, or dial down power consumption by using smaller transistors to get the same work done. There's an economic upside as well, in that the chips themselves can be made smaller. This allows Intel and AMD to cut more chips from a single silicon wafer, reducing the cost to build an individual chip and making investors happy with fatter profit margins.
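To put rough numbers on that, here is a back-of-the-envelope sketch in Java. The 143 square millimeter die size is the Core 2 Duo figure cited in the 80-core story below; the 300mm wafer, the assumed 45 percent area shrink, and the decision to ignore edge loss and defect yield are simplifying assumptions, not figures from Intel.

```java
// Back-of-the-envelope wafer economics. All inputs are assumptions for
// illustration; real yields depend on edge loss and defect density.
public class WaferMath {
    public static void main(String[] args) {
        double waferDiameterMm = 300.0;                  // standard wafer size
        double waferArea = Math.PI * Math.pow(waferDiameterMm / 2, 2);

        double dieArea65 = 143.0;                        // mm^2, a Core 2 Duo-class die
        double dieArea45 = dieArea65 * 0.55;             // assumed shrink at 45nm

        long dies65 = (long) (waferArea / dieArea65);    // naive dies per wafer
        long dies45 = (long) (waferArea / dieArea45);

        System.out.printf("65nm: %d dies/wafer, 45nm: %d dies/wafer%n", dies65, dies45);
        System.out.printf("Per-die cost falls by roughly %.0f%%%n",
                100.0 * (1.0 - (double) dies65 / dies45));
    }
}
```

More dies from the same fixed-cost wafer is exactly the "fatter profit margins" effect described above.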
AMD has outlined plans to try to catch up to Intel, vowing to introduce its own 45-nanometer chips 18 months after its 65-nanometer chips, instead of the usual two years. Intel's Penryn demonstration puts additional pressure on that transition.
Sunday, February 11, 2007
Intel shows off 80-core processor
By Tom Krazit, Staff Writer, CNET News.com
Intel has built its 80-core processor as part of a research project, but don't expect it to boost your Doom score just yet.
Chief Technical Officer Justin Rattner demonstrated the processor in San Francisco last week for a group of reporters, and the company will present a paper on the project during the International Solid State Circuits Conference in the city this week.
The chip is capable of producing 1 trillion floating-point operations per second, known as a teraflop. That's a level of performance that required 2,500 square feet of large computers a decade ago.
Intel first disclosed it had built a prototype 80-core processor during last fall's Intel Developer Forum, when CEO Paul Otellini promised to deliver the chip within five years. The company's researchers have several hurdles to overcome before PCs and servers come with 80-core processors--such as how to connect the chip to memory and how to teach software developers to write programs for it--but the research chip is an important step, Rattner said.
A company called ClearSpeed has put 96 cores on a single chip. ClearSpeed's chips are used as co-processors with supercomputers that require a powerful chip for a very specific purpose.
Intel's research chip has 80 cores, or "tiles," Rattner said. Each tile has a computing element and a router, allowing it to crunch data individually and transport that data to neighboring tiles.
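The tile-and-router arrangement amounts to a two-dimensional mesh. The toy Java model below assumes a 10x8 grid and a simple x-then-y routing rule purely for illustration; Intel's paper describes the chip's real router, which this sketch makes no attempt to reproduce.

```java
// Toy model of an 80-tile mesh: each tile forwards data one hop at a time
// toward its destination. Grid shape and routing rule are assumptions.
public class TileMesh {
    static final int COLS = 10, ROWS = 8;                // 80 tiles

    // One hop toward 'to', moving along x first, then y (dimension-ordered).
    static int nextHop(int from, int to) {
        int fx = from % COLS, fy = from / COLS;
        int tx = to % COLS,   ty = to / COLS;
        if (fx != tx)      fx += (tx > fx) ? 1 : -1;
        else if (fy != ty) fy += (ty > fy) ? 1 : -1;
        return fy * COLS + fx;
    }

    public static void main(String[] args) {
        int src = 0, dst = COLS * ROWS - 1;              // opposite corners
        int hops = 0;
        for (int at = src; at != dst; at = nextHop(at, dst)) hops++;
        System.out.println("Corner-to-corner hops: " + hops);  // 9 + 7 = 16
    }
}
```

The appeal of such a mesh is that every tile talks only to its immediate neighbors, so the wiring stays short and local no matter how many tiles are added.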
Intel used 100 million transistors on the chip, which measures 275 square millimeters. By comparison, its Core 2 Duo chip uses 291 million transistors and measures 143 square millimeters. The chip was built using Intel's 65-nanometer manufacturing technology, but any likely product based on the design would probably use a future process based on smaller transistors. A chip the size of the current research chip is likely too large for cost-effective manufacturing.
The computing elements are very basic and do not use the x86 instruction set used by Intel and Advanced Micro Devices' chips, which means Windows Vista can't be run on the research chip. Instead, the chip uses a VLIW (very long instruction word) architecture, a simpler approach to computing than the x86 instruction set.
There's also no way at present to connect this chip to memory. Intel is working on a stacked memory chip that it could place on top of the research chip, and it's talking to memory companies about next-generation designs for memory chips, Rattner said.
Intel's researchers will then have to figure out how to create general-purpose processing cores that can handle the wide variety of applications in the world. The company is still looking at a five-year timeframe for product delivery, Rattner said.
But the primary challenge for an 80-core chip will be figuring out how to write software that can take advantage of all that horsepower. The PC software community is just starting to get its hands around multicore programming, although its server counterparts are a little further ahead. Still, Microsoft, Apple and the Linux community have a long way to go before they'll be able to effectively utilize 80 individual processing units with their PC operating systems.
"The operating system has the most control over the CPU, and it's got to change," said Jim McGregor, an analyst at In-Stat. "It has to be more intelligent about breaking things up," he said, referring to how tasks are divided among multiple processing cores.
"I think we're sort of all moving forward here together," Rattner said. "As the core count grows and people get the skills to use them effectively, these applications will come." Intel hopes to make it easier by training its army of software developers on creating tools and libraries, he said.
Intel demonstrated the chip running an application created for solving differential equations. At 3.16GHz and with 0.95 volts applied to the processor, it can hit 1 teraflop of performance while consuming 62 watts of power. Intel constructed a special motherboard and cooling system for the demonstration in a San Francisco hotel.
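A quick sanity check on those demo figures, sketched in Java: at 3.16GHz across 80 tiles, 1 teraflop works out to roughly four floating-point operations per tile per cycle, and 62 watts implies about 16 gigaflops per watt. These are just arithmetic consequences of the quoted numbers, not additional Intel disclosures.

```java
// Arithmetic check on the published demo figures: 1 teraflop, 3.16GHz,
// 80 tiles, 62 watts. No new data here, just unit conversions.
public class TeraflopCheck {
    public static void main(String[] args) {
        double flops = 1e12;           // 1 teraflop
        double clockHz = 3.16e9;       // demo clock speed
        int tiles = 80;
        double watts = 62.0;

        System.out.printf("FLOPs per tile per cycle: %.2f%n",
                flops / (clockHz * tiles));          // ~3.96
        System.out.printf("Gigaflops per watt: %.1f%n",
                flops / watts / 1e9);                // ~16.1
    }
}
```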
Java 2007: The year in preview
Open source Java programming means developers are driving -- but where to?
06 Feb 2007
2007 will go down in history as the year Sun Microsystems gave up the reins of the Java™ platform, releasing it under an open source license to the Java developer community. In this article, Java developer Elliotte Rusty Harold predicts new directions for the Java platform, in everything from scripting to bug fixing to new syntax.
");
}
}
}
//-->
2006 was another boom year for the Java platform. The Java language retained its title as the world's most used programming language, despite an onslaught of competition from both Microsoft (C#) and the scripting community (Ruby). And, while the release of Java 6 would have been cause enough for celebration, that paled in comparison to the announcement that Java was going to go fully open source under the GNU General Public License. Can the momentum continue in 2007? Let's consider the odds.
The Java platform goes open source
Before 2007 is half over, Sun will release the Java Development Kit (JDK) under an open source license. Freeing the JDK is a huge step for the Java developer community, and it will drive the evolution of the Java platform for the next decade.
Expect the quality of the JDK to improve dramatically as programmers stop merely reporting bugs and start fixing them. Bug reports at the Java Developer Connection will include detailed analysis of what's broken in the JDK and provide patches for fixing it. As Linus's Law states, "Given enough eyeballs, all bugs are shallow." That is, debugging is parallelizable. The same is true of optimization. Open source makes both massively parallelizable.
Forks in the road
Unfortunately, design is not as parallelizable as debugging and optimization. A clean API occasionally requires a dictatorial hand. The downside of dictators, however, is that sometimes they know what they're doing and sometimes they don't. Competition among would-be dictators is often the only way to discover the best solution to a problem.
Few companies can afford to develop multiple independent implementations of a product with the goal of throwing all but one away, but the open source community thrives on that sort of thing. So look for forks at all levels of the Java platform: language, virtual machine, and libraries. Most of these will fail, but that's okay. The good ideas will rise to the top. Some will take on a life of their own, and some will be merged back into the standard JDK. It probably won't be obvious by this time next year which are which, but the process should be well underway.
Sun will get the ball rolling in a few months by releasing an early beta of Java 7, Dolphin. The company can't release earlier versions of the JDK because of build problems and license encumbrances that are only cured in Dolphin. However, look for third parties to start chopping pieces out of the Sun release to produce passable, open source implementations of Java 6, Java 5, Java 1.4, and maybe even earlier versions.
Some of these early forkers will probably run afoul of Sun's trademarks and get nasty letters from the company's lawyers. We'll need a generic, untrademarked name for the language that everyone can use. I propose "J" -- hopefully no one can trademark a single letter.
Open source projects never die, they just fade away. Like the Blackdown Project before them, GNU Classpath, Kaffe, and other open source JDK projects are going to see their developers move on to other things. If a project hasn't reached 1.0 yet, it is unlikely to do so in the future.
RSA: Microsoft says Vista follow-up likely in 2009
February 09, 2007 (IDG News Service) -- With Vista just out the door, Microsoft Corp. is now drawing up plans to deliver its follow-up client operating system by the end of 2009, according to the executive in charge of building the product's core components.
That would be a much faster turnaround than Vista, which shipped more than five years after Windows XP, but Vista was exceptional, said Ben Fathi, corporate vice president of development in Microsoft's Windows core operating system division, who spoke this week at the RSA Conference 2007 in San Francisco.
Microsoft originally planned for its XP follow-up to include several radical changes to Windows, including a new file system and a reinvented user interface, but after the company's products were hit by widespread worm outbreaks in 2003, Microsoft redirected almost its entire engineering effort to locking down Windows with the XP Service Pack 2 release.
"We put Longhorn on the back burner for a while," Fathi said. "Then, when we came back to it, we realized that there were incremental things that we wanted to do, and significant improvements that we wanted to make in Vista that we couldn't deliver in one release."
Vista shipped about two and a half years after XP SP 2, and Vista's follow-up is expected to take about the same amount of time, according to Fathi. "You can think roughly two, two and a half years is a reasonable time frame that our partners can depend on and can work with," he said. "That's a good time frame for refresh."
That timeline would put Microsoft's next client operating system out by the end of 2009.
Last year, Microsoft said that the code name for this Vista follow-up is Vienna, but Fathi said he could not disclose the current name. "We've been told not to use it publicly," he said.
So what will be the coolest new feature in Vienna?
According to Fathi, that's still being worked out. "We're going to look at a fundamental piece of enabling technology. Maybe it's hypervisors, I don't know what it is," he said. "Maybe it's a new user interface paradigm for consumers."
"It's too early for me to talk about it," he added. "But over the next few months I think you're going to start hearing more and more."
Dell Laptop Burned Down My House
Dan writes in to let us know that a Dell laptop was the most probable cause of a fire that destroyed his home. We must say he seems in good spirits about it, all things considered:
On Monday morning I was heading to work like any other day... little did I know that my home was being consumed by a raging fire. I arrived at my desk to find that the phone was ringing - I needed to come home immediately was the message.
When I arrived home the entire street was blocked with fire and rescue crews. My 130 year old former farm house was engulfed in flames, with thick dark smoke pouring out of the windows and roof. Over 60 firefighters from 4 departments fought the blaze and battled equipment failures due to the near zero degree weather.
Hours later, after investigation, the fire marshal's investigator took me aside and asked me if I had a laptop computer. Yes -- I told him I had a Dell Inspiron 1200...
It was determined that the laptop, battery, or cord malfunctioned around 15 minutes after I left for work, and the fire quickly spread through the living room, the nearby dining room, and then up a stairwell into the bedrooms. Virtually the entire house suffered extensive fire, smoke, or water damage. The cold weather ensured that water quickly turned to ice, which has further burdened and damaged the structure. All of our possessions have been lost - photos, keepsakes, clothes -- everything. It looks like the house will be a total loss.
Dan is asking for our help contacting Dell, as they've been unresponsive to his queries:
" I have tried to call Dell to at least notify them of my problems, but each time I have called I get transferred into an endless loop of "Joe" or "Alan" all speaking a delectable version of English I presume emanates from Bangalore. I have been outright hung up on each time I get someone who speaks a reasonable version of English, or sounds like they might be in charge of something. Promises of call backs have gone, of course, unreturned.
Maybe you can help notify them. Or maybe pass along my new motto for Dell--
"Dude, you're getting a burned down house!"Ouch, this is so horrible. We suppose you could try emailing Dell's Customer Advocate, Marie at:Email: Customer_Advocate [at] dell [dot] com. Any suggestions from the rest of you? —MEGHANN MARCO
Dan Writes: Dear Consumerist:
I've been a reader of your site for some time. When the recent problems began with laptop batteries I read about it at consumerist.com. I own two Dell laptops -- one for my wife and one for myself -- so I immediately checked out the model numbers and batteries against Dell's website designed for the purpose.
On Monday morning I was heading to work like any other day... little did I know that my home was being consumed by a raging fire. I arrived at my desk to find that the phone was ringing - I needed to come home immediately was the message.
When I arrived home the entire street was blocked with fire and rescue crews. My 130 year old former farm house was engulfed in flames, with thick dark smoke pouring out of the windows and roof. Over 60 firefighters from 4 departments fought the blaze and battled equipment failures due to the near zero degree weather.
Hours later, after investigation, the fire marshal's investigator took me aside and asked me if I had a laptop computer. Yes -- I told him I had a Dell Inspiron 1200. I had used it briefly while waiting for my car to warm up. My wife had also used it to check her email and news before she left for work and to drop our 18-month-old daughter off. I had left the laptop in sleep mode with the lid closed on the edge of the sofa in the living room.
It was determined that the laptop, battery, or cord malfunctioned around 15 minutes after I left for work, and the fire quickly spread through the living room, the nearby dining room, and then up a stairwell into the bedrooms. Virtually the entire house suffered extensive fire, smoke, or water damage. The cold weather ensured that water quickly turned to ice, which has further burdened and damaged the structure. All of our possessions have been lost - photos, keepsakes, clothes -- everything. It looks like the house will be a total loss.
Since the incident my homeowner's insurance company has been very interested in the information about the laptop. I have tried to call Dell to at least notify them of my problems, but each time I have called I get transferred into an endless loop of "Joe" or "Alan", all speaking a delectable version of English I presume emanates from Bangalore. I have been outright hung up on each time I get someone who speaks a reasonable version of English, or sounds like they might be in charge of something. Promises of call backs have gone, of course, unreturned.
Maybe you can help notify them. Or maybe pass along my new motto for Dell--
"Dude, you're getting a burned down house!"
--dan
Saturday, February 10, 2007
PRESS RELEASE: HP's Redesigned Ink, Toner Packaging to Reduce Greenhouse Gas Emissions by 37 Million Pounds in 2007
PALO ALTO, Calif., Feb. 8, 2007 -- HP today announced that its redesigned print cartridge packaging for North America will reduce greenhouse gas emissions by an estimated 37 million pounds in 2007, the equivalent of taking 3,600 cars off the road for one year.(1)
The emissions savings are the result of smaller, lighter packages that both reduce the total carbon footprint of each cartridge and the truck and freighter transportation traffic required to ship them. Newer packaging also contains more recyclable and recycled content.
"What I see here is smart design," said Greg Norris, Ph.D., environmental life cycle assessment instructor at Harvard University and creator of the Earthster project (http://www.earthster.org/), an open source software platform designed to make opportunities for sustainable production and purchasing globally accessible. "The changes all go in the right direction environmentally and all in ways that make economic sense to HP and its customers. More power to these designers."
For retailers, the new packaging is also expected to save significant transportation and storage costs while freeing up valuable display space.
"Innovation at HP goes beyond just product design," said Pradeep Jotwani, senior vice president, Supplies, Imaging and Printing Group, HP. "Developing environmentally responsible packaging is not only valued by HP, our customers and our partners -- it's also good business."
Environmental Benefits
HP estimates its redesigned print cartridge packaging will eliminate the use of nearly 15 million pounds of materials, including 3 million pounds of corrugated cardboard in 2007.(1) The packaging also will eliminate the use of more than 6.8 million pounds of polyvinyl chloride (PVC) plastic through material reduction and substitution of recycled content plastic and paperboard.(2)
HP inkjet cartridge multipacks, for example, are now made with recycled content paperboard instead of PVC. In fact, since 2003, HP has reduced overall package weight for inkjet cartridge multipacks by 80 percent and quadrupled the number of packages that can be carried in a single truckload.
Additionally, HP inkjet cartridge photo value packs are now packaged completely in recycled paperboard instead of PVC plastic. Also, PVC has been replaced by recycled plastic (PET-RPETG) in HP inkjet cartridge tripack packaging sold in club stores.(3)
New HP LaserJet toner cartridge packaging uses 45 percent less packaging material by weight. The more compact package also contains an innovative multi-chamber air bag that protects the cartridge from transport damage, dust, moisture and light. The smaller boxes can be shipped 30 percent more efficiently -- a standard shipping pallet holds 203 cartridges instead of the previous 144.
Overall, the more efficient packaging is expected to reduce truck traffic in the United States and Canada by an estimated 1.5 million miles in 2007.(4)
Retail Benefits
Retailers also should realize savings in shelf space from the new packaging. Front-facing surface area for multipacks has been reduced by 80 percent.(5) New HP inkjet cartridge tripacks sold at club stores can be stacked three-high on shelves, as opposed to two-high in the past. And new HP LaserJet toner cartridge packaging offers retailers more than 30 percent shelf space savings.
"Environmental considerations are key to Office Depot's business," said Yalmaz Siddiqui, environmental strategy advisor, Office Depot. "We are pleased to see a manufacturer like HP make changes that are in step with our environmental objectives and can also benefit our business goals."
More information on HP's environmental programs is available at http://www.hp.com/go/environment.
Footnotes
(1) Estimates are based on projected 2007 print cartridge sales in the United States and Canada. Global warming gas (carbon dioxide equivalents) emissions reductions calculated based on anticipated 2007 sales, using packaging configurations before and after recent improvements. Environmental impacts modeled with SimaPro 7 (PRe Consultants, The Netherlands, 2006) lifecycle inventory software. Carbon equivalency factors from Intergovernmental Panel on Climate Change. Calculations from http://www.usctcgateway.net/tool/.
(2) Estimated reductions compare current to prior packaging designs, using anticipated 2007 sales.
(3) The term "tripacks" is used here to describe club store packages, most but not all of which contain three cartridges.
(4) Based on anticipated 2007 sales, shipping in full truckloads, 1,000 mile average trip distance from distribution centers in California and Virginia.
(5) Inkjet retail multipack example (display width x height): current dimensions: 4.8 inches x 6.4 inches; previous dimensions: 10.7 inches x 13.4 inches.
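Footnote 5's "80 percent" claim can be checked directly from the dimensions given. A small Java sketch, using only the numbers in the release:

```java
// Check footnote 5's "80 percent" area-reduction claim from the stated
// package dimensions. Inputs come straight from the press release.
public class PackagingCheck {
    public static void main(String[] args) {
        double oldFace = 10.7 * 13.4;  // previous multipack face, square inches
        double newFace = 4.8 * 6.4;    // current multipack face, square inches
        System.out.printf("Front-facing area reduced by %.0f%%%n",
                100.0 * (1.0 - newFace / oldFace));  // ~79%, i.e. the quoted 80%
    }
}
```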