Mickey getting ready for the 1934 Macy's Thanksgiving parade. Note the Glendale, CA Grand Central Terminal airport hangars in the background.
Friday, November 28, 2008
Thursday, November 27, 2008
Tuesday, November 25, 2008
Redstone to Sell National Amusements Theater Chain?
1,500 screens on the block to pay off debt....
http://www.nytimes.com/2008/11/25/business/media/25redstone.html?_r=1&oref=slogin
Monday, November 24, 2008
A Fiat with Megaphones
Using sound to drive Rome's starling population away...
http://online.wsj.com/article/SB122748681405851845.html
NFL in 3D
By SARAH MCBRIDE
With sports fans still getting used to their high-definition television sets, the National Football League is already thinking ahead to the next potential upgrade: 3-D.
Next week, a game between the San Diego Chargers and the Oakland Raiders will be broadcast live in 3-D to theaters in Los Angeles, New York and Boston. It is a preliminary step on what is likely a long road to any regular 3-D broadcasts of football games.
The idea is a "proof of concept," says Howard Katz, NFL senior vice president of broadcasting and media operations. "We want to demonstrate this and let people get excited about it and see what the future holds."
The several hundred guests at the three participating theaters Dec. 4 will include representatives from the NFL's broadcasting partners and from consumer-electronics companies. Burbank, Calif.-based 3ality Digital LLC will shoot the game with special cameras and transmit it to a satellite. Thomson SA's Technicolor Digital Cinema is providing the satellite services and digital downlink to each theater, and Real D 3D Inc. will power the display in the theaters.
This isn't the first time the NFL has participated in a 3-D experiment. In 2004, a predecessor company to 3ality filmed the Super Bowl between the New England Patriots and the Carolina Panthers. When Sandy Climan, 3ality's chief executive officer, shows the footage, "people crouch down to catch the ball," he says. "It's as if the ball is coming into your arms."
Technology has advanced considerably since then, and now makes live transmission possible. Boxing in 3-D, Mr. Climan says, particularly "raises your blood pressure."
Real D, which has rolled out 3-D systems in 1,500 theaters around the world, has long advocated the transmission of live events to theaters in 3-D. "We look forward to giving fans of live events the opportunity to feel like they're in the front row," says Michael Lewis, Real D's CEO.
Some live events, including opera broadcasts and circus performances, already pop up on screens at theaters across the country.
Next week's demonstration will also include television displays, to show what might one day be available in homes. While 3-D television sets are already available in stores, mainly for the handful of DVDs available in 3-D, the industry is still working on technical standards for 3-D.
That process raises the possibility that 3-D TV sets purchased today might not be compatible with programs aired in a few years' time. Just as in theaters, home viewers must wear special 3-D glasses.
Write to Sarah McBride at sarah.mcbride@wsj.com
Sunday, November 23, 2008
Architects Run for Shelter
Fear usual strategies won't work as crisis goes global; financing scarce
Architect Bradford Perkins has endured three recessions in his 39-year career, so when business started slowing earlier this year, he acted quickly to bolster revenues. The chairman of Perkins Eastman, the city's largest architectural firm, opened two more international offices and hired two renowned architects to help win more commissions.
While the firm was searching for more business overseas, activity was tanking at home. Twenty projects—roughly 10% of the firm's total in New York—were suspended or canceled in the past five months. That forced Perkins Eastman to lay off about 40 workers, or 10% of the staff—an action unprecedented in the company's 24-year history.
“We always knew the business ran in cycles,” says Mr. Perkins. “But what surprised me is how the effects of this downturn came on so fast.”
As both office and residential development in the city grinds to a halt, architectural firms are scrambling to find more work. They are lowering their fees, chasing smaller projects, seeking more international assignments and bidding on more institutional contracts to generate revenues—all tried-and-true methods employed during past economic slowdowns.
But architects fear their traditional coping strategies will fall short as the economy craters. For example, they note that as the recession spreads globally, work is evaporating in former construction hot spots like Dubai and China.
Architects also worry that clients that have long provided lifelines, such as municipalities, universities and hospitals, will retreat as donations and taxes shrivel. In October, the Architecture Billings Index plummeted to 36.2, its lowest level since the survey began in 1995.
Any score below 50 indicates a decline in billings. The index, calculated by the American Institute of Architects, is considered a leading indicator of construction activity. One area that architects traditionally count on to carry them through recessions—government construction—is in danger of being curtailed because municipalities are having difficulties getting bonds approved to finance projects, the institute says.
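As for the index itself, the ABI is a diffusion index, so 50 is the dividing line between growth and contraction by construction. Below is a minimal sketch of how such an index is typically computed from survey responses; the exact AIA weighting is an assumption here, chosen only to illustrate why a reading of 36.2 signals widespread declines.

```python
def diffusion_index(increased, unchanged, decreased):
    """Diffusion index from survey counts: firms reporting higher billings
    count fully, firms reporting no change count half, so 50 means billings
    are flat overall and anything below 50 means more firms are declining."""
    total = increased + unchanged + decreased
    return 100.0 * (increased + 0.5 * unchanged) / total

# Example: with far more firms reporting declines than gains, the index
# falls well below 50 -- in the ballpark of the October 2008 reading of 36.2.
print(diffusion_index(increased=20, unchanged=32, decreased=48))  # 36.0
```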
“This is unprecedented,” says Kenneth Drucker, senior principal at architecture firm HOK New York. “Usually when one business dries up, another takes its place.” New York City's gloomy financial outlook unnerves architect Paul Eagle because his firm was tapped to design the new police academy in Queens.
“I check my e-mail every day to make sure it is still on,” says Mr. Eagle, the principal of Perkins+Will's New York office. “We are moving ahead, but you hear the news every day and you get nervous.”
Architects note that many factors that influence their business, such as clients' ability to get financing, are beyond their control. So they concentrate on other aspects during tough times.
Mr. Perkins' response to tough times has been to rev up marketing, but he says that even with his increased sales efforts, the company's revenues could fall 10% next year. The decrease will be larger if the economy further curdles, he says.
“What I learned over the years is that you shoot your way out of the recession,” he says. “You've got to put a lot more emphasis on selling.”
Mr. Perkins' new marketing tools include offices in Ecuador and India. He's also hoping clients will hire his two recent additions: Steve Rosenstein, who specializes in designing science and research facilities, and Thomas Fridstein, who is known for his international expertise. Stanton Eckstut, principal of Ehrenkrantz Eckstut & Kuhn Architects, is taking to the road to drum up more business. Last week, he flew to Los Angeles to discuss with local colleagues how they can capitalize on the city's plans to build more schools.
Recently, the firm formed a joint venture with two Washington, D.C.-based engineering firms and just won a bid to build a school there. “We are out there. We are canvassing,” says Mr. Eckstut.
Despite his strenuous marketing efforts, the firm laid off 10 people—about 10% of the New York staff—in the past six weeks because work is slow. Robin Klehr Avia, a managing partner of Gensler, is in a similar situation.
“The problem is 20 firms respond to RFPs,” she says. Like other firms, Gensler has cut its fees but has still lost business. Ms. Klehr Avia says that in the last six weeks, 10 projects have either been scaled back, canceled or suspended.
Thursday, November 20, 2008
Legendary Composer Dies
By Dennis McLellan 7:01 PM PST, November 19, 2008
Irving Gertz, a film and television composer who contributed music to 1950s science-fiction films such as "It Came From Outer Space" and "The Incredible Shrinking Man" and to 1960s TV series such as "Voyage to the Bottom of the Sea," has died. He was 93.
Gertz died Friday at his home in West Los Angeles, said David Schecter, a record producer and film-music historian who was a close friend. No specific cause of death was given.
From the late 1940s to the late '60s, Gertz wrote music for about 200 movies and television episodes. Among his film credits are "Abbott and Costello Meet the Mummy," "Francis Joins the WACS," "The Alligator People," "The Monolith Monsters," "The Creature Walks Among Us," "Overland Pacific," "To Hell and Back," "The Thing That Couldn't Die" and "Flaming Star."
Among his TV credits were "Daniel Boone," "The Invaders," "Land of the Giants," "Peyton Place" and "Voyage to the Bottom of the Sea."
Wednesday, November 19, 2008
Triangulating Your Location
U.S. government able to track mobile phones without involving operators
November 19, 2008 — 2:44pm ET By Paul Mah
New documents made available under a Freedom of Information Act request have brought additional information about the use of triggerfish technology to light. Triggerfish devices are also known as cell-site simulators or digital analyzers. By posing as a cell tower, a triggerfish can trick nearby phones into transmitting their serial numbers and other data, which can then be used to triangulate the phones' locations. While earlier understanding of this technology assumed the cooperation of mobile phone operators, one of the uncovered documents explicitly notes that it can be deployed without having to "involve the cell phone provider."
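The triangulation step itself is straightforward geometry: given the handset's estimated distance from several receivers at known positions (derived, say, from signal timing or strength), the intersection of those range circles pins down its location. Below is a minimal 2D trilateration sketch; the receiver positions and ranges are invented illustration values, not anything taken from the FOIA documents.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2D position from three known points and measured ranges.

    Subtracting the first range equation from the others linearizes the
    problem into a small least-squares system A x = b.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2,
                  r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2])
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical receiver positions (metres) and ranges to a phone at (400, 300).
anchors = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
phone = np.array([400.0, 300.0])
dists = [np.linalg.norm(phone - np.array(a)) for a in anchors]
print(trilaterate(anchors, dists))  # ~[400. 300.]
```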
Windows 7.0 Won't Have USB 3.0?
USB 3.0 finalized, at last
November 18, 2008 — 6:10am ET By Paul Mah
The USB Promoter Group finalized the USB 3.0 specification on Monday this week--almost eight years after the launch of USB 2.0 at WinHEC.
Also known as "SuperSpeed USB," discrete controllers based on this standard are expected in the second half of next year, with consumer products poised to follow in 2010. About 10 times faster than USB 2.0, transferring a 25GB HD movie file will take just 70 seconds compared to 13.9 minutes using USB 2.0's 480Mbps data transfer rate.
On the other hand, the delay in ratifying the USB 3.0 standard means that Microsoft will not have support for USB 3.0 in Windows 7 at RTM, according to Lars Giusti of Microsoft. At this moment, Microsoft is still deciding whether it will even incorporate USB 3.0 support into Windows Vista.
We talked a bit more about USB 3.0 a few months back; you can check out the in-depth examination of USB 3.0 here.
Monday, November 17, 2008
Los Alamos Computer World's Fastest
A supercomputer at Los Alamos National Laboratory remained the world's fastest, narrowly edging out another massive machine at Oak Ridge National Laboratory, according to a twice-yearly ranking of the 500 largest scientific systems.
International Business Machines Corp.'s 188 systems accounted for the most computing power on the so-called Top500 list, and it supplied machines rated first, fourth and fifth. Hewlett-Packard Co. moved past IBM in terms of total systems on the list, with 209 machines. Cray Inc. supplied the No. 2 system and three others in the top 10.
Intel Corp. chips were used in 379 of the top 500 systems. Rival Advanced Micro Devices Inc. supplied chips in 59 machines, including seven of the 10 fastest.
The No. 1 Roadrunner machine at Los Alamos uses both AMD and IBM microprocessors, while the Oak Ridge Jaguar system is powered only by AMD's Opteron chip.
The Top500 list is compiled by researchers at the University of Mannheim, Germany, along with the University of Tennessee in Knoxville, and the Department of Energy's National Energy Research Scientific Computing Center in Berkeley, Calif.
Thursday, November 13, 2008
LG and Sharp Play Naughty- Get Caught
LG, Sharp, Chunghwa admit to LCD price fixing
By Dawn Kawamoto, CNET News.com
Thursday, November 13, 2008 10:48 AM
LG Display, Sharp, and Chunghwa Picture Tubes agreed to plead guilty to criminal charges for participating in a liquid crystal display price-fixing conspiracy and pay US$585 million in fines, the U.S. Department of Justice announced Wednesday.
The three companies worked in concert to set prices on thin-film transistor LCDs, which are used in computer monitors, notebooks, televisions, mobile phones, and various electronics, according to the antitrust unit of the Justice Department.
Apple, Dell, and Motorola were among the companies affected by the price fixing, antitrust regulators said.
"The price-fixing conspiracies affected millions of American consumers who use computers, cell phones, and numerous other household electronics every day," Thomas Barnett, assistant attorney general for the Justice Department's antitrust division, said in a statement.
The three companies, which were charged with violating the Sherman Antitrust Act, allegedly held "crystal" meetings and engaged in communications about setting prices on the TFT-LCD displays. They agreed to charge predetermined prices for the displays, issued price quotes based on those agreements, and exchanged sales information on the display panels, in order to monitor and enforce the agreement, the Justice Department said.
LG Display agreed to pay a US$400 million fine, marking the second-highest antitrust fine ever imposed. The company pleaded guilty to setting prices with other unnamed suppliers for the TFT-LCD panels worldwide from September 2001 to June 2006, when the company operated under the name L.G. Philips LCD, a joint venture between LG Electronics and Philips Electronics. LG Display America was known as L.G. Philips LCD America.
Sharp, meanwhile, agreed to pay a US$120 million fine and participated in the conspiracy between April 2001 and December 2006 with other unnamed suppliers. The conspiracy involved setting prices in three separate agreements for TFT-LCD panels sold to Dell, which used them in computer monitors and laptops.
And during the period ranging from the fall of 2005 to mid-2006, similar price-fixing schemes were employed in sales to Motorola, which used the panels in its popular Razr mobile phones.
Sharp's conspiracy also touched Apple from September 2005 to December 2006, in which Apple used the displays for its popular iPod music players.
Chunghwa agreed to pay a US$65 million fine, for its participation in the price-fixing scheme from September 2001 through December 2006.
The Justice Department began its investigation in 2006 and notes that it is still ongoing.
"Dell is aware of the announcement and will review its impact, but we have no comment at this time and probably will not in the near term as it's an ongoing investigation," a Dell representative said Wednesday, in an e-mail response.
Sony, a major LCD panel producer, also declined to comment.
For the LCD industry, problems began in the late 1990s when a surge in demand for notebooks and handheld devices drove up the need for LCD glass. As a result, the TFT-LCD makers built glass plants in Korea and Taiwan during 1998 through 1999.
But as those factories came online and began to pump out LCD glass, a glut took hold. By the fall of 2000, prices on 15-inch flat panels had plummeted to the point that, in some cases, manufacturers were selling their panels at US$5 to US$10 below cost.
From October 2000 through August 2001, LCD makers were feeling the pain of an oversupply of panels. But after August 2001, prices began to rise.
And apparently, it was no coincidence. Five months earlier, Sharp had begun fixing prices on TFT-LCD panels sold to PC giant Dell, and in September 2001, LG and Chunghwa began to engage in price fixing as well.
Analysts, at the time, predicted LCD shortages, especially in the 15-inch panel, would continue through 2002.
IDC analyst Bob O'Donnell noted at the time that while PCs tend to only go down in price over time, flat panel prices have occasionally risen. Said O'Donnell at the time: "LCD is one of the few [markets] where things have actually gone up in price."
Although Sharp admits to engaging in price fixing on Apple's iPod screens in the 2005 to 2006 period, it remains unclear whether other vendors may have engaged in similar behavior with Apple back in 2002.
That is when Apple was hit with a component shortage of 15-inch LCD panels for its newly introduced all-in-one flat-panel iMac. As a result, Apple suffered a shortage of iMacs just after introducing and touting the sleek new design.
Home Theater Retailer Woes Continue
Sound Advice Home Theater Retailer Latest Chain to Close Doors
Written by AVRev.com
Wednesday, 12 November 2008
This economy is taking its toll on home theater specialty retailers. First, Tower Records went out of business. Then Tweeter filed for bankruptcy, and its eventual new owners pulled the plug. Circuit City seems to be in trouble, recently announcing the closing of several of its stores.
Now, Sound Advice, a Florida-based home theater retailer, has called it quits. They will be closing all 22 of their stores by year's end. Their liquidation sale started last Wednesday and will continue until after the holidays. They are currently offering 10 percent off TVs, 20 percent off speakers and 40 percent off cables and accessories.
Sound Advice employs about 50 workers, about 30 fewer than a year ago. The stores have stopped accepting checks and no longer sell gift cards, warranties and the like.
Sound Advice was acquired for $61 million in 2001 by Tweeter, so it was probably not a stretch of the imagination that Sound Advice would eventually close.
Monday, November 10, 2008
First HD Wireless Video Extension over 802.11n
$2,300 for a transmit/receive pair
Avocent Announces the Digital Signage Industry’s First IEEE802.11n Based HD Video Extension and Distribution System
New Emerge® MPX1550 multipoint extenders raise the bar for wireless video distribution products
HUNTSVILLE, Ala., November 10, 2008 – Avocent Corporation (NASDAQ: AVCT), the recognized leader in managed audio video wireless extension networking products for professional audio video applications, today announced the MPX1550 wireless extender.
This breakthrough product combines Avocent’s field-proven MPX1500 wireless video distribution technology with the latest in WiFi advancements, IEEE802.11n MIMO-based radios, thereby raising the bar for wireless HD video distribution systems in terms of visual acuity, transmission distance, and noise immunity. With the MPX1550 system, deployment of a single stream of media to many displays can be accomplished in minutes, even under the most challenging of conditions, with a degree of reliability and quality that rivals a dedicated source device at each and every display.
Unlike multiple source devices, however, video and audio remain in lockstep across all displays, failure-prone moving parts are kept to a minimum, and software licensing costs are greatly reduced. Equally important, all devices in the media network are remotely manageable using the MPX1550 extender’s onboard Web interface, which is accessible via a dedicated control LAN interface.
“As the number of public displays continues to mushroom, the ability to deploy these displays in a rapid yet cost-effective manner has become a key differentiator for signage network providers,” said Mitch Friend, senior vice president and general manager of Avocent. “It has also become clear to us that content requirements differ among various deployments such as retail, institutional and way-finding signage applications. Characteristics such as visual acuity, resolution of motion and still images tend to vary. To handle the most demanding applications, we’ve added the MPX1550 system to our product offerings. By adding the 802.11n support to our product line, customers are now able to choose the extender that is best suited for their application.”
Emerge MPX1550 extenders are widely deployed for a variety of professional video applications, such as digital signage – providing panels with live content, entertainment and advertising in retail outlets, theaters, restaurants, airports, gas stations, and other venues. The Emerge MPX1550 extenders offer both wired (over IP) and wireless operation, including distribution of both HD and SD video signals from a single transmitter to a cluster of receivers. The MPX1550 system is uniquely optimized for both full motion video and still images.
The extenders support digital and analog video signals, providing support for a wide range of DVI, HDMI, VGA and component source and display devices. MPX1550 extenders also perform analog to digital conversion as needed to match dissimilar source and display devices. For additional control over source and display devices, the MPX1550 extenders also forward serial and IR signals. Support for HDCP ensures that even protected content can be distributed through signage networks in a secure and compliant manner.
The Emerge MPX1550T wireless transmitter and Emerge MPX1550R wireless receiver are available this month at an MSRP of $1,145 each.
Thursday, November 6, 2008
Wednesday, November 5, 2008
Google vs. Dolly Parton
The white space battle makes the NY Times....
http://www.nytimes.com/2008/11/04/technology/internet/04wireless.html?_r=2&scp=2&sq=dolly%20parton&st=cse&oref=slogin&oref=slogin
Game Over- FCC Goes White Space Google
FCC votes to turn empty TV spectrum into wireless Internet access
The plan allows high-tech firms such as Google and Microsoft to develop a new generation of devices that will use the 'white spaces' between channels to go online.
By Jim Puzzanghera November 5, 2008
Reporting from Washington -- Federal regulators on Tuesday approved the largest ever expansion of wireless Internet access, unanimously backing a controversial plan to allow a new generation of devices to use the empty airwaves between television channels to go online. Dubbed "Wi-Fi on steroids" by its supporters in the high-tech industry, the plan promises to offer wireless Internet service across America -- most likely for free -- and spur new systems for transmitting video and other data between devices in homes.
It overcame staunch opposition from the entertainment industry, which is worried that the Web-surfing devices will interfere with TV broadcasts and wireless microphones. Although expected to be slower and possibly less secure than commercial broadband services from cable and phone companies, the new Internet connections will ride on the highest-quality broadcast airwaves, which are able to carry signals long distances and easily penetrate trees and walls. For decades, those government-owned airwaves have been reserved for TV stations.
But the Federal Communications Commission, in a 5-0 vote intended to increase the reach of high-speed Internet access, approved a plan advocated by public interest groups and technology companies, including Google Inc. and Microsoft Corp., to allow the use of the spectrum by new laptops, mobile phones and other gadgets with built-in equipment that are expected to hit the market in about two years.
"Consumers across the country will have access to devices and services they may have only dreamed about before," FCC Chairman Kevin J. Martin said.
The high-tech firms say the so-called white spaces of the airwaves that lie between the broadcast TV channels have the potential to provide revolutionary new wireless services that people could use for free -- unlike the spectrum leased by the government to cellphone companies, which then charge customers to access it.
Google Chief Executive Eric Schmidt and Microsoft co-founder Bill Gates personally lobbied FCC commissioners to open up access to the vacant channels, which range from about a third of the TV airwaves in major cities such as Los Angeles to three-quarters of the airwaves in rural areas.
These companies will have to build the infrastructure to connect the airwaves to the Internet, such as installing transmitters on existing cellular towers. Although they could charge users for those connections -- in the same way that some coffee shops charge for access to their Wi-Fi hot spots -- Google and others are expected to offer them for free, recouping the cost through sales of white-space-enabled devices and online advertising.
"This is a clear victory for Internet users and anyone who wants good wireless communications," Google co-founder Larry Page said.
Broadcasters fiercely fought it, warning that the new devices could cause some viewers to lose their TV signals because of interference. The issue is of particular concern because broadcasters must switch to all-digital signals in February. With traditional analog TV stations, interference causes static or fuzziness. But broadcasters say digital pictures can freeze or be lost entirely if another signal is broadcast on or near the same channel.
"The commission chose a path that imperils America's television reception in order to satisfy the 'free' spectrum demands of Google and Microsoft," said David Donovan, president of the Assn. for Maximum Service Television, an engineering trade group of TV broadcasters.
Representatives of sports leagues, musicians and large churches have also complained about potential interference from the new Internet devices and lobbied against the changes. They worry, for example, that one of these devices in a concert-goer's pocket would interfere with the performer's wireless microphone.
The FCC's field tests of early prototypes provided by Microsoft and other companies produced mixed results, with some of the devices failing to sense and avoid broadcast signals. Broadcasters said those results showed that the technology wasn't ready.
But FCC officials said the tests showed that it was possible for devices to use the airwaves without interference.
The devices will operate at low power and will only be able to use channels 21 to 51, where there are fewer TV stations. The FCC will give preference to devices that use technology to determine a user's location and then avoid TV channels operating there based on a special database, rather than devices that try to sense and avoid TV signals. Devices that use sensing technology will have to go through more rigorous field testing before being certified.
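In other words, the preferred mode of operation is a lookup: the device reports its coordinates, and a database answers with the channels free of protected broadcasters at that spot. Here is a minimal sketch of that geolocation-database logic; the station registry and protection radius below are invented for illustration, and the real FCC database and query protocol were defined later.

```python
import math

# Hypothetical registry of licensed TV transmitters: channel, location, protected radius (km).
TV_STATIONS = [
    {"channel": 28, "lat": 34.0522, "lon": -118.2437, "radius_km": 80.0},
    {"channel": 34, "lat": 34.0522, "lon": -118.2437, "radius_km": 80.0},
    {"channel": 45, "lat": 40.7128, "lon": -74.0060,  "radius_km": 80.0},
]

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def available_channels(lat, lon):
    """Return the permitted channels (21-51) not protected at this location."""
    blocked = {s["channel"] for s in TV_STATIONS
               if km_between(lat, lon, s["lat"], s["lon"]) < s["radius_km"]}
    return [ch for ch in range(21, 52) if ch not in blocked]

# A device in downtown Los Angeles would be told to avoid channels 28 and 34.
print(available_channels(34.05, -118.25))
```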
The FCC also will create a safe zone around large sporting and performance venues, such as the Los Angeles Coliseum and New York's Broadway theater district. The new mobile devices in those areas won't have access to channels used by wireless microphones.
Puzzanghera is a Times staff writer. jim.puzzanghera@latimes.com
Tuesday, November 4, 2008
Preview of Live Sound Column
Another Green World
Written by John Mayberry
Live Sound International
Who can argue with the current trend of smaller carbon footprints, green packaging, and energy efficiency?
Me.
What a colossal waste of time, effort, and money. “Going Green” is the last thing we should be doing right now.
Something bizarre is going on that we’re culturally loath to admit- a trend driven by guilt, despair, depression, and paranoia. It’s a lemming-like rush to the acceptance of mediocrity.
Mediocrity as a virtue?
It’s almost as if the Borg were attacking and we’ve decided to capitulate rather than fight. I don’t believe for a moment that resistance is futile against this mistakenly woeful Green Revolution.
Yet right now we’re stuck in the middle of a conundrum: an unhappy intersection of marketing prowess, Me Generation greed, economic leverage from overseas competitors, and an unhelpful dose of our own stupidity. We must compete our way out of it, not dig our holes even deeper.
I saw my first green audio product the other day. Yeech. Instead of touting its technical merits, the press release rambled on about how the packaging was 43% smaller and could be more easily recycled. A boring little spit of a product was in the box, but who cares if its carbon footprint is smaller? Are we now playing to the crowd or to the customer?
And why in the world should Americans cut down on packaging? Trash is America’s single largest export! It’s bigger than corn or coal right now- look it up.
In the old days it was the biggest house, fastest car, loudest sound system, or the most sparkling jewel. These days it’s degenerated into something far more sinister. Remember the two buses on the Sex Pistols disc going to Nowhere and Boredom? That’s where the Green crowd wants to take us- and on mass transportation no less.
With apologies to the Renaissance Faire crowd, your world is boring and is not the solution to this malaise. As fascinating as it is to see unshaven Luddites prance about in the dirt with pointed shoes whilst strumming a lyre, let’s just say it’s a fork in the road I’m glad you went down and not me. Yet you have the microphone right now, as they say. And I mean to take it away from you.
The First Lord of Green Boredom in my book is Al Gore, a man who practices something entirely different than what he preaches. His message of apocalyptic environmentalism may have delivered him a Nobel Prize, but if I may I’d like to whisper an opposing message into your ear as well, “Our industry does not sell boring very well. Never has, and never will. We sell excitement, movement, and energy. We sell new and different- spectacular events if it all comes together properly. I doubt the Rolled Stones would have quite the attraction of the Rolling Stones, for instance….”
Do Toyota hybrids sell well in Abu Dhabi? Hell no. Ferraris do. I suppose that’s why they’re building Ferrariland there instead of here. Can you imagine the conniption fits our Greenies would convulse in had anyone even proposed putting a Ferrariland in Southern California? The best we can hope for is a Prius-based ride in Legoland.
The Toyota Prius is sold in over 40 countries, yet over sixty percent of their sales have been in the U.S. Ever wonder why? Is it high fuel costs here compared to Europe? Stylish design? High performance (30 HP less than Toyota’s own Yaris)? Low maintenance or insurance costs?
No, it’s because we’re supposed to feel better driving a boring car that’s acceptable to the Green crowd, even if we’re not quite sure where those 600,000 nickel metal hydride batteries are going to get dumped. Perhaps they should try Yucca Mountain- it’s not being used for anything right now anyway. Suffice to say, the same crowd that predicts Armageddon from cow flatulence gives the battery disposal issue a complete pass.
They even gave the first 85,000 hybrid owners access to the car pool lane in California without any passengers. How that helped unclog the freeways I’m not quite sure. It seems the OPEC boys found a much more efficient technique in my opinion.
The only way out of this mess is with bigger and better technology, not going backwards and accepting lower performance as we pine for the good old days. Our best and brightest engineers should be racing ahead to build the latest and greatest, not wasting their time bragging about the various merits of cardboard packaging and telling us to make do with less performance. They should be building nuclear and fusion reactors, high capacity energy storage, more efficient transportation, and better communication systems.
We got it. Digital consoles, integrated wild tracks, DSP processing, switched power supplies, and line arrays have been very good to this industry. So what’s next, and when do we get it? I want our manufacturers to rigorously go through the entire system concept from start to finish and make the whole thing much better! That is what we need to prosper in the long term. The dreamers, technologists, engineers, and builders that will shape our new reality must be given their chance.
Try an experiment for me. Go up to your head salesperson and tell them you want them to sell only systems that don’t perform well but are made from eco-friendly, low-carbon-footprint materials. The audience won’t hear anything, but who really cares anyway? I have a pretty good idea where your salesperson’s footprint will be planted on you after that request. It does sound kind of stupid when it hits home, doesn’t it? It’s no different for any other industry.
Going Green may be a very quick way to make your accounting go red and your future black. Paraphrasing Churchill, this may not be the beginning of the end of the Green Revolution. But perhaps it is the end of the beginning. I hope we’re at a point where the platitudes and positioning end and meaningful innovation begins.
Thank goodness for that. Now flip this chlorinated and Kraft pulped (a fascinating industrial process dependent on gas turbine engines, by the way) page on to the next article.
Written by John Mayberry
Live Sound International
Who can argue with the current trend of smaller carbon footprints, green packaging, and energy efficiency?
Me.
What a colossal waste of time, effort, and money. “Going Green” is the last thing we should be doing right now.
Something bizarre is going on that we’re culturally loath to admit- a trend driven by guilt, despair, depression, and paranoia. It’s a lemming-like rush to the acceptance of mediocrity.
Mediocrity as a virtue?
It’s almost as if the Borg were attacking and we’ve decided to capitulate rather than fight. I don’t believe for a moment that resistance is futile against this woefully misguided Green Revolution.
Yet right now we’re stuck in the middle of a conundrum: an unhappy intersection of marketing prowess, Me Generation greed, economic leverage from overseas competitors, and an unhelpful dose of our own stupidity. We must compete our way out of it, not dig our hole even deeper.
I saw my first green audio product the other day. Yeech. Instead of touting its technical merits, the press release rambled on about how the packaging was 43% smaller and could be recycled more easily. A boring little spit of a product was in the box, but who cares if its carbon footprint is smaller? Are we now playing to the crowd or to the customer?
And why in the world should Americans cut down on packaging? Trash is America’s single largest export! It’s bigger than corn or coal right now- look it up.
In the old days it was the biggest house, fastest car, loudest sound system, or the most sparkling jewel. These days it’s degenerated into something far more sinister. Remember the two buses on the Sex Pistols disc going to Nowhere and Boredom? That’s where the Green crowd wants to take us- and on mass transportation no less.
With apologies to the Renaissance Faire crowd, your world is boring and is not the solution to this malaise. As fascinating as it is to see unshaven Luddites prance about in the dirt with pointed shoes whilst strumming a lyre, let’s just say it’s a fork in the road I’m glad you went down and not me. Yet you have the microphone right now, as they say. And I mean to take it away from you.
The First Lord of Green Boredom in my book is Al Gore, a man who practices something entirely different from what he preaches. His message of apocalyptic environmentalism may have delivered him a Nobel Prize, but, if I may, I’d like to whisper an opposing message into your ear as well: “Our industry does not sell boring very well. Never has, and never will. We sell excitement, movement, and energy. We sell new and different- spectacular events if it all comes together properly. I doubt the Rolled Stones would have quite the attraction of the Rolling Stones, for instance….”
Do Toyota hybrids sell well in Abu Dhabi? Hell no. Ferraris do. I suppose that’s why they’re building Ferrariland there instead of here. Can you imagine the conniption fits our Greenies would throw if anyone even proposed putting a Ferrariland in Southern California? The best we can hope for is a Prius-based ride in Legoland.
The Toyota Prius is sold in over 40 countries, yet over sixty percent of its sales have been in the U.S. Ever wonder why? Is it high fuel costs here compared to Europe? Stylish design? High performance (30 HP less than Toyota’s own Yaris)? Low maintenance or insurance costs?
No, it’s because we’re supposed to feel better driving a boring car that’s acceptable to the Green crowd, even if we’re not quite sure where those 600,000 nickel metal hydride batteries are going to get dumped. Perhaps they should try Yucca Mountain- it’s not being used for anything right now anyway. Suffice it to say, the same crowd that predicts Armageddon from cow flatulence gives the battery disposal issue a complete pass.
They even gave the first 85,000 hybrid owners access to the carpool lane in California without any passengers. How that helped unclog the freeways I’m not quite sure. The OPEC boys, it seems, found a much more efficient technique.
The only way out of this mess is with bigger and better technology, not going backwards and accepting lower performance as we pine for the good old days. Our best and brightest engineers should be racing ahead to build the latest and greatest, not wasting their time bragging about the various merits of cardboard packaging and telling us to make do with less performance. They should be building nuclear and fusion reactors, high capacity energy storage, more efficient transportation, and better communication systems.
We got it. Digital consoles, integrated wild tracks, DSP processing, switched power supplies, and line arrays have been very good to this industry. So what’s next, and when do we get it? I want our manufacturers to rigorously go through the entire system concept from start to finish and make the whole thing much better! That is what we need to prosper in the long term. The dreamers, technologists, engineers, and builders that will shape our new reality must be given their chance.
Try an experiment for me. Go up to your head salesperson and tell them you want them to sell only systems that don’t perform well but are made from eco-friendly, low-carbon-footprint materials. The audience won’t hear anything, but who really cares anyway? I have a pretty good idea where your salesperson’s footprint will be planted on you after that request. It does sound kind of stupid when it hits home, doesn’t it? It’s no different for any other industry.
Going Green may be a very quick way to make your accounting go red and your future black. Paraphrasing Churchill, this may not be the beginning of the end of the Green Revolution. But perhaps it is the end of the beginning. I hope we’re at a point where the platitudes and positioning end and meaningful innovation begins.
Thank goodness for that. Now flip this chlorinated and kraft-pulped (a fascinating industrial process dependent on gas turbine engines, by the way) page to the next article.
The Top Five Reasons Why Windows Vista Failed
October 6th, 2008
Posted by Jason Hiner @ 4:21 am
On Friday, Microsoft gave computer makers a six-month extension for offering Windows XP on newly shipped PCs. While this doesn’t impact enterprise IT — because volume licensing agreements will allow IT to keep installing Windows XP for many years to come — the move is another symbolic nail in Vista’s coffin.
The public reputation of Windows Vista is in shambles, as Microsoft itself tacitly acknowledged in its Mojave ad campaign.
IT departments are largely ignoring Vista. In June (18 months after Vista’s launch), Forrester Research reported that just 8.8% of enterprise PCs worldwide were running Vista. Meanwhile, Microsoft appears to have put Windows 7 on an accelerated schedule that could see it released in 2010. That will provide IT departments with all the justification they need to simply skip Vista and wait to eventually standardize on Windows 7 as the next OS for business.
So how did Vista get left holding the bag? Let’s look at the five most important reasons why Vista failed.
5. Apple successfully demonized Vista
Apple’s clever “I’m a Mac” ads have successfully driven home the perception that Windows Vista is buggy, boring, and difficult to use. After taking two years of merciless pummeling from Apple, Microsoft recently responded with its “I’m a PC” campaign in order to defend the honor of Windows. This will likely restore some mojo to the PC and Windows brands overall, but it’s too late to rescue Vista from its reputation as a dud.
4. Windows XP is too entrenched
In 2001, when Windows XP was released, there were about 600 million computers in use worldwide. Over 80% of them were running Windows but it was split between two code bases: Windows 95/98 (65%) and Windows NT/2000 (26%), according to IDC. One of the big goals of Windows XP was to unite the Windows 9x and Windows NT code bases, and it eventually accomplished that.
In 2008, there are now over 1.1 billion PCs in use worldwide and over 70% of them are running Windows XP. That means almost 800 million computers are running XP, which makes it the most widely installed operating system of all time. That’s a lot of inertia to overcome, especially for IT departments that have consolidated their deployments and applications around Windows XP.
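As a quick back-of-the-envelope check on those installed-base figures (a minimal sketch using only the rounded numbers quoted above, not independent data):

```python
# Sanity check on the installed-base arithmetic quoted above.
# Both inputs are the article's rounded figures, not independent data.
pcs_in_use_2008 = 1.1e9   # "over 1.1 billion PCs in use worldwide"
xp_share = 0.70           # "over 70% of them are running Windows XP"

xp_installs = pcs_in_use_2008 * xp_share
print(f"Approximate XP installed base: {xp_installs / 1e6:.0f} million")
# -> roughly 770 million; with a share a few points above 70%,
#    that lands near the "almost 800 million" cited in the article.
```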
And, believe it or not, Windows XP could actually increase its market share over the next couple years. How? Low-cost netbooks and nettops are going to be flooding the market. While these inexpensive machines are powerful enough to provide a solid Internet experience for most users, they don’t have enough resources to run Windows Vista, so they all run either Windows XP or Linux. Intel expects this market to explode in the years ahead. (For more on netbooks and nettops, see this fact sheet and this presentation — both are PDFs from Intel.)
3. Vista is too slow
For years Microsoft has been criticized by developers and IT professionals for “software bloat” — adding so many changes and features to its programs that the code gets huge and unwieldy. However, this never seemed to have enough of an effect to impact software sales. With Windows Vista, software bloat appears to have finally caught up with Microsoft.
Vista has over 50 million lines of code. XP had 35 million when it was released, and since then it has grown to about 40 million. This software bloat has had the effect of slowing down Windows Vista, especially when it’s running on anything but the latest and fastest hardware. Even then, the latest version of Windows XP soundly outperforms the latest version of Windows Vista. No one wants to use a new computer that is slower than their old one.
2. There wasn’t supposed to be a Vista
It’s easy to forget that when Microsoft launched Windows XP it was actually trying to change its OS business model to move away from shrink-wrapped software and convert customers to software subscribers. That’s why it abandoned the naming convention of Windows 95, Windows 98, and Windows 2000, and instead chose Windows XP.
The XP stood for “experience” and was part of Microsoft’s .NET Web services strategy at the time. The master plan was to get users and businesses to pay a yearly subscription fee for the Windows experience — XP would essentially be the ongoing product name but would include all software upgrades and updates, as long as you paid for your subscription. Of course, it would disable Windows on your PC if you didn’t pay. That’s why product activation was coupled with Windows XP.
Microsoft released Windows XP and Office XP simultaneously in 2001 and both included product activation and the plan to eventually migrate to subscription products. However, by the end of 2001 Microsoft had already abandoned the subscription concept with Office, and quickly returned to the shrink-wrapped business model and the old product development model with both products.
The idea of doing incremental releases and upgrades of its software — rather than a major shrink-wrapped release every 3-5 years — was a good concept. Microsoft just couldn’t make the business model work, and instead of getting it right, it took the easy route and went back to an old model poorly suited to the economic and technical realities of today’s IT world.
1. It broke too much stuff
One of the big reasons that Windows XP caught on was because it had the hardware, software, and driver compatibility of the Windows 9x line plus the stability and industrial strength of the Windows NT line. The compatibility issue was huge. Having a single, highly compatible Windows platform simplified the computing experience for users, IT departments, and software and hardware vendors.
Microsoft either forgot or disregarded that fact when it released Windows Vista: despite a long beta period, a lot of existing software and hardware were not compatible with Vista when it shipped in January 2007. With many important programs and peripherals unusable in Vista, adoption was impossible for a lot of IT departments. Many of the incompatibilities were the result of tighter security.
After Windows was targeted by a nasty string of viruses, worms, and malware in the early 2000s, Microsoft embarked on the Trustworthy Computing initiative to make its products more secure. One of the results was Windows XP Service Pack 2 (SP2), which won over IT and paved the way for XP to become the world’s most widely deployed OS.
The other big piece of Trustworthy Computing was the even-further-locked-down version of Windows that Microsoft released in Vista. This was definitely the most secure OS that Microsoft had ever released, but the price was user-hostile features such as UAC (User Account Control), which wrapped many basic tasks in security prompts, along with a host of software incompatibility issues. In other words, Vista broke a lot of the things that users were used to doing in XP.
Bottom line
There are some who argue that Vista is actually more widely adopted than XP was at this stage after its release, and that it’s highly likely that Vista will eventually replace XP in the enterprise. I don’t agree. With XP, there were clear motivations to migrate: bring Windows 9x machines to a more stable and secure OS and bring Windows NT/2000 machines to an OS with much better hardware and software compatibility. And, you also had the advantage of consolidating all of those machines on a single OS in order to simplify support.
With Vista, there are simply no major incentives for IT to use it over XP. Security isn’t even that big of an issue because XP SP2 (and above) are solid and most IT departments have it locked down quite well. As I wrote in the article Prediction: Microsoft will leapfrog Vista, release Windows 7 early, and change its OS business, Microsoft needs to abandon the strategy of releasing a new OS every 3-5 years and simply stick with a single version of Windows and release updates, patches, and new features on a regular basis. Most IT departments are essentially already on a subscription model with Microsoft so the business strategy is already in place for them.
As far as the subscription model goes for small businesses and consumers, instead of disabling Windows on a user’s PC when a subscription lapses, simply stop delivering updates to that machine until it’s renewed. Microsoft could also work with OEMs to sell something like a three-year subscription to Windows with every new PC. Then users would have the choice of renewing on their own after that.
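To make that proposed policy concrete, here is a minimal sketch of the update-gating logic described above. It is purely illustrative: written in Python rather than anything Windows-specific, and none of the names below correspond to a real Microsoft API. The point is simply that the OS keeps running regardless of subscription status, while update delivery is tied to renewal.

```python
from datetime import date

# Hypothetical subscription record -- an illustration of the policy described
# above, not a real Windows or Microsoft API.
class Subscription:
    def __init__(self, expires_on: date):
        self.expires_on = expires_on

    def is_active(self, today: date) -> bool:
        return today <= self.expires_on


def can_install_updates(sub: Subscription, today: date) -> bool:
    """Updates flow only while the subscription is current."""
    return sub.is_active(today)


def os_is_enabled() -> bool:
    """Under the gentler policy, the OS itself keeps working no matter what;
    only update delivery is withheld when the subscription lapses."""
    return True


if __name__ == "__main__":
    # e.g. a three-year term bundled with a new PC, checked after it lapses
    sub = Subscription(expires_on=date(2011, 12, 31))
    today = date(2012, 6, 1)
    print("OS enabled:", os_is_enabled())                        # True: machine still boots and runs
    print("Updates allowed:", can_install_updates(sub, today))   # False: no new patches until renewal
```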
This article was originally published in the Tech Sanity Check blog (subscribe via RSS or e-mail alert).