Saturday was a pretty average day -- if a little cold -- until the chanting started. Listening through an open window in my North Beach apartment, I couldn't quite make out the words. So I headed toward Washington Square Park to see what all the commotion was about, and everything became clear.
They were chanting "Ho, ho, ho! Ho, ho, ho!" and by they, I mean about 200 people dressed in some version, and in many cases a perverse version, of Santa Claus. They were also playing dodgeball, climbing trees, pounding beers, and generally inciting chaos. The defenseless Benjamin Franklin Memorial quickly became a victim of that chaos when somebody wrote "HO" in white spray paint on the base (see above photo), then climbed the monument and santafied it.
The Santa invasion is formally called Santacon, and it started in this very city back in 1994, then quickly spread to dozens of cities including London, Tokyo, and even McMurdo Station, Antarctica. The event has been called everything from Red Menace to Santapalooza, and ranges from extremely nice, involving mostly Christmas caroling, to extremely naughty, San Francisco-style.
Here, a Santa has dropped his bag of presents as he prepares to nail another Santa with a plastic ball. Meanwhile, his Santapants have gone south. Once the Santas had their way with Washington Square Park and Benjamin Franklin, they marched to the bars, tackling each other, playing tambourines, and smoking cigarettes along the way.
A pack of elves in smart black suits also wore earpieces and tags that read "AGENT," and seemed to be running things when they weren't flirting with sexy Santas.
A more terrifying species of elf was also in attendance -- the huggy elf. The picture below was taken just seconds before a huggy elf, without warning or good cause, embraced this reporter.
There were also plenty of creative Santas in attendance, including Jewish Santas, Christmas-tree headed Santas, S&M Santas, and Mistress Clauses. But nothing really beat this Mexican Wrestler Santa.
A Sports Car It's Not: Savvy suspension tuning and sweet steering feel made the F-150 a favorite on our twisty mountain run. Midlevel Lariat's plenty plush for us.
This is it. Crunch time. The 2009 Ford F-150 and Dodge Ram are rolling onto the market like a pair of gigantic craps dice, and the companies tossing them are each betting big on this game. Unfortunately, the rules changed while these dice were in mid-air. Fuel prices skyrocketed, the economy tanked, consumer confidence evaporated, and folks who once chose half-ton pickups more for their Marlboro-Man-image-enhancing qualities than for their towing or hauling capabilities are shopping elsewhere.
Ford claims it sells more of its half-ton pickups to work and commercial customers than its competitors do, and Ford predicts this segment will grow to 45 percent of F-150 sales. Toward that end, the truck's fully boxed chassis is further fortified to provide best-in-class rigidity, payload capacity (up to 3030 pounds), and tow ratings (up to 11,300 pounds). As such, the new F-150 is well positioned to capture contractors migrating down-market out of Super-Dutys to save money and gas (did we mention that a new six-speed automatic, a lighter, more aerodynamic cab, and other tweaks boost fuel economy by 12 percent with the 5.4-liter?).
Ford claims payload and towing numbers like that simply can't be had with a coil-sprung rear axle, so it stuck with leafs but made them longer to smooth the ride and wider with new mounting hardware to improve lateral rigidity and roll control. The ride doesn't quite match Dodge's, but the chassis engineers managed to tune the steering for pleasing heft and remarkable accuracy that had many judges lauding the F-150s for feeling smaller and nimbler than their Dodge counterparts. Lateral grip of 0.70 g for both Fords bested all but the feathery base Dodge and Suzuki, and our rear-drive SXT scored the best stop at 133 feet from 60 mph (the three-ton Lariat needed 144 feet).
Status-conscious contractors will have eight F-150 models from which to choose (including the forthcoming SVT Raptor), which Detroit editor Todd Lassa reckons is "about four too many," adding, "If this Lariat is the third truck from the top, how much of a boudoir must the King Ranch and Platinum interiors be?" Judges praised the low noise levels and interior materials quality, though some found the design cartoonishly macho. Still, handy features like the Tailgate Step, Box Side Step, a stowable bed-extender, and rear seats that fold up with one hand to reveal a broad, flat load floor help tally a strong superiority score.
On the negative side of the ledger is Ford's aging all-V-8-engine lineup, which is composed of two- and three-valve 4.6-liters and a three-valve 5.4. SVT will bring a bigger 6.2 in the Raptor, and an EcoBoost V-6 is likely to join the lineup for folks who don't tow, but the diesel is on hold.
The base V-8 handily outruns and outhauls the V-6 Dodge, but sounds and feels strained doing so. Gearing that's a third shorter than the Dodge's kept our 5.4-liter 4x4 within 0.6 second of the big Dodge but cost it at the pump, where both trucks averaged just 13.2 mpg over 500 miles of mixed driving. The new six-speed automatic features excellent tow/haul-mode programming (ordering downshifts with a tap of the brakes on downhill grades, holding lower gears, etc.), but in normal mode, it's lethargic to kick down, and there's no way to manually select the higher gears.
Stepping Up: Slide-out step and pop-up handle add weight to the tailgate, but a lift-assist spring makes it manageable. It seems well worth the $350, as do the side steps ($325 for two).
Both Fords tackled our off-road sand loop with aplomb. The 4x4 transfer case engaged high- and low-range settings quickly and easily, with the message center confirming the shift was in progress. We're disappointed, however, that there's no on-pavement AWD option as offered by Dodge and General Motors.
Coming into the final discussion, the Dodge and Ford were running close in the superiority category. In this price-sensitive market, neither truck held a tie-breaking advantage in the value category. The scales also looked level weighing the significance of Ford's high sales and model-range breadth against the game-changing nature of the Dodge suspension.
In the end, we accept the prediction that work trucks will come to dominate this segment and give the nod to the more capable, broader-reaching Ford in the closest vote in Truck of the Year history -- and we sincerely hope neither company craps out when these dice come to rest.
Do you use iTunes to listen to your music? If so, do you also scrobble your music to Last.fm? You should. Doing so provides many benefits - some of which don't become obvious until you've been doing it for a few months. I have had the little Last.fm daemon running in my task bar for the past six months or so, unobtrusively keeping track of what I've been listening to and using it to build up "recommendations" for me on the Last.fm site. I have to say, after that much time, the site knows me pretty well. As I would expect, hitting the site and letting it play "my recommendations" tends to weigh heavily in the southern/classic rock and blues direction, but it also spices itself up with some Motown and some Stax soul (due in large part to a fixation - and a downloaded box set - I had with each a few weeks back). Toss in a little metal, a little rap, some prog rock, and even some eclectic Cuban music, and you get a pretty well-rounded playlist. Good stuff.
After experiencing this, it became pretty obvious to me that Last.fm can really benefit from tracking my iTunes habits - but I wondered, could it work in the opposite direction? Could iTunes benefit from my usage of Last.fm? Turns out it can!
If you're familiar with the way I access my home music here at work, you know that I do it through a combination of iTunes' built-in LAN sharing and a little piece of software called Hamachi (now owned by LogMeIn). This is unbelievably convenient, but what I miss out on are two key features of iTunes that I would love to take advantage of - play count tracking and song rating. Neither is possible when using a remote library. Well, thanks to an innovative Perl scripter (and Last.fm/iTunes fan), the first of the two problems is all but solved!
It works like this - you install a free Perl scripting interface and run the guy's script on the computer where your iTunes library lives. It goes out and connects to your Last.fm account, which has been keeping track of what you play and how often, and if the play count on Last.fm for a song is greater than what your local iTunes has in its database, it updates it! Voila! All automatic-like!
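The comparison the script performs is simple enough to sketch in a few lines. The real thing is a Perl script talking to the actual Last.fm and iTunes interfaces; the function and sample data below are made up purely to illustrate the "take whichever count is higher" logic:

```python
def sync_play_counts(itunes_counts, lastfm_counts):
    """Return an updated copy of the local iTunes play counts.

    For each track, if Last.fm has logged more plays than the local
    library knows about, adopt the higher number; otherwise leave the
    local count alone (the local library may know about plays that
    never got scrobbled).
    """
    updated = dict(itunes_counts)
    for track, remote_count in lastfm_counts.items():
        if remote_count > updated.get(track, 0):
            updated[track] = remote_count
    return updated

# Example: two tracks were scrobbled more often than iTunes recorded,
# one was played locally more than Last.fm ever saw.
local = {"Sweet Home Alabama": 12, "Green Onions": 3}
remote = {"Sweet Home Alabama": 15, "Green Onions": 2, "Soul Man": 4}
print(sync_play_counts(local, remote))
# {'Sweet Home Alabama': 15, 'Green Onions': 3, 'Soul Man': 4}
```

Notice the one-way rule: counts only ever go up, so a gap in scrobbling can never erase plays your local library already recorded.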
An obvious question is "why the hell do you care?" and to that I answer "Smart Playlists". What's that? You don't use Smart Playlists? Well, you should. I'll just leave it at that.
Now... if only I could find a good way to rate my songs remotely I'd be in heaven and my playlists would get that much better!
With its first redesign since being introduced in 2002, the 2009 BMW Z4 roadster looks as if it's finally passed through its awkward adolescence and has filled out into a curvaceous hardtop roadster.
The most obvious change to the Z4 when it debuts at the Detroit auto show in January will be its more conventional styling. The polarizing influence of Chris Bangle has been greatly toned down, although the trademark "Bangle butt" remains. Overall, the lines look similar to those of the BMW 1-series, though some cues carry over from the new 7-series as well.
The other immediately apparent difference is the absence of a cloth top. In lieu of separate convertible and coupe variants, the new Z4 features a retractable hardtop that disappears at the touch of a button within 20 seconds. Less clear from the pictures are the Z4's stretched dimensions. The new car is nearly 6 inches longer and half an inch wider than its predecessor. BMW says this growth affords passengers more comfort and space inside. The larger size, coupled with the addition of the power-operated hardtop, contributes to a weight increase of roughly 500 lbs in a fully loaded automatic model, to a claimed curb weight of 3494 lbs.
Changes inside are a bit subtler. The biggest news is the availability of iDrive for the first time in a Z4. Overall, the interior appears to have moved decidedly more upscale, especially when equipped with the optional leather-wrapped dash. BMW says it focused on increased practicality as well. Door openings are larger for easier ingress and egress, and there are new storage compartments throughout the interior. In the trunk, a mid-gate separating the roof compartment can be folded down for more luggage. An optional winter package adds a pass through to the cabin, allowing room for up to two full size golf bags with the top up, BMW says.
Under the hood, the Z4 features a pair of revised 3.0-liter inline-sixes. The sDrive30i comes with a naturally aspirated version of the engine producing 255 hp, compared with 215 hp in the current base Z4. Top-level sDrive35i roadsters will receive the 300-hp twin-turbo six used in the 335i. The sDrive30i comes with a choice of a six-speed manual or automatic, while the sDrive35i adds the option of BMW's seven-speed dual-clutch transmission. BMW estimates the dual-clutch-equipped turbocharged model will go from 0 to 60 mph in 5.0 seconds.
At the moment, BMW has no plans for an M model. Our man in Europe, Georg Kacher, indicates the reason is that BMW can't fit the M3's V-8 into the Z4's engine bay. He adds that we might see some four-cylinders added to the lineup in the near future.
Little has changed in the roadster's suspension, which retains the traditional strut front, multi-link rear configuration. Z4s equipped with the optional Sport Package get electronically adjustable dampers. Drivers will also be able to configure throttle and steering response, stability control, and the shift points on the automatic transmission.
The Z4 will launch late in the second quarter of 2009, just in time for summer. BMW has yet to release official pricing information, but expect the increased accommodations and the hardtop to add about $5,000 to $10,000, depending on the trim, meaning a base price of around $40,000 for the sDrive30i and $50,000 for the sDrive35i. Stay tuned for more information.
LONDON — Aston Martin is in the news again, from two different directions: A new image of the carmaker's upcoming Rapide sedan has surfaced on the Web, just ahead of reports from the Middle East that Kuwait's Investment Dar may be selling part of its stake in the British supercar builder.
The latest computer-generated image of the Rapide, which appears to have originated at 4WheelsNews.com, shows the car from a rear-three-quarter perspective. As evidenced in the image, the 500-horsepower Rapide will feature a fastback rear end with a panoramic glass roof and bulging rear fenders.
Aston Martin plans to unveil the production version next spring at the 2009 Geneva Auto Show, hard on the heels of the Porsche Panamera.
Meanwhile, Investment Dar over the weekend said the company and its investment partners were considering offers to sell a minority stake in Aston Martin, according to Reuters, but would remain shareholders in the automaker.
The partners also are considering an initial public offering on the London Stock Exchange in three to five years, according to Chairman David Richards, the longtime head of Prodrive who helped engineer the acquisition of Aston last year from Ford. In an interview with Arabian Business, Richards said he expects Investment Dar to divest its 50 percent stake in Aston by 2013.
Inside Line says: Given the current global economic downturn, our guess is that now is not the best time to try to sell even a modest stake in an exotic-car company. — Paul Lienert, Correspondent
ERLANGER, Kentucky — Toyota is halting work on its new Blue Springs, Mississippi, plant, leaving the start of production uncertain for its Prius hybrid.
But Toyota hastened to reassure U.S. consumers that the work stoppage will not affect Prius availability in the United States.
"We've got enough Prius capacity in Japan to meet North American demand at this time," said Mike Goss, external affairs manager for Toyota Motor Engineering and Manufacturing North America, in response to an e-mail query from Inside Line.
The construction of the plant, Toyota's eighth in the U.S., originally was seen as a major attempt to keep up with growing demand in the U.S. market. But demand for the Prius has slowed considerably. Toyota's November 2008 sales report showed that Prius sales were down 48.3 percent versus November 2007. Toyota sold 8,660 Prius hybrids in the U.S. in November 2008, compared with 16,737 in November 2007.
"Due to the steep decline of the auto market, Toyota is suspending preparations for its Mississippi plant," the company said in a statement on Monday. "Since the building itself is already about 90 percent complete, Toyota will go ahead and finish that portion of the project.
"This will bring Toyota's total investment to date to about $300 million. But we'll be holding off on subsequent activities, such as equipment installation. This likely means that start of production will be delayed. Due to the uncertainty of the market, it is impossible to say at this time when production will begin."
Toyota originally had earmarked $1.3 billion for the Mississippi plant and projected employment at around 2,000. Production was slated to begin in late 2010. Last month, the Nikkei reported that declining U.S. sales had forced Toyota to consider postponing production to 2011 or later at the Mississippi plant.
"We are committed to completing the Mississippi plant when market conditions allow," said Toyota. It added that, "for employees already on staff at Toyota Mississippi, their jobs are secure." It did not disclose how many workers had been hired thus far.
The suspension of building at Toyota's Mississippi plant comes on the heels of news from General Motors that it will put the brakes on North American production in January. General Motors on Friday said it will slash another 250,000 vehicles from its North American production forecast for the first quarter of 2009. The latest cuts involve 20 assembly plants, including three each in Canada and Mexico. The cuts also affect the automaker's Bowling Green, Kentucky, plant, which builds the Chevrolet Corvette and the Cadillac XLR, as well as GM's Lansing Grand River plant, which builds the Cadillac CTS and the Cadillac STS.
Inside Line says: Toyota's dilemma in Blue Springs presents powerful evidence that the industry's problems are not limited to Detroit-based automakers. — Anita Lienert, Correspondent
You really have to hand it to Lindsay Lohan. The "see-thru shirt with no bra" pictures were getting kind of old, so she stepped it up and did what may be the first ever "see-thru pants" pictures. You can't really see anything good, but at least these aren't simply more pictures from that video game awards show. I'm surprised that was as big as it was, because I read that this was a disappointing year for video games. I bet another bad year would have been 1942, on account of the war and everything.
TOKYO — If you were impressed by the Nissan GT-R, wait until you see the limited-edition Spec V, to be launched in Japan on January 8. No plans have been announced for it to come to North America, Nissan says.
The Spec V arrives barely 12 months after the base model landed in domestic showrooms in December of 2007, and just six months since its debut in the U.S. market.
As of this writing, three weeks before the launch, Nissan remained tight-lipped about details of the road-going racing car. The company briefly tested a camouflaged prototype at Germany's famed NĂĽrburgring in October, but information on this car was sketchy. However, a source close to Nissan revealed some startling modifications that hint at Nissan's end purpose: to go head to head with Europe and America's legends in the FIA GT Championships.
And as can be expected, basically every piece of hardware has been tweaked. Former top racing engineer and GT-R chief engineer Kazutoshi Mizuno would have it no other way. First things first: Engineers had to get curb weight down. By removing the rear seats and adding lightweight carbon and aluminum body panels, they shed around 198 pounds, dropping the coupe's weight to 3,638 pounds. Engine output might only be up 5 horsepower, to 478 hp at 6,400 rpm, but it's the beefed-up midrange torque that Nissan was after.
Mizuno instructed his technicians to modify the ECU and exhaust system to focus more of the already available 434 pound-feet of torque in the middle ranges between 3,000 and 5,000 rpm. That gives the GT-R more urgency exiting corners, a prerequisite for racing.
Fitted with a Brembo six-piston brake package, the base GT-R already had superb stopping power. But to take it to the next level, Nissan has added a unique, race-spec six-piston carbon-ceramic setup, developed in-house, that reportedly adds more than $30,000 to the car's base price. And to transfer that braking force to the racetrack, the Spec V offers a choice of either Bridgestone or Dunlop 20-inch semi-slicks on lightweight alloys. For those interested in cost, you can add another $15,000 for the tire/wheel package.
In addition to modifying engine response for faster cornering, Mizuno and his team have fitted specially developed Bilstein shocks and permanently set the car's damping and transmission settings into the R mode for quicker on-track performance.
Now what does all that mean? Having had experience as the man responsible for Nissan's Le Mans 24 Hours challenge as well as GT racing back home, Mizuno knows what GT racing is all about. So expect to see a race-spec version of the Spec V joining the ranks of Porsche, Aston Martin, Maserati and Corvette on the grid in the near future as Nissan takes on the world's best in the FIA GT Championship series.
For those of you with wads of spare cash lying around and a craving for a Spec V, you'd better be prepared to hand over upward of $150,000, because it won't be cheap. And to think that just four months ago, the dollar-yen exchange rate meant you could have picked up a Spec V for under $125,000.
Tim Gallagher of Nissan USA told Inside Line that there are no announced plans for the Spec V to come to North America, but added, "We expect to see some" of the expected Japan-market Nismo package parts for the GT-R offered in North America at an unspecified future time, "but not the entire range of selections."
Inside Line says: Enviably hot car that Americans will only be able to envy for now. — Peter Lyon, Correspondent
WASHINGTON (AP) -- With the recession dragging down consumer prices and home construction, the Federal Reserve is prepared to slash a key interest rate -- perhaps to an all-time low -- in a desperate bid to stem the country's economic slide.
Consumer prices fell by a record amount in November, while home building plunged by the most in a quarter-century, according to government reports released Tuesday that underscored the economy's weakening state.
Falling prices for goods and services at first blush might sound like a good thing. But if prices keep spiraling downward, they can wreak economic havoc. That gives the Fed another reason to lower rates, which would protect against this risk.
With the Fed's key rate dropping ever closer to zero, the central bank is moving into uncharted territory.
Nonetheless, Fed Chairman Ben Bernanke has made it clear the Fed isn't running out of ammunition to fight the worst financial crisis since the 1930s. It is exploring using tools -- other than rate cuts -- to revive the economy. New insights on that front could be revealed when Bernanke and his colleagues wrap up a two-day meeting Tuesday.
"The message is simply the Fed stands ready to do everything in its power to stop the economy's free fall," said Richard Yamarone, economist at Argus Research.
On Wall Street, those expectations lifted stocks. The Dow Jones industrials gained about 70 points in morning trading.
In its battle against a recession that started last December, the Fed already has cut the target for the federal funds rate, its main tool for influencing economic activity, to 1 percent, a level seen only once before in the last half-century.
Many economists predict the Fed will cut the funds rate in half -- to just 0.50 percent. A few think the Fed could opt for an even more forceful action -- lowering rates by a whopping three-quarters percentage point or more. If that larger cut occurs, it would be the lowest on records that track the monthly average of the funds rate going back to 1954. The funds rate is the interest banks charge each other on overnight loans.
The benefit of another Fed rate reduction, though, may be mostly psychological, rather than economic.
"It's a feel-good thing," said economist Ken Mayland, president of ClearView Economics. "Hopefully this is a bridge to better confidence."
Slammed by the financial crisis, worried banks have hoarded their cash and been extremely reluctant to lend money to customers. Fearful consumers, watching jobs vanish and their investments tank, have sharply cut back their spending, including on big-ticket purchases like homes and cars that typically involve financing.
In response to the Fed's expected action, the prime rate -- now at 4 percent -- for many consumer and small-business loans would drop by a corresponding amount. The prime lending rate is used to peg rates on home equity loans, certain credit cards and other consumer loans. Cheaper rates could give pinched borrowers a dose of relief.
The goal of lower borrowing costs is to entice people and businesses to spend more, which would revive the economy. So far, though, the Fed's aggressive rate reductions have failed to stabilize the economy.
Bernanke says the Fed is weighing other ways to aid the economy given that it can lower the funds rate only so far -- to zero.
For example, the Fed could buy longer-term Treasury or agency securities on the open market in substantial quantities. This might lower rates on these securities and help spur buying appetites.
A Fed program announced late last month to buy $600 billion in debt and mortgage-backed securities from mortgage giants Fannie Mae and Freddie Mac already has helped push mortgage rates down.
By boosting the quantity of money in the financial system, the Fed has engaged in so-called "quantitative easing" to provide economic relief. The Fed's balance sheet has ballooned to $2.2 trillion, from close to $900 billion in September, reflecting efforts to mend the financial system.
"Never in the postwar history has the Fed acted as lender of last resort to this degree," Mayland said.
In fact, with all the lending by the Fed, the actual funds rate has fallen at times well below its current 1 percent target.
Hours before the Fed's announcement, the Labor Department reported that consumer prices fell by a record 1.7 percent in November as energy prices retreated. It marked the second straight month that prices dropped and raised the specter that the country could be heading for a dangerous bout of deflation.
Deflation means a widespread -- and prolonged -- decline in prices that hits Americans' incomes and corporate profits as well as already stricken housing values and investments. Lower rates by the Fed would help fend it off.
However, the White House welcomed the drop in energy prices, which had soared to record highs in July. "It gives families more cash to spend on other priorities," said spokesman Tony Fratto.
Another report underlined the housing market's woes. The number of housing projects started in November plunged by 18.9 percent, the most in a quarter-century as builders slashed production, the Commerce Department reported. That left housing starts at just 625,000, on an annualized basis, a new all-time low that broke last month's record.
As housing, credit and financial problems persist, the economic rubble mounts higher.
Shell-shocked employers axed 533,000 jobs in November alone. That drove the unemployment rate up to 6.7 percent, a 15-year high.
Since the start of the recession, the economy has shed nearly 2 million jobs. Analysts predict another 3 million will be lost between now and the spring of 2010.
Last week alone, Bank of America Corp., tool maker Stanley Works and Sara Lee Corp., known for food brands such as Jimmy Dean and Hillshire Farm, announced job cuts.
General Motors Corp. and Chrysler LLC are in danger of running out of money within weeks and are seeking government aid. The White House is exploring ways to throw a lifeline to Detroit after rescue efforts collapsed in Congress.
With the employment market eroding and consumers retrenching, the economy could stagger backward at a shocking 6 percent rate in the current October-December quarter, analysts predict. It shrank at a 0.5 percent pace in the third quarter.
President-elect Barack Obama is advocating an economic recovery plan that includes spending on big public works projects to bolster jobs. His plan also includes tax cuts to spur consumers to spend more and businesses to step up investment and hiring.
People naturally group information by topic and remember relationships between important things, like a person and the company where she works. But enabling computers to grasp these same concepts has been the subject of long-standing research. Recently, this has focused on the Semantic Web, but a European endeavor called the Nepomuk Project will soon see the effort take new steps onto the PC in the form of a "semantic desktop."
Those working on the project, coordinated by the German Research Center for Artificial Intelligence (DFKI), have been toiling for three years to create software that can spot meaningful connections between the files on a computer. Nepomuk's software is available for several computer platforms and now comes as a standard component of the K Desktop Environment (KDE), a popular graphical interface for the Linux operating system.
The idea of a semantic desktop is not new. The Open Source Applications Foundation and SRI, two nonprofit organizations, have both worked on similar projects. But previous efforts have suffered from the difficulty of generating good semantic information: for semantic software to be useful, semantic information needs to be generated and tagged to files and documents. But without useful applications in the first place, it is hard to persuade users to generate and tag this data themselves.
Nepomuk is distinguished by a more practical vision, says Ansgar Bernardi, deputy head of knowledge management research at DFKI. The software adds a lot of semantic information automatically and encourages users to add more by making annotated data more useful. It also provides an easy way to share tagged information with others.
The software generates semantic information by using "crawlers" to go through a computer and annotate as many files as possible. These crawlers look through a user's address book, for example, and search for files related to the people found in there. Nepomuk can then connect a file sent by a particular person with one related to the company that person works for, making Nepomuk a particularly useful way to search a computer, Bernardi says.
While most operating systems let users search their computers by keyword alone, Nepomuk can uncover more useful information by focusing on the connections between data; it can locate relevant files even if they don't mention the keyword used in the search. A peer-to-peer file-sharing architecture built into the system also makes it easy to share files and the associated semantic data between users.
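The difference between keyword search and connection-based search can be illustrated with a toy example. Nepomuk actually stores these relationships as RDF data with standard ontologies; the tiny triple list and traversal function below are invented purely for this sketch:

```python
# A handful of hand-written (subject, relation, object) facts of the
# kind Nepomuk's crawlers extract automatically from an address book
# and the files on disk.
triples = [
    ("report.pdf", "sent_by",  "Alice"),
    ("Alice",      "works_at", "Acme Corp"),
    ("budget.xls", "mentions", "Acme Corp"),
    ("notes.txt",  "sent_by",  "Bob"),
]

def related(entity, seen=None):
    """Return every entity reachable from `entity` via any chain of relations."""
    if seen is None:
        seen = set()
    for subj, _rel, obj in triples:
        # Follow each link in both directions.
        for a, b in ((subj, obj), (obj, subj)):
            if a == entity and b not in seen:
                seen.add(b)
                related(b, seen)
    return seen

# A search for "Alice" surfaces budget.xls even though that file never
# mentions her by name: it is linked to her through Acme Corp.
print(sorted(related("Alice") - {"Alice"}))
# ['Acme Corp', 'budget.xls', 'report.pdf']
```

A plain keyword search for "Alice" would have returned only report.pdf; following the graph is what pulls in the company's budget file while correctly leaving Bob's unrelated notes out.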
"This might be the semantic desktop that actually survives," says Nova Spivack, CEO and founder of Radar Networks, the company behind Twine, a semantic bookmarking and social-networking service. "There's a lot of potential to build on what they've done."
Spivack notes that other efforts to bring semantic technology to the desktop haven't succeeded in reaching end users. "Nepomuk is designed for real people and developers," he says. For this reason, Spivack sees the inclusion of Nepomuk in KDE as particularly important, since KDE software is widely distributed and can easily be modified by software developers.
Although funding for the official Nepomuk project ends this month, Bernardi expects it to continue as an open-source software effort. A spinoff company is also in the works, he says, and a newly founded legal body called the Open Semantic Collaboration Architecture Foundation will help coordinate continuing work on the technology created by Nepomuk.
Nepomuk's software is available on several platforms besides KDE. Users can download the basic software for free for Windows, Macintosh, and Linux. It is also possible to use Nepomuk in a more limited way -- just for Web pages viewed through Firefox, for example -- via a smaller installation.
Power polymer: A new polymer, shown in powdered form in this photo, can be used to make stable fuel-cell membranes that conduct negatively charged ions. Credit: National Academy of Sciences/PNAS
Fuel cells are, in principle, the most efficient way to convert hydrogen fuel into electricity. But they require expensive catalysts such as platinum to split hydrogen into ions and electrical current. Cheaper metals simply can't withstand the harsh acidic environment of the fuel cell. Now researchers in China have developed a fuel cell that uses a new membrane material to operate in alkaline conditions, eliminating the need for an expensive catalyst. The power output of the new prototype, which uses nickel as a catalyst, is still relatively low, but it provides a first demonstration of a potentially much less expensive fuel cell.
Conventional fuel cells consist of two electrodes coated with a platinum catalyst that splits hydrogen fuel into acidic hydrogen ions and electrons. The electrodes are separated by a polymer membrane that conducts acidic hydrogen ions from one side to the other, creating an external electrical current. The new fuel cell, developed by researchers led by Lin Zhuang, a professor of chemistry at Wuhan University, in Wuhan, China, uses a new membrane that conducts alkaline ions called hydroxyl groups. Alkaline fuel cells work by reacting hydrogen and oxygen to create hydroxyl ions and water, a reaction catalyzed in the Wuhan University fuel cell by the nickel anode. The hydroxyl ions are conducted across the polymer membrane, generating an external electrical current.
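The alkaline chemistry described above can be summarized by standard textbook half-reactions (these are the conventional equations for an alkaline hydrogen fuel cell, not taken from the Wuhan paper itself):

```latex
% Alkaline fuel cell half-reactions (textbook form)
\begin{align*}
\text{Anode:}   &\quad \mathrm{H_2 + 2\,OH^- \longrightarrow 2\,H_2O + 2\,e^-} \\
\text{Cathode:} &\quad \tfrac{1}{2}\,\mathrm{O_2} + \mathrm{H_2O + 2\,e^- \longrightarrow 2\,OH^-} \\
\text{Overall:} &\quad \mathrm{H_2} + \tfrac{1}{2}\,\mathrm{O_2} \longrightarrow \mathrm{H_2O}
\end{align*}
```

Electrons released at the anode travel through the external circuit, while the hydroxyl ions produced at the cathode cross the membrane back to the anode, completing the loop.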
Most researchers have focused on acidic fuel cells because membranes that work well under such conditions have already been developed. A stable hydroxyl-conducting membrane has been "the holy grail of electrochemistry," says Robert Savinell, a professor of chemical engineering at Case Western Reserve University, in Cleveland. Such a membrane would allow researchers to build fuel cells and batteries that don't require precious-metal catalysts but can instead use cheaper ones like nickel.
Zhuang's polymer is comparable in structure to the highly conductive polymer Nafion that's used in conventional acidic fuel cells. It may prove to be less expensive than Nafion, which must be fortified with fluorine groups to protect it from acidic conditions. Other researchers are working on improving the power output and lowering the cost of acidic fuel cells by developing alternatives to Nafion, but these cells still require expensive catalysts.
Zhuang's group demonstrated the new membrane in an alkaline fuel cell that uses a silver cathode and a hydrogen-splitting nickel anode as its catalysts. The nickel catalysts used in previously developed alkaline fuel cells weren't very efficient because they quickly oxidized, so alkaline fuel cells have typically used the same platinum catalysts as their acidic counterparts. The Wuhan researchers created an anode coated with chromium-decorated nickel nanoparticles that are more tolerant of oxidation than previous nickel catalysts.
The power output of the new fuel cell--about 50 milliwatts per square centimeter at 60 °C--is modest. But as the first demonstration of an alkaline fuel cell that doesn't require expensive metal catalysts, it's an important proof of principle, researchers say. Fuel cells have a long way to go in terms of efficiency, long-term stability, and expense, says Frank DiSalvo, a professor of physical science at Cornell University, in Ithaca, NY. "This work enhances the research tool kit of materials we can explore to see if we can deliver on fuel-cell efficiency," he says.
Zhuang says that he and his group are working on improving the cell's power output by further tuning the catalyst and the membrane. They'll also have to demonstrate the long-term stability of the cell. "We believe that catalysts with higher activity and lower cost will soon be realized," he says.
Nimble sensors: Small, unmanned aerial vehicles, such as this 18-kilogram Boeing ScanEagle, could provide more precise data about weather systems, to increase the accuracy of long-term forecasts. Credit: Boeing
Weather forecasters may not have the best reputation for accuracy, but with today's computational modeling, it's possible to make pretty reliable weather predictions up to 48 hours in advance. Researchers at MIT, however, believe that autonomous aircraft running smart storm-chasing algorithms could extend that window to four days. Better weather forecasting could help farmers and transportation authorities with planning and even save lives by providing earlier warnings about storms and severe weather, says Jonathan How, a principal investigator in MIT's Department of Aeronautics and Astronautics.
Long-term predictions don't necessarily go wrong because of flaws in the forecasting models, but rather because the initial conditions are inaccurately measured, says Martin Ralph, a research meteorologist at the National Oceanic and Atmospheric Administration's earth systems laboratory, in Boulder, CO. Such inaccuracies come from gaps in the data, he says.
Ground-based sensors are already used to record temperature, wind speed, humidity, air density, and rainfall, but they gauge conditions only at ground level, says How. At sea, where many severe weather fronts originate, the coverage is much sparser. Satellite observations help build up a picture, but satellites are blind to a number of useful types of data, such as low-altitude wind speed and atmospheric boundary conditions, says Ralph.
To get the most accurate readings, you really want to get your sensors into the weather itself, says How. In theory, weather balloons can do this, but only if they happen to be in the right place at the right time. So weather services currently attempt to track down weather systems using piloted planes that fly prescribed routes, taking measurements along the way. The logistics of deploying such planes is so complicated, however, that it's difficult to change their routes in response to changing weather conditions.
Consequently, says How, there has been a lot of interest in using unmanned aerial vehicles, or UAVs, instead. The idea is that there would be a constant number of UAVs in the air, continuously working together to position themselves in what would collectively be the most useful locations.
The problem, says How, is that calculating the most useful locations is an enormously complex task. It involves analyzing more than a million data states from hundreds of thousands of sensor locations, and using this data to predict the weather conditions six to eight hours from now. But that's exactly the challenge that the MIT researchers tackled.
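As a toy illustration only--this is not the MIT group's actual algorithm, whose models involve vastly more data--the core targeting idea can be sketched as a greedy loop that repeatedly samples the location with the highest forecast uncertainty and then discounts correlated neighbors. Every site name, variance, and the 0.5 discount factor below is invented:

```python
# Toy sketch of greedy sensor targeting (illustrative only).
# Pick the site with the largest forecast-error variance, then assume a
# measurement there also halves the uncertainty at correlated sites.

def choose_sites(uncertainty, neighbors, num_uavs):
    """uncertainty: dict mapping site -> forecast-error variance (made up here).
    neighbors:  dict mapping site -> list of correlated sites.
    Returns the num_uavs sites to sample, in order of selection."""
    remaining = dict(uncertainty)  # work on a copy
    chosen = []
    for _ in range(num_uavs):
        site = max(remaining, key=remaining.get)  # most uncertain site
        chosen.append(site)
        remaining.pop(site)
        for n in neighbors.get(site, []):
            if n in remaining:
                remaining[n] *= 0.5  # assumed correlation discount
    return chosen

# Invented example: measuring at A makes B less valuable, so D is picked next.
variance = {"A": 5.0, "B": 4.0, "C": 1.0, "D": 3.9}
links = {"A": ["B"], "B": ["A", "D"]}
print(choose_sites(variance, links, 2))
```

The real problem is far harder because the "value" of a measurement depends on how errors propagate through the forecast model over six to eight hours, not on a fixed correlation table.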
So far, the algorithms they developed have been used only in a simulation, as part of a National Science Foundation project. MIT's Han-Lim Choi, who has been working on the algorithms as part of his PhD research, presented the latest results of the project last week at the IEEE Conference on Decision and Control in Cancun, Mexico. The work has attracted the interest of the U.S. Navy, and the MIT group is applying for funding to put the algorithms into practice, says How.
Another challenge is size, says Floreano. The UAVs need to be small and safe enough not to harm people or property if they are deployed in large numbers. He points out, however, that subkilogram UAVs are now becoming available.
In fact, How and his colleagues are more interested in testing their algorithms on the relatively large ScanEagle UAVs from Boeing, which weigh about 18 kilograms apiece. These would be capable of flying distances in excess of 1,000 miles, even laden with sensors and communications equipment. With this sort of range, a fleet of just four could reasonably cover a good-sized area, reducing the risk of collisions with manmade objects.
As Internet service providers (ISPs) struggle with increasing traffic from peer-to-peer file-sharing networks, some have resorted to simply throttling this data, attracting ire from both users and regulators. Under a scheme that should be rolled out early next year, some ISPs plan to take a different approach: cooperating with file-sharing networks so that they share data more effectively.
The new scheme is called Provider Portal for Applications (P4P), and it's a voluntary, open standard that requires ISPs to share some information about how their networks are laid out. Initial tests have shown that the P4P framework can dramatically speed up download times for file sharers while also reducing the bandwidth costs for ISPs.
Peer-to-peer file sharing has exploded over the past decade, driven by increasing consumer bandwidth and growing demand for large amounts of data. Rather than serve files from a centralized location, file-sharing networks scatter pieces of files among thousands of individual computers and help users find and download this data. File sharing now accounts for about 70 percent of all network traffic, and some ISPs have found it hard to deal with the increased load. In August, Comcast was rebuked by the Federal Communications Commission for trying to throttle peer-to-peer traffic on its network.
The new protocol reduces file-trading traffic by having ISPs reveal some internal network information to peer-to-peer "trackers"--servers that are used to locate files for downloading. Trackers can then use this network information to arrange file sharing more efficiently, by connecting computers that are nearer and sharing files at the lowest resource cost to the ISPs involved. As an example, suppose someone running a BitTorrent client tries to download an MP3. As it stands, the file might come from a computer halfway around the world, even if someone next door also happens to have a copy. By using P4P, the tracker knows to connect computers that are closer together, requiring bits to travel less distance.
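To make the idea concrete, here is a minimal, hypothetical sketch in Python of how a P4P-aware tracker might rank candidate peers. The network names, peer IDs, and cost values are all invented for illustration; in the real scheme, the "p-distance" values are computed and published by the ISPs themselves:

```python
# Hypothetical sketch of P4P-style peer ranking (names and costs invented).

def rank_peers(candidates, pdistance, my_network):
    """Return candidate peers sorted by ascending p-distance from my_network.

    candidates: list of (peer_id, network_id) tuples known to hold the file.
    pdistance:  dict mapping (my_network, peer_network) -> relative cost,
                as supplied by cooperating ISPs. Unknown pairs rank last.
    """
    return sorted(
        candidates,
        key=lambda peer: pdistance.get((my_network, peer[1]), float("inf")),
    )

# Example: a peer inside the requester's own ISP is preferred over distant ones.
costs = {("isp-a", "isp-a"): 1, ("isp-a", "isp-b"): 10, ("isp-a", "isp-c"): 50}
peers = [("peer1", "isp-c"), ("peer2", "isp-a"), ("peer3", "isp-b")]
print(rank_peers(peers, costs, "isp-a"))
```

A real tracker would combine such cost hints with upload capacity and file availability, but the ranking step above captures the essence of what the ISP-supplied information enables.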
"We knew, as a peer-to-peer company, that in order for peer to peer to become successfully commercialized, network operators had to be cooperative," says Robert Levitan, CEO of Pando Networks, a company that offers commercial peer-to-peer content delivery services. "Instead of blocking traffic, they had to get involved in it."
Pando is a founding member of the P4P Working Group, a consortium set up in 2007 to develop and test new technologies to make P2P more efficient. Members include the ISPs Verizon and Comcast, the peer-to-peer software business BitTorrent, the network equipment manufacturer Cisco Systems, and academic institutions including Yale and Washington University.
Small-scale tests conducted in March by Yale researchers, Pando, Verizon, and Telefonica Group suggest that the system could cut the average distance that data has to travel from 1,000 miles to 160 miles, and reduce the number of connections that have to be made through major hubs from 5.5 to 0.69. This would help ISPs avoid the costs incurred when information is handed between major networks. The approach could also benefit users, by increasing download speeds by an average of 20 percent, according to the same tests.
A more recent study carried out this fall with Comcast, Verizon, and AT&T showed that peer-to-peer download speeds could increase 50 to 150 percent using the technology. And the amount of content that is delivered entirely within each ISP should increase from 14 percent to as much as 89 percent.
But the P4P approach is not without its challenges. The protocol depends on ISPs calculating "p-distance values" and making them available to peer-to-peer trackers, to tell them how best to connect different file sharers. There are also legal questions: because many files traded on peer-to-peer networks violate copyright, ISPs will want to make sure that cooperating with these networks won't make them liable for infringement.
Nonetheless, Richard Woundy, senior vice president for software and applications with Comcast, admits that the idea is appealing. "The ISP benefits because traffic isn't going over as much infrastructure," he says. "It's staying within a metro area, or at least staying within the ISP. It's not going over a transit link to an upstream provider."
Doug Pasko, principal member of the technical team at Verizon, says that Pando and Verizon plan to roll out a P4P implementation soon, possibly by the end of January. The P4P working group has also submitted an application to the Internet Engineering Task Force seeking official approval for the P4P standard. And Pasko doesn't think that legal problems are likely. "P4P itself doesn't increase our legal exposure," he says. "That's because we're offering optimization guidance. We don't have any information on what that content is."
Comcast is also interested in implementing the technology, says Barry Tishgart, vice president for Internet services for the company. "Our inclination is, we want to do it. The results of our trial are very positive," he says. But the tests carried out so far have been relatively small: the one performed this fall shared a single 21-megabyte video file, which was downloaded 15,000 times. So Tishgart wants to see what happens when larger file sizes and large swarms of peers try to download a popular file.
Finally, the success of the scheme depends on whether the thousand or more peer-to-peer trackers currently in operation agree to use the P4P protocol. Tishgart says that they tend to be suspicious of the ISPs' motives. But if they see performance gains for their users and no downside, they may be much more likely to cooperate.
Black liquor: Chemrec’s gasification plant in New Bern, NC, consumes up to 330 tons per day of black liquor--a mixture of caustic chemicals and dissolved wood left over from pulping wood for paper production. It currently produces a clean-burning gas that provides heat energy to the mill and recycles the caustic chemicals. Credit: Chemrec
Pulp and paper plants could soon double as biorefineries if financing for a Swedish gasification project is any indication. As gas prices have slumped this fall, threatening to run some biofuels innovators out of business, Swedish company Chemrec has pulled in a stream of grants and investments backing a process for turning the black liquor left over from pulp production into a clean-burning synthetic biofuel.
Chemrec received $20 million in venture-capital funding earlier this month, and another $300,000 from the U.S. Department of Energy this week to assess the feasibility of applying its process at a pulp mill in Escanaba, MI. The Stockholm-based firm was already ramping up R&D through a $37 million EU-supported research consortium involving seven European industrial firms that was launched in September.
Part of the attraction is the ecological profile of the biofuel generated with Chemrec's process, dimethyl ether (DME), which can be used as a replacement for liquefied petroleum gas (LPG) and diesel. Amid growing concern over the ecological impact of biofuel production and its disruption of food supplies, recent analyses, such as the EU's Renew study of second-generation biofuels, have found that DME made by biomass gasification provides the highest greenhouse-gas reduction for the lowest cost.
The heart of Chemrec's technology is a gasification process that turns black liquor into a mix of carbon monoxide, hydrogen, and CO2 called synthesis gas, or syngas, for short. Gasification of coal is already a booming business in China, where the resulting syngas is converted into chemicals and fuels. And gasification of wood chips is also on the rise. For example, Canada's Nexterra Energy is one of several developers installing small power plants that gasify wood chips and burn the resulting syngas to generate power and heat for residential developments.
In principle, black liquor is an obvious feedstock for biomass gasification. Pulp mills already take care of gathering loads of biomass, and, as a liquid, the waste liquor is easier to feed into a gasifier than solid chunks of biomass are. In practice, however, this waste has proved tough to gasify. The mixed success to date of black-liquor gasification developer ThermoChem Recovery International, based in Baltimore, exemplifies the challenge: of two large-scale installations using ThermoChem's technology, one is still running, while the second never operated commercially because of gasifier design flaws.
Chemrec CEO Jonas Rudberg explains that black liquor is particularly difficult to deal with because of the highly caustic inorganic chemicals, such as sodium hydroxide, employed to break down the pulp. In Chemrec's reactor design, black liquor and pure oxygen injected in from the top feed an 1,800 °C fireball at the center of the reactor. Most of the dissolved wood in the black liquor forms syngas and flows out of the reactor.
The inorganic chemicals, however, form a molten smelt of sodium sulfide and sodium carbonate on the heat-shielding ceramic tiles protecting the reactor walls. As the smelt flows down and out of the reactor, it attacks the ceramics. "In this contact between smelt and ceramic, reactions occur which alter the surface of the refractory," says Rudberg. "The key trick is to select materials which can withstand this chemical impact."
Rudberg says that Chemrec worked closely with researchers at Oak Ridge National Laboratory to identify appropriate materials for testing in a gasification plant that has operated at a Weyerhaeuser mill in New Bern, NC, since 1996. This plant can process up to 15 percent of the mill's black liquor. Rudberg says that the refractory at New Bern has been operating for two years, which he believes is long enough to prove that its commercialization is viable.
That performance is clearly enough to convince Chemrec's backers to finance the next step: generating biofuel from the syngas. While Weyerhaeuser simply burns the syngas to generate heat at New Bern, Chemrec's small research plant in Pitea, Sweden, has demonstrated production of syngas pure enough for catalytic fuel synthesis. BioDME, Chemrec's EU-funded consortium, will turn that syngas into between four and five metric tons of DME per day.
Another BioDME partner, Haldor Topsoe, will build the DME synthesis plant, to start up in 2010. Göteborg-based Volvo Group (not to be confused with the Ford-owned luxury-car division, Volvo Cars) will adapt the fuel systems of 14 long-haul diesel trucks to run on DME. And Swedish oil company Preem is building four fueling stations to distribute the DME across Sweden.
At the same time, Chemrec is doing the engineering for two plants that would be 25 times larger, producing 40,000 tons of DME each year: one at Pitea, and one at the New Page mill in Michigan. Converting every pulp mill in the United States would, according to Rudberg, generate the equivalent of about 7.5 billion gallons of fuel--about one-fifth of the U.S. government's total target for 2020.
But it remains questionable whether demand would naturally follow. DME is currently used primarily as a propellant in aerosol spray cans, and clearly more than four fueling stations in Sweden will be needed for it to take off as a biofuel. Marc Londo, a senior researcher and biofuels expert at the Energy Research Center of the Netherlands, in Amsterdam, says that this chicken-and-egg dilemma is a major drawback. He believes that Chemrec's success would be better assured if it produced synthetic diesel from its syngas--a strategy pursued by German biomass-gasification innovator Choren Industries. "The strong advantage of synthetic diesel is you can simply blend it with currently available diesel," says Londo. "For bio DME, you need dedicated distribution networks."
Londo says that synthetic diesel has another advantage: while it costs slightly more to produce from syngas than DME does, synthetic diesel has a higher energy density. A tank of diesel will take a long-haul truck twice as far as a tank of DME: "For long-haul trucks, energy density is a critical factor, and synthetic diesel is thus a more valuable fuel," he says.
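A rough back-of-the-envelope check reproduces the roughly two-to-one range difference. The volumetric energy densities used here are approximate, assumed values (about 36 MJ/L for diesel, about 19 MJ/L for liquefied DME), as is the tank size:

```python
# Back-of-the-envelope range comparison (all figures approximate/assumed).
diesel_mj_per_l = 36.0  # assumed lower heating value of diesel, MJ per liter
dme_mj_per_l = 19.0     # assumed lower heating value of liquefied DME, MJ per liter
tank_liters = 500       # assumed long-haul truck tank size

diesel_energy = tank_liters * diesel_mj_per_l
dme_energy = tank_liters * dme_mj_per_l
ratio = diesel_energy / dme_energy

print(f"A diesel tank carries about {ratio:.1f}x the energy of an equal DME tank")
```

The tank size cancels out of the ratio, so the two-to-one claim rests only on the relative energy densities of the two fuels.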
BioDME project leader Per Salomonsson, an R&D manager with Volvo Group, says that it comes down to how much fuel an acre of land will produce. Synthetic diesel would be a lot easier for Volvo Group to drop into its vehicles, but according to their estimates, DME will deliver over 65 percent more miles of travel per acre cultivated; compared with conventional biodiesel produced from vegetable oil, the advantage is five to one. "There will be a shortage of biomass in the future," says Salomonsson. "In the long run, we can't afford to have anything but the most efficient process."
All you art collectors out there: here is a chance to get a giclée copy of some of Ian M. Sherwin's work. Ian is planning a whole series of Marblehead, Massachusetts, paintings. His work is amazing.