Nerval's Lobster writes "While automakers from Tokyo to Detroit rush to sprinkle their respective vehicles with all sorts of sensors and screens, the chairman of Volkswagen Group has warned about the limits of data analytics for automobiles. 'The car must not become a data monster,' Martin Winterkorn told an audience at the CeBit trade show in Germany, according to Re/code. 'I clearly say yes to Big Data, yes to greater security and convenience, but no to paternalism and Big Brother.' At the same time, Winterkorn endorsed a closer relationship between tech companies such as IBM and the auto industry, and highlighted Volkswagen's experiments with autonomous driving, both of which will necessarily infuse automakers (and his company in particular) with more data-driven processes. The question is which policies from which entities will ultimately dictate how that data is used. Winterkorn isn't the first individual to voice concerns about how automakers (and their partners) store and analyze all that vehicle data. At this January's Consumer Electronics Show (CES) in Las Vegas, a Ford executive drew considerable controversy by suggesting that Ford collects detailed information on how customers use its vehicles. 'We know everyone who breaks the law, we know when you're doing it. We have GPS in your car, so we know what you're doing. By the way, we don't supply that data to anyone,' Jim Farley, Ford's global vice president of marketing and sales, told show attendees. Farley later attempted to clarify his statement to Business Insider, but that didn't stop a fierce debate over vehicle monitoring, and certainly hasn't stopped automakers and tech companies from collaborating on more ways to integrate data-centric features into vehicles."
An anonymous reader writes "As reported before on Slashdot, one of the most terrible sins on Wikipedia is to edit articles for pay, or otherwise violate the 'neutral point of view' policy, per their co-founder Jimmy Wales. And yet, the Wikipedia-criticism website Wikipediocracy recently published a study showing that dozens of the Wikimedia Foundation's largest cash donors have violated that policy. Repeatedly, and wantonly. In short, they wrote articles about themselves or their companies, then gave the WMF big donations — and were not confronted about violating the NPOV policy." Do the proposed TOS changes address this? Note that they also found that many of the donors adequately documented their conflict of interest.
dcblogs writes with news that the rumored IBM layoffs have begun. "IBM is laying off U.S. employees this week as part of a $1B restructuring, and is apparently trying to keep the exact number of cuts secret. The Alliance@IBM, the main source of layoff information at IBM, says the company has stopped including in its resource action documents, given to cut employees, the number of employees selected for a job cut. The union calls it a 'disturbing development.' Meanwhile, two days prior to the layoffs, NY Governor Cuomo announced that it reached a new minimum staffing level agreement with IBM to 'maintain 3,100 high-tech jobs in the Hudson Valley and surrounding areas.' The governor's office did not say how many IBM jobs are now there, but others estimate it at around 7,000. Lee Conrad, a national coordinator for the Alliance, said the governor's announcement raises some questions for workers and the region. 'Yes, you're trying to protect 3,100 jobs but what about the other 3,900 jobs?' The Alliance estimates that anywhere from 4,000 to 6,000 U.S. workers could be impacted by the latest round of layoffs. IBM says it has more than 3,000 open positions in the U.S., and that the cuts are part of a 'rebalancing' as it shifts investments into new areas of technology, such as cognitive computing." Alliance@IBM has a page collecting reports from people terminated today.
Nerval's Lobster writes "Ray Kurzweil, the technologist who's spent his career advocating the Singularity, discussed his current work as a director of engineering at Google with The Guardian. Google has big plans in the artificial-intelligence arena. It recently acquired DeepMind, a self-billed 'cutting edge artificial intelligence company,' for $400 million; that's in addition to snatching up all sorts of startups and research scientists devoted to everything from robotics to machine learning. Thanks to the massive datasets generated by the world's largest online search engine (and the infrastructure allowing that engine to run), those scientists could have enough information and computing power at their disposal to create networked devices capable of human-like thought. Kurzweil, having studied artificial intelligence for decades, is at the forefront of this in-house effort. In his interview with The Guardian, he couldn't resist throwing some jabs at other nascent artificial intelligence systems on the market, most notably IBM's Watson: 'IBM's Watson is a pretty weak reader on each page, but it read the 200m pages of Wikipedia. And basically what I'm doing at Google is to try to go beyond what Watson could do. To do it at Google scale. Which is to say to have the computer read tens of billions of pages. Watson doesn't understand the implications of what it's reading.' That sounds very practical, but at a certain point Kurzweil's predictions veer into what most people would consider science fiction. He believes, for example, that a significant portion of people alive today could end up living forever, thanks to the ministrations of ultra-intelligent computers and beyond-cutting-edge medical technology."
colinneagle writes "Amid all the talk about Microsoft forking Android for a smartphone OS, one suggestion involves a look back to Microsoft's DOS days. Microsoft DOS was designed per IBM's specification to run exclusively on IBM's PC hardware platforms. Phoenix Technologies employed software developers it nicknamed 'virgins,' who hadn't been exposed to IBM's systems, to create a software layer between Microsoft's DOS system and PCs built by IBM's competitors. This helped Microsoft avoid infringing on IBM's patents or copyrights, and subsequently helped fuel the explosive growth of PC clones. Microsoft could use the same approach to 'clone' the proprietary Android components in its own Android fork. This would prevent copyright infringement while giving Microsoft access to Google Play apps, as well as Android's massive base of developers." Microsoft (or anyone) could generate a lot of goodwill by completely replacing the proprietary bits of Android; good thing that doing so is a work in progress (and open-source, too), thanks to Replicant. (Practically speaking, though, couldn't Google just make access to the Play Store harder, if Microsoft were to create an Android-alike OS? Even now, many devices running Android variants don't have access to it.)
An anonymous reader writes "Corporate employees editing Wikipedia articles about themselves or their employers sometimes commit major violations of Wikipedia's 'bright line' against paid editing, devised by Jimbo Wales himself, to prevent 'COI' editing. (Consider the recent flap over the firm Wiki-PR's activities, for example.) Yet the Wikipediocracy website, run by critics of Wikipedia management, has just published an article about IBM employees editing Wikipedia articles. Not only is such editing apparently commonplace, it's being badly done as well. And most bizarrely, one of the IBM employees is a Wikipedia administrator, who is married to another Wikipedia administrator. She works on the Watson project, which uses online databases to build its AI system ... including the full text of Wikipedia." Reading about edit wars is also far more informative (if less entertaining) than reading the edit wars themselves.
An anonymous reader writes "When we last checked in with Tim Armstrong, the AOL CEO was demonstrating 'Leadership with a Capital L' to employees of the company's Patch local news subsidiary by summarily firing an employee in the middle of a conference call for taking photos. Armstrong continued to serve up tasty material for tech bloggers this past week, blaming $7.1 million in extra expenses from Obamacare, and $2 million in expenses for 'two AOLers that had distressed babies,' for a decision to hold all matching funds for employee 401K programs until the end of each calendar year. After a small firestorm in the press, and a petition from AOL employees unhappy with both the policy change and the way it was presented, Armstrong reversed course, reinstating the per-period match and apologizing for mentioning the individual employee cases (TechCrunch is an AOL subsidiary). Incidentally, Armstrong was originally following in the footsteps of IBM, which made similar changes to its 401K program that went into effect last year."
jfruh writes "Having already gotten out of the low-end server market, IBM appears to be trying to get out of the chip business as well. The company currently manufactures Power Architecture chips for its own use and for other customers. Big Blue wants to sell off its manufacturing operations, but will continue to design its own chips."
First time accepted submitter MAE Keller writes "Two U.S. companies are joining a military research program to develop sensitive electronic components able to self-destruct on command to keep them out of the hands of potential adversaries who would attempt to counterfeit them for their own use. From the article: 'Last Friday DARPA awarded a $2.1 million contract to PARC, and a $3.5 million contract to IBM for the VAPR program, which seeks to develop transient electronics that can physically disappear in a controlled, triggerable manner.'"
msmoriarty writes with news that the Eclipse foundation is ten years old this week. Although Eclipse was released in 2001, development was controlled by IBM until the creation of the independent Eclipse Foundation in 2004. "According to Eclipse Foundation Director Mike Milinkovich, that's a major reason Eclipse was able to thrive: 'IBM ... did an exemplary job of setting Eclipse free ... We became the first open source organization to show that real competitors could collaborate successfully within the community.' He also talks about misconceptions about Eclipse, its current open source success, and what he sees for the future."
benrothke writes "At first glance, The Art of the Data Center: A Look Inside the World's Most Innovative and Compelling Computing Environments appears to be a standard coffee table book with some great visuals and photos of various data centers throughout the world. Once you get a few pages into the book, you see it is indeed not a light-read coffee table book, but rather an insightful book in which some of the brightest minds in the industry share their insights on data center design and construction." Read below for the rest of Ben's review.
KentuckyFC writes "In May last year, Google and NASA paid a reported $15 million for a quantum computer from the controversial Canadian start-up D-Wave Systems. One question mark over the device is whether it really is quantum or just a conventional computer in disguise. That's harder to answer than it sounds, not least because any direct measurement of a quantum state destroys it. So physicists have to take an indirect approach. They assume the computer is a black box in which they can input data and receive an output. Given this input and output, the question is whether this computing behavior can be best reproduced by a classical or a quantum algorithm. Last summer, an international team of scientists compared a number of classical algorithms against an algorithm that relies on a process called quantum annealing. Their conclusion was that quantum annealing best reproduces the D-Wave computer's behavior, a result that was a huge boon for the company. Now a group from UC Berkeley and IBM's Watson Research Lab says it has found a classical algorithm that explains the results just as well as, or even better than, quantum annealing. In other words, the results from the D-Wave machine could just as easily be explained if it was entirely classical. That comes on the back of mounting evidence that the D-Wave computer may not cut the quantum mustard in other ways too. Could it be that Google and NASA have forked out millions for a classical calculator?"
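To make the black-box comparison concrete: the classical candidates in these studies are optimizers like simulated annealing, run on the same Ising spin-glass problems the D-Wave machine solves, with their output statistics compared against the hardware's. The sketch below is purely illustrative (it is not the Berkeley/IBM group's actual algorithm, and the problem setup is invented for the example); it shows the kind of classical simulated-annealing baseline such a comparison assumes.

```python
import math
import random

def ising_energy(spins, couplings):
    """Energy of a +/-1 spin configuration under pairwise couplings J[i][j]."""
    e = 0.0
    n = len(spins)
    for i in range(n):
        for j in range(i + 1, n):
            e += couplings[i][j] * spins[i] * spins[j]
    return e

def simulated_annealing(couplings, steps=5000, t_start=5.0, t_end=0.05):
    """Classical simulated annealing: thermal fluctuations stand in for the
    quantum tunneling that quantum annealing would exploit."""
    n = len(couplings)
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, couplings)
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)
        spins[i] = -spins[i]                      # propose a single spin flip
        new_energy = ising_energy(spins, couplings)
        delta = new_energy - energy
        if delta <= 0 or random.random() < math.exp(-delta / t):
            energy = new_energy                   # accept the flip
        else:
            spins[i] = -spins[i]                  # reject: undo the flip
    return spins, energy
```

Running many independent anneals on a fixed problem yields a distribution of final energies; if that distribution matches the hardware's as well as a quantum-annealing model does, the black-box test cannot certify the device as quantum.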
McGruber writes "Like the Mac, the IBM PC Junior first went on sale in late January 1984. That is where the similarities end — the PC Junior became the biggest PC dud of all time. Back on May 17, 1984, the NY Times reported that the PC Junior 'is too expensive for casual home users, but, at the same time, is not nearly powerful enough for serious computer users who can afford a more capable machine.' The article also quoted Peter Norton, then still a human programmer who had not yet morphed into a Brand, who said that the PC Junior 'may well be targeted at a gray area in the market that just does not exist.' IBM cancelled the machine in March 1985, after selling only 270,000 of them. While it was a commercial flop, the machine is still liked by some. Michael Brutman's PCjr page attempts to preserve the history and technical information of the IBM PCjr, and YouTube has a video of a PC Junior running a demo."
itwbennett writes "Well, that was fast. Earlier this week the rumor mill was getting revved up about a potential sale of IBM's x86 server business, with Lenovo, Dell, and Fujitsu reportedly all interested in scooping it up. On Thursday, Lenovo Group announced it has agreed to buy IBM's x86 server hardware business and related maintenance services for $2.3 billion. The deal encompasses IBM's System x, BladeCenter and Flex System blade servers and switches, x86-based Flex integrated systems, NeXtScale and iDataPlex servers and associated software, blade networking and maintenance operations. IBM will retain its System z mainframes, Power Systems, Storage Systems, Power-based Flex servers, and PureApplication and PureData appliances." SlashBI has some words from an analyst about why Lenovo wants the x86 product line more than IBM does.
itwbennett writes "It was widely reported last year (including on Slashdot) that IBM attempted to sell off its x86 server business to Lenovo, which seemed logical as Lenovo had bought IBM's PC business a decade ago. However, the two firms could not come to financial terms and the deal was never struck. Well, the rumors have started up again, only this time Lenovo has some competition, as Dell and Fujitsu are now being thrown into the mix as possible suitors."
Nerval's Lobster writes "IBM believes its Watson supercomputing platform is much more than a gameshow-winning gimmick: its executives are betting very big that the software will fundamentally change how people and industries compute. In the beginning, IBM assigned 27 core researchers to the then-nascent Watson. Working diligently, those scientists and developers built a tough 'Jeopardy!' competitor. Encouraged by that success on live television, Big Blue devoted a larger team to commercializing the technology—a group it made a point of hiding in Austin, Texas, so its members could better focus on hardcore research. After years of experimentation, IBM is now prepping Watson to go truly mainstream. As part of that upgraded effort (which includes lots of hype-generating), IBM will devote a billion dollars and thousands of researchers to a dedicated Watson Group, based in New York City at 51 Astor Place. The company plans on pouring another $100 million into an equity fund for Watson's growing app ecosystem. If everything goes according to IBM's plan, Watson will help kick off what CEO Ginni Rometty refers to as a third era in computing. The 19th century saw the rise of a 'tabulating' era: the birth of machines designed to count. In the latter half of the 20th century, developers and scientists initiated the 'programmable' era—resulting in PCs, mobile devices, and the Internet. The third (potential) era is 'cognitive,' in which computers become adept at understanding and solving, in a very human way, some of society's largest problems. But no matter how well Watson can read, understand and analyze, the platform will need to earn its keep. Will IBM's clients pay lots of money for all that cognitive power? Or will Watson ultimately prove an overhyped sideshow?"
judgecorp writes "Two media reports suggest that the Universal Credit scheme to overhaul Britain's welfare programme is in trouble. The IT project to support Universal Credit was launched by the Cabinet Office, and it will be completed and run by the Department for Work and Pensions (DWP) — but the Guardian says the Cabinet Office has pulled out its elite experts too soon, while a different leak told Computer Weekly that the four original suppliers — HP, IBM, Accenture and BT — have been effectively frozen out in an internal change. It's the biggest change to Britain's benefits system for many years, and all the evidence says it's not going well."
An anonymous reader writes with a link to Der Spiegel, which describes a top-secret spy-agency catalog revealing that the NSA "has been secretly backdooring equipment from US companies including Dell, Cisco, Juniper, IBM, Western Digital, Seagate, Maxtor and more, risking enormous damage to the US tech sector." Der Spiegel also has a wider-ranging article about the agency's Tailored Access Operations unit.
McGruber writes "Myer, Australia's largest department store chain, has closed its website 'until further notice' at the height of the post-Christmas (and Australian summer) sales season. The website crashed on Christmas Day and has been down ever since. This means Myer will see no benefit for those days from booming domestic online sales, which were tipped to hit $344 million across the retail sector on Boxing Day alone. Teams from IBM and Myer's information technology division were 'working furiously' to fix the problem."
Esther Schindler writes "The big screen has always tried to keep step with technology, usually unsuccessfully. Peter Salus looks at how the film industry has treated computing. For a long time, the 'product placement' of big iron was limited to a few brands, primarily Burroughs. For instance: 'Batman: The Movie and Fantastic Voyage (both 1966) revert to the archaic Burroughs B205, though Fantastic Voyage also shows an IBM AN/FSQ-7 Combat Direction Central. At 250 tons for each installation (there were about two dozen), the AN/FSQ-7 was the largest computer ever built, with 60,000 vacuum tubes and a requirement of 3 megawatts of power to perform 75,000 ips for regional radar centers. The last IBM AN/FSQ-7, at Luke Air Force Base in Arizona, was demolished in February 1984.' Fun reading, I think."