Of course, Linux is open source, which means it is free.
SUMMARY: Joseph Menn of Reuters reports on the story he helped break about how activist hackers linked to the collective known as Anonymous have secretly accessed U.S. government computers in multiple agencies and stolen sensitive information. Menn says the campaign began almost a year ago and its scope is not yet known.
SUMMARY: Inside the high-tech criminal mind. It's no secret that cybercriminals are stealing personal information and credit card numbers by hacking into corporate and government computers. One school in Pittsburgh is training the next generation of cybersecurity experts to fight off the bad guys by teaching them to think the same way.
RICK KARR: The bad guys stole more than three million Social Security numbers from the State of South Carolina. As many as seventy million credit card numbers from Sony PlayStation. They got access to all of the personal details of some customers of a nationwide mortgage lending firm. But cybercriminals aren’t just looking to steal personal information and credit card numbers when they break into corporate computers -- they’re looking for other valuable information.
RICK KARR: All those flaws that Carnegie Mellon’s undergrads find every semester ... don’t necessarily mean that the software on your P-C or your bank’s web site is badly written. Almost every piece of software, every computer system has vulnerabilities that can be exploited -- it’s virtually impossible to make anything that’s connected to the internet perfectly secure. And today -- compared to 10 or 20 years ago, all of us have just so many more computers and smartphones and tablets -- all of them connected and vulnerable. So we’re vulnerable, too.
Carnegie Mellon’s students are so good at exploiting those vulnerabilities ... that the NSA enlisted them to create a game that teaches hacking skills to high-school-aged students -- and paid for the job. Cylab, the university’s cybersecurity institute, is home to the top-ranked competitive hacking team in the world: the Plaid Parliament of Pwning -- “pwn” is hacker-speak for “own”, as in the hacker takes a computer over and owns it. For the third straight year, the team won top honors at international contests that pit teams of hackers against one another ... and utterly demolished the competition at a prestigious contest in Las Vegas.
By regulating the flow of voltage to the surface of a smooth touch screen, Disney researchers in Pittsburgh discovered that they can create the sensation of texture and three-dimensional surfaces. The technology can represent an artificial texture applied to an image, or elevation data extracted from topographical maps. But how does a smooth surface simulate the feel of a 3D bump?
"Our brain perceives the 3D bump on a surface mostly from information that it receives via skin stretching," said Ivan Poupyrev, who directs Disney Research, Pittsburgh's Interaction Group. "Therefore, if we can artificially stretch skin on a finger as it slides on the touch screen, the brain will be fooled into thinking an actual physical bump is on a touch screen even though the touch surface is completely smooth."
ReactOS® is a free open source operating system based on the best design principles found in the Windows NT® architecture (Windows versions such as Windows XP, Windows 7, Windows Server 2012 are built on Windows NT architecture). Written completely from scratch, ReactOS is not a Linux based system, and shares none of the UNIX architecture.
The main goal of the ReactOS® project is to provide an operating system which is binary compatible with Windows. This will allow your Windows® applications and drivers to run as they would on your Windows system. Additionally, the look and feel of the Windows operating system is used, such that people accustomed to the familiar user interface of Windows® would find using ReactOS straightforward. The ultimate goal of ReactOS® is to allow you to use it as an alternative to Windows® without the need to change software you are used to.
ReactOS 0.3.15 is still in alpha stage, meaning it is not feature-complete and is recommended only for evaluation and testing purposes.
The National Security Agency is winning its long-running secret war on encryption, using supercomputers, technical trickery, court orders and behind-the-scenes persuasion to undermine the major tools protecting the privacy of everyday communications in the Internet age, according to newly disclosed documents.
The agency has circumvented or cracked much of the encryption, or digital scrambling, that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records, and automatically secures the e-mails, Web searches, Internet chats and phone calls of Americans and others around the world, the documents show.
Many users assume — or have been assured by Internet companies — that their data is safe from prying eyes, including those of the government, and the N.S.A. wants to keep it that way. The agency treats its recent successes in deciphering protected information as among its most closely guarded secrets, restricted to those cleared for a highly classified program code-named Bullrun, according to the documents, provided by Edward J. Snowden, the former N.S.A. contractor.
Beginning in 2000, as encryption tools were gradually blanketing the Web, the N.S.A. invested billions of dollars in a clandestine campaign to preserve its ability to eavesdrop. Having lost a public battle in the 1990s to insert its own “back door” in all encryption, it set out to accomplish the same goal by stealth.
The agency, according to the documents and interviews with industry officials, deployed custom-built, superfast computers to break codes, and began collaborating with technology companies in the United States and abroad to build entry points into their products. The documents do not identify which companies have participated.
The N.S.A. hacked into target computers to snare messages before they were encrypted. And the agency used its influence as the world’s most experienced code maker to covertly introduce weaknesses into the encryption standards followed by hardware and software developers around the world.
“For the past decade, N.S.A. has led an aggressive, multipronged effort to break widely used Internet encryption technologies,” said a 2010 memo describing a briefing about N.S.A. accomplishments for employees of its British counterpart, Government Communications Headquarters, or GCHQ. “Cryptanalytic capabilities are now coming online. Vast amounts of encrypted Internet data which have up till now been discarded are now exploitable.”
When the British analysts, who often work side by side with N.S.A. officers, were first told about the program, another memo said, “those not already briefed were gobsmacked!”
An intelligence budget document makes clear that the effort is still going strong. “We are investing in groundbreaking cryptanalytic capabilities to defeat adversarial cryptography and exploit Internet traffic,” the director of national intelligence, James R. Clapper Jr., wrote in his budget request for the current year.
In recent months, the documents disclosed by Mr. Snowden have described the N.S.A.’s broad reach in scooping up vast amounts of communications around the world. The encryption documents now show, in striking detail, how the agency works to ensure that it is actually able to read the information it collects.
The agency’s success in defeating many of the privacy protections offered by encryption does not change the rules that prohibit the deliberate targeting of Americans’ e-mails or phone calls without a warrant. But it shows that the agency, which was sharply rebuked by a federal judge in 2011 for violating the rules and misleading the Foreign Intelligence Surveillance Court, cannot necessarily be restrained by privacy technology. N.S.A. rules permit the agency to store any encrypted communication, domestic or foreign, for as long as the agency is trying to decrypt it or analyze its technical features.
The N.S.A., which has specialized in code-breaking since its creation in 1952, sees that task as essential to its mission. If it cannot decipher the messages of terrorists, foreign spies and other adversaries, the United States will be at serious risk, agency officials say.
Just in recent weeks, the Obama administration has called on the intelligence agencies for details of communications by Qaeda leaders about a terrorist plot and of Syrian officials’ messages about the chemical weapons attack outside Damascus. If such communications can be hidden by unbreakable encryption, N.S.A. officials say, the agency cannot do its work.
But some experts say the N.S.A.’s campaign to bypass and weaken communications security may have serious unintended consequences. They say the agency is working at cross-purposes with its other major mission, apart from eavesdropping: ensuring the security of American communications.
Some of the agency’s most intensive efforts have focused on the encryption in universal use in the United States, including Secure Sockets Layer, or SSL, virtual private networks, or VPNs, and the protection used on fourth generation, or 4G, smartphones. Many Americans, often without realizing it, rely on such protection every time they send an e-mail, buy something online, consult with colleagues via their company’s computer network, or use a phone or a tablet on a 4G network.
For at least three years, one document says, GCHQ, almost certainly in close collaboration with the N.S.A., has been looking for ways into protected traffic of the most popular Internet companies: Google, Yahoo, Facebook and Microsoft’s Hotmail. By 2012, GCHQ had developed “new access opportunities” into Google’s systems, according to the document.
“The risk is that when you build a back door into systems, you’re not the only one to exploit it,” said Matthew D. Green, a cryptography researcher at Johns Hopkins University. “Those back doors could work against U.S. communications, too.”
Paul Kocher, a leading cryptographer who helped design the SSL protocol, recalled how the N.S.A. lost the heated national debate in the 1990s about inserting into all encryption a government back door called the Clipper Chip.
“And they went and did it anyway, without telling anyone,” Mr. Kocher said. He said he understood the agency’s mission but was concerned about the danger of allowing it unbridled access to private information.
“The intelligence community has worried about ‘going dark’ forever, but today they are conducting instant, total invasion of privacy with limited effort,” he said. “This is the golden age of spying.”
A Vital Capability
The documents are among more than 50,000 shared by The Guardian with The New York Times and ProPublica, the nonprofit news organization. They focus primarily on GCHQ but include thousands either from or about the N.S.A.
Intelligence officials asked The Times and ProPublica not to publish this article, saying that it might prompt foreign targets to switch to new forms of encryption or communications that would be harder to collect or read. The news organizations removed some specific facts but decided to publish the article because of the value of a public debate about government actions that weaken the most powerful tools for protecting the privacy of Americans and others.
The files show that the agency is still stymied by some encryption, as Mr. Snowden suggested in a question-and-answer session on The Guardian’s Web site in June.
“Properly implemented strong crypto systems are one of the few things that you can rely on,” he said, though cautioning that the N.S.A. often bypasses the encryption altogether by targeting the computers at one end or the other and grabbing text before it is encrypted or after it is decrypted.
The documents make clear that the N.S.A. considers its ability to decrypt information a vital capability, one in which it competes with China, Russia and other intelligence powers.
“In the future, superpowers will be made or broken based on the strength of their cryptanalytic programs,” a 2007 document said. “It is the price of admission for the U.S. to maintain unrestricted access to and use of cyberspace.”
The full extent of the N.S.A.’s decoding capabilities is known only to a limited group of top analysts from the so-called Five Eyes: the N.S.A. and its counterparts in Britain, Canada, Australia and New Zealand. Only they are cleared for the Bullrun program, the successor to one called Manassas — both names of American Civil War battles. A parallel GCHQ counterencryption program is called Edgehill, named for the first battle of the English Civil War of the 17th century.
Unlike some classified information that can be parceled out on a strict “need to know” basis, one document makes clear that with Bullrun, “there will be NO ‘need to know.’ ”
Only a small cadre of trusted contractors were allowed to join Bullrun. It does not appear that Mr. Snowden was among them, but he nonetheless managed to obtain dozens of classified documents referring to the program’s capabilities, methods and sources.
Ties to Internet Companies
When the N.S.A. was founded, encryption was an obscure technology used mainly by diplomats and military officers. Over the last 20 years, with the rise of the Internet, it has become ubiquitous. Even novices can tell that their exchanges are being automatically encrypted when a tiny padlock appears next to the Web address on their computer screen.
Because strong encryption can be so effective, classified N.S.A. documents make clear, the agency’s success depends on working with Internet companies — by getting their voluntary collaboration, forcing their cooperation with court orders or surreptitiously stealing their encryption keys or altering their software or hardware.
According to an intelligence budget document leaked by Mr. Snowden, the N.S.A. spends more than $250 million a year on its Sigint Enabling Project, which “actively engages the U.S. and foreign IT industries to covertly influence and/or overtly leverage their commercial products’ designs” to make them “exploitable.” Sigint is the abbreviation for signals intelligence, the technical term for electronic eavesdropping.
By this year, the Sigint Enabling Project had found ways inside some of the encryption chips that scramble information for businesses and governments, either by working with chipmakers to insert back doors or by surreptitiously exploiting existing security flaws, according to the documents. The agency also expected to gain full unencrypted access to an unnamed major Internet phone call and text service; to a Middle Eastern Internet service; and to the communications of three foreign governments.
In one case, after the government learned that a foreign intelligence target had ordered new computer hardware, the American manufacturer agreed to insert a back door into the product before it was shipped, someone familiar with the request told The Times.
The 2013 N.S.A. budget request highlights “partnerships with major telecommunications carriers to shape the global network to benefit other collection accesses” — that is, to allow more eavesdropping.
At Microsoft, as The Guardian has reported, the N.S.A. worked with company officials to get pre-encryption access to Microsoft’s most popular services, including Outlook e-mail, Skype Internet phone calls and chats, and SkyDrive, the company’s cloud storage service.
Microsoft asserted that it had merely complied with “lawful demands” of the government, and in some cases, the collaboration was clearly coerced. Executives who refuse to comply with secret court orders can face fines or jail time.
N.S.A. documents show that the agency maintains an internal database of encryption keys for specific commercial products, called a Key Provisioning Service, which can automatically decode many messages. If the necessary key is not in the collection, a request goes to the separate Key Recovery Service, which tries to obtain it.
How keys are acquired is shrouded in secrecy, but independent cryptographers say many are probably collected by hacking into companies’ computer servers, where they are stored. To keep such methods secret, the N.S.A. shares decrypted messages with other agencies only if the keys could have been acquired through legal means. “Approval to release to non-Sigint agencies,” a GCHQ document says, “will depend on there being a proven non-Sigint method of acquiring keys.”
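As described, the Key Provisioning Service behaves like a lookup table keyed on the product or session, with the Key Recovery Service as a fallback when a key is missing. The Python sketch below shows only that flow in the abstract; the class and function names are invented for illustration, and the generic Fernet cipher stands in for whatever ciphers are actually involved.

from cryptography.fernet import Fernet

class KeyStore:
    """Toy stand-in for a key-provisioning database: product id -> key."""
    def __init__(self):
        self._keys = {}
    def add(self, product_id, key):
        self._keys[product_id] = key
    def lookup(self, product_id):
        return self._keys.get(product_id)

def recover_key(product_id):
    """Stand-in for a separate key-recovery request; here it always fails."""
    return None

def try_decrypt(store, product_id, ciphertext):
    key = store.lookup(product_id) or recover_key(product_id)
    if key is None:
        return None  # no key yet: keep the ciphertext and try again later
    return Fernet(key).decrypt(ciphertext)

# Demo with a locally generated key standing in for a provisioned one.
store = KeyStore()
key = Fernet.generate_key()
store.add("example-product", key)
token = Fernet(key).encrypt(b"hello")
print(try_decrypt(store, "example-product", token))   # b'hello'
print(try_decrypt(store, "unknown-product", token))   # None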
Simultaneously, the N.S.A. has been deliberately weakening the international encryption standards adopted by developers. One goal in the agency’s 2013 budget request was to “influence policies, standards and specifications for commercial public key technologies,” the most common encryption method.
Cryptographers have long suspected that the agency planted vulnerabilities in a standard adopted in 2006 by the National Institute of Standards and Technology, the United States’ encryption standards body, and later by the International Organization for Standardization, which has 163 countries as members.
Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”
“Eventually, N.S.A. became the sole editor,” the memo says.
Even agency programs ostensibly intended to guard American communications are sometimes used to weaken protections. The N.S.A.’s Commercial Solutions Center, for instance, invites the makers of encryption technologies to present their products and services to the agency with the goal of improving American cybersecurity. But a top-secret N.S.A. document suggests that the agency’s hacking division uses that same program to develop and “leverage sensitive, cooperative relationships with specific industry partners” to insert vulnerabilities into Internet security products.
A Way Around
By introducing such back doors, the N.S.A. has surreptitiously accomplished what it had failed to do in the open. Two decades ago, officials grew concerned about the spread of strong encryption software like Pretty Good Privacy, or P.G.P., designed by a programmer named Phil Zimmermann. The Clinton administration fought back by proposing the Clipper Chip, which would have effectively neutered digital encryption by ensuring that the N.S.A. always had the key.
That proposal met a broad backlash from an unlikely coalition that included political opposites like Senator John Ashcroft, the Missouri Republican, and Senator John Kerry, the Massachusetts Democrat, as well as the televangelist Pat Robertson, Silicon Valley executives and the American Civil Liberties Union. All argued that the Clipper would kill not only the Fourth Amendment, but also America’s global edge in technology.
By 1996, the White House backed down. But soon the N.S.A. began trying to anticipate and thwart encryption tools before they became mainstream.
“Every new technology required new expertise in exploiting it, as soon as possible,” one classified document says.
Each novel encryption effort generated anxiety. When Mr. Zimmermann introduced the Zfone, an encrypted phone technology, N.S.A. analysts circulated the announcement in an e-mail titled “This can’t be good.”
But by 2006, an N.S.A. document notes, the agency had broken into communications for three foreign airlines, one travel reservation system, one foreign government’s nuclear department and another’s Internet service by cracking the virtual private networks that protected them.
By 2010, the Edgehill program, the British counterencryption effort, was unscrambling VPN traffic for 30 targets and had set a goal of an additional 300.
But the agencies’ goal was to move away from decrypting targets’ tools one by one and instead decode, in real time, all of the information flying over the world’s fiber optic cables and through its Internet hubs, only afterward searching the decrypted material for valuable intelligence.
A 2010 document calls for “a new approach for opportunistic decryption, rather than targeted.” By that year, a Bullrun briefing document claims that the agency had developed “groundbreaking capabilities” against encrypted Web chats and phone calls. Its successes against Secure Sockets Layer and virtual private networks were gaining momentum.
But the agency was concerned that it could lose the advantage it had worked so long to gain, if the mere “fact of” decryption became widely known. “These capabilities are among the Sigint community’s most fragile, and the inadvertent disclosure of the simple ‘fact of’ could alert the adversary and result in immediate loss of the capability,” a GCHQ document outlining the Bullrun program warned.
Since Mr. Snowden’s disclosures ignited criticism of overreach and privacy infringements by the N.S.A., American technology companies have faced scrutiny from customers and the public over what some see as too cozy a relationship with the government. In response, some companies have begun to push back against what they describe as government bullying.
Google, Yahoo and Facebook have pressed for permission to reveal more about the government’s secret requests for cooperation. One small e-mail encryption company, Lavabit, shut down rather than comply with the agency’s demands for what it considered confidential customer information; another, Silent Circle, ended its e-mail service rather than face similar demands.
In effect, facing the N.S.A.’s relentless advance, the companies surrendered.
Ladar Levison, the founder of Lavabit, wrote a public letter to his disappointed customers, offering an ominous warning. “Without Congressional action or a strong judicial precedent,” he wrote, “I would strongly recommend against anyone trusting their private data to a company with physical ties to the United States.”
Statement from the Office of the Director of National Intelligence:
It should hardly be surprising that our intelligence agencies seek ways to counteract our adversaries’ use of encryption. Throughout history, nations have used encryption to protect their secrets, and today, terrorists, cybercriminals, human traffickers and others also use code to hide their activities. Our intelligence community would not be doing its job if we did not try to counter that.
While the specifics of how our intelligence agencies carry out this cryptanalytic mission have been kept secret, the fact that NSA’s mission includes deciphering enciphered communications is not a secret, and is not news. Indeed, NSA’s public website states that its mission includes leading “the U.S. Government in cryptology … in order to gain a decision advantage for the Nation and our allies.”
The stories published yesterday, however, reveal specific and classified details about how we conduct this critical intelligence activity. Anything that yesterday’s disclosures add to the ongoing public debate is outweighed by the road map they give to our adversaries about the specific techniques we are using to try to intercept their communications in our attempts to keep America and our allies safe and to provide our leaders with the information they need to make difficult and critical national security decisions.
Is Microsoft going the way of the Soviet Union? Vivek Wadhwa, vice president for academics and innovation at Singularity University, director of research at Pratt School of Engineering, Duke University, and a fellow at Stanford Law School, thinks so. A good friend of the Making Sen$e Business Desk, Wadhwa takes another look at Microsoft's future -- an issue he explored earlier this week in his column on the Washington Post's Innovations blog.
Vivek Wadhwa: When companies become too big, they usually lose their ability to innovate. There are a few notable exceptions, such as Apple, GE and Google, but most become complacent and focus increasingly on defending their existing turf rather than on creating new markets. Thus they begin their march into oblivion.
That is the present state of Microsoft. It has become an old giant, obsessed with defending its aging products. If Microsoft doesn't change course, it is likely to suffer the same fate as that old superpower, the former Soviet Union, whose obsession with preserving its bloated bureaucracy led to its destruction.
Microsoft has lost ground in practically every emerging field, including mobile computing, music players, smartphones, search and social networking. Yes, it has had an odd success or two, such as the Xbox, but these are just flukes.
It isn't that Microsoft doesn't have talented people working for it. Quite to the contrary, it has an abundance of talent. For two decades, it was the tech industry's strongest talent magnet. It hired the best of the best. And most of these geniuses haven't left -- yet.
My former students and friends who work at Microsoft tell me that they love the company, but are stifled by its bureaucracy, turf wars and central planning. Big ideas get quashed because they don't fit into the corporate vision; products with great potential are killed because they could threaten the company's core products. These employees believe that their talent is being wasted. They long for the days when Microsoft was a lean mean fighting machine.
That's why I believe that the best path forward for Microsoft is to break itself up into a number of fighting machines -- smaller companies that compete with upstarts in Silicon Valley and with each other. These micro-Microsofts need to have the freedom to take risks and cannibalize the company's core products. That won't happen under its present structure.
The Windows 8 fiasco illustrates the problems that Microsoft faces. Windows RT, the version of Windows 8 that was designed for tablet computers with touch screens, has a beautiful user interface and functionality. In many ways, it is better than Apple's iOS and Google's Android. But Microsoft was obsessed with protecting its Windows operating system and Office tools franchise. So it bundled a version of Microsoft Office into RT. To make the desktop version of Windows 8 consistent with RT, it added to it the same tiled user interface and removed the Start button.
Most desktop computers and laptops, however, don't have touch screens. And Windows users aren't used to computers without Start buttons. So they hated Windows 8 desktop, and it was a commercial disaster.
The inclusion of Microsoft Office on RT and Microsoft's desire to protect its operating system's pricing structure led it to charge re-sellers a price rumored to be about $85 (the re-seller price is a well-guarded secret). This is more than what lower-end tablets will soon cost, and competes directly with Android, which Google gives away. That's why RT, too, was a commercial disaster.
The sensible thing for Microsoft to do would have been to provide a lighter version of RT -- for free. It would have competed head to head with Android and would likely have won because it has a superior user interface. Microsoft could have made money by charging for special features and apps such as Office. If Microsoft's RT division had had the freedom, it might also have done the unimaginable by bundling Google's Office apps and other competitive products into it.
Tablet prices are dropping rapidly. I expect that next year, there will be several players selling devices that cost less than $100. Full-featured tablets that cost around $50 -- and less -- are also on the horizon. When these become available, the market for tablets will explode. There will be hundreds of millions, perhaps billions, of such devices. Instead of running Microsoft's RT, they will likely run Android. Microsoft has lost its opportunity to sell additional products on these devices through its obsession with protecting its legacy software. Windows and Office will likely slip into oblivion like the five year plans and Politburo the Soviet Union clung to.
But there is still hope for Microsoft. It has a wealth of great people and great technologies in its labs. They need to be untethered from the central bureaucracy and set free to compete and take big risks. I am not too optimistic, though, that this will happen. I worry that Microsoft will go the way of Kodak, RIM and Nokia -- or even the former Soviet Union -- all of which tanked because they were busy protecting old turf.
For a long time, Microsoft Office has been the reigning champ of office suites, but that doesn't mean the free alternative, LibreOffice, isn't worth considering. Let's take a look at how the two compare, and if it's finally possible to ditch the paid option for the free one.
You might not think it's really fair to compare the free LibreOffice and the paid Microsoft Office, but the two are a lot closer in features than you might think. For one, LibreOffice is compatible with a lot more systems, including Windows, OS X, and Linux, while Microsoft Office's newest version is restricted to just Windows 7 and Windows 8. Besides: it’s not about which one is “better” or “more feature filled.” It’s about whether your work requires what Microsoft has to offer, or if you can get by with something free and save a bit of money. Now, with LibreOffice reaching 4.1, we've decided it's time to give it an in-depth comparison with Microsoft Office.
While we certainly can't go through each feature one-by-one, we'll attempt to get a good look at how they compare. If you're interested in looking for a specific feature, head to this page and search for it on the table. It should give you a pretty good idea of exactly which features are in which suite. In this post, we're going to talk in more general terms.
Augmented reality, image recognition and other multimedia features could be standard in future smartphones and tablets, and nVidia’s upcoming Tegra 5 mobile chip will have features to handle such demanding graphics capabilities.
nVidia on Wednesday said that it has made its biggest advance in mobile graphics technology with the integration of its latest graphics core code-named Kepler into Tegra 5, which is code-named Project Logan. The chip is due next year, and will be able to handle the most demanding graphics applications through ray-tracing, tessellation, advanced lighting and post processing, said Daniel Vivoli, senior vice president at nVidia.
The graphics capabilities in Logan will be demonstrated at the SIGGRAPH show in Anaheim, California. The demo will highlight the ability of a mobile processor to show a lifelike human face while consuming just two to three watts of power. The 3D simulation of the human face will show “full features,” Vivoli said, including light refraction, microscopic wrinkles on the skin, and other small details such as skin oils.
The human face — called Ira by nVidia — was demonstrated on stage at nVidia’s GPU Technology Conference and was rendered with server-class graphics processors based on Kepler. Features from those high-end GPUs are being scaled down to fit into the power constraints of mobile devices, Vivoli said. The Ira demonstration was ported to Logan after paring down some hardware capabilities and also with tweaks in clock gating and cache.
Tegra 5 is scheduled to ship next year. nVidia has just started shipping its Tegra 4 chip, which will be in devices such as Hewlett-Packard’s SlateBook X2 tablet.
nVidia declined to provide numbers on the graphics performance gains versus Tegra 4. But the graphics core will be faster and more power efficient, and nVidia said it will use less than a third of the power of graphics cores in tablets like the iPad when rendering the same graphics. Logan will provide better graphics performance at the same power consumption levels.
nVidia is known for its graphics, and its chips are considered among the best at handling multimedia in mobile devices. The company’s high-end Tesla graphics chips based on Kepler are being used in some of the world’s fastest supercomputers, and now similar features will be available in mobile devices going ahead.
It’s also the first time that nVidia is bringing its latest graphics core to the mobile processor, effectively uniting all graphics products on the same microarchitecture.
“We’ve always had a separate architecture,” Vivoli said. “We’ve been working for years where we can converge the graphics roadmaps.”
nVidia earlier offered a graphics development platform called Kayla in which a Tegra processor was attached to a Kepler GPU via a PCI-Express interconnect. The platform was intended to get programmers to start writing mobile applications for the Kepler GPU. But with Logan, the Kepler graphics processor is integrated inside the Tegra chip.
Programmers will have to write algorithms and programs to enable augmented reality, face recognition and other high-end multimedia, Vivoli said. Processing such tasks will be quicker when off-loaded to the Kepler graphics core, Vivoli said.
It will also be the “first time” GPGPU (general-purpose graphics processing unit computing) comes to mobile devices, Vivoli said, referring to a concept in which processing is increasingly moved from CPUs to graphics cores in systems.
But the CPUs and graphics processors still need to work in a coherent manner, and the Tegra 5 chip will support a range of parallel programming tools such as CUDA 5.5, OpenCL 2.0 and Microsoft’s DirectX. Such tools harness the joint processing power of CPUs and GPUs to bring performance gains in supercomputers, and with mobile devices, the performance boosts have to fit within a specific power limit.
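For readers unfamiliar with what "off-loading to the graphics core" looks like in code, here is a generic GPGPU example in Python using pyopencl, which targets OpenCL, one of the tool chains the article says Tegra 5 will support. It is a plain vector addition on whatever OpenCL device is available, not code published by nVidia for Logan.

import numpy as np
import pyopencl as cl

# Two large input vectors prepared on the CPU.
a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()          # pick any available OpenCL device (GPU if present)
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel runs once per element, in parallel, on the device.
program = cl.Program(ctx, """
__kernel void add(__global const float *a, __global const float *b, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)    # copy the result back to the CPU
print(np.allclose(out, a + b))          # True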
There are multiple parallel programming development tools for mobile devices and supercomputing. Intel offers development tools to work with its Xeon Phi accelerator chip, while Advanced Micro Devices is pushing with specifications from the HSA (Heterogeneous System Architecture) Foundation, a group that hopes to provide tools so applications can be easily ported across different chip architectures and devices. nVidia is not a member of HSA, which is backed by ARM, Qualcomm, Texas Instruments and others.
Beyond Logan, nVidia is making more hardware improvements that should make graphics rendering faster. The Tegra 6 processor code-named Parker will unite CPU and GPU and make it a shared resource. Parker will also have a 3D structure in which transistors will be stacked on top of each other, which should make the GPU faster and more power efficient. Parker is due for release in 2015.
America’s research universities, among the most open and robust centers of information exchange in the world, are increasingly coming under cyberattack, most of it thought to be from China, with millions of hacking attempts weekly. Campuses are being forced to tighten security, constrict their culture of openness and try to determine what has been stolen.
University officials concede that some of the hacking attempts have succeeded. But they have declined to reveal specifics, other than those involving the theft of personal data like Social Security numbers. They acknowledge that they often do not learn of break-ins until much later, if ever, and that even after discovering the breaches they may not be able to tell what was taken.
Universities and their professors are awarded thousands of patents each year, some with vast potential value, in fields as disparate as prescription drugs, computer chips, fuel cells, aircraft and medical devices.
“The attacks are increasing exponentially, and so is the sophistication, and I think it’s outpaced our ability to respond,” said Rodney J. Petersen, who heads the cybersecurity program at Educause, a nonprofit alliance of schools and technology companies. “So everyone’s investing a lot more resources in detecting this, so we learn of even more incidents we wouldn’t have known about before.”
Tracy B. Mitrano, the director of information technology policy at Cornell University, said that detection was “probably our greatest area of concern, that the hackers’ ability to detect vulnerabilities and penetrate them without being detected has increased sharply.”
Like many of her counterparts, she said that while the largest number of attacks appeared to have originated in China, hackers have become adept at bouncing their work around the world. Officials do not know whether the hackers are private or governmental. A request for comment from the Chinese Embassy in Washington was not immediately answered.
Analysts can track where communications come from — a region, a service provider, sometimes even a user’s specific Internet address. But hackers often route their penetration attempts through multiple computers, even multiple countries, and the targeted organizations rarely go to the effort and expense — often fruitless — of trying to trace the origins. American government officials, security experts and university and corporate officials nonetheless say that China is clearly the leading source of efforts to steal information, but attributing individual attacks to specific people, groups or places is rare.
The increased threat of hacking has forced many universities to rethink the basic structure of their computer networks and their open style, though officials say they are resisting the temptation to create a fortress with high digital walls.
“A university environment is very different from a corporation or a government agency, because of the kind of openness and free flow of information you’re trying to promote,” said David J. Shaw, the chief information security officer at Purdue University. “The researchers want to collaborate with others, inside and outside the university, and to share their discoveries.”
Some universities no longer allow their professors to take laptops to certain countries, and that should be a standard practice, said James A. Lewis, a senior fellow at the Center for Strategic and International Studies, a policy group in Washington. “There are some countries, including China, where the minute you connect to a network, everything will be copied, or something will be planted on your computer in hopes that you’ll take that computer back home and connect to your home network, and then they’re in there,” he said. “Academics aren’t used to thinking that way.”
Bill Mellon of the University of Wisconsin said that when he set out to overhaul computer security recently, he was stunned by the sheer volume of hacking attempts.
“We get 90,000 to 100,000 attempts per day, from China alone, to penetrate our system,” said Mr. Mellon, the associate dean for research policy. “There are also a lot from Russia, and recently a lot from Vietnam, but it’s primarily China.”
Other universities report a similar number of attacks and say the figure is doubling every few years. What worries them most is the growing sophistication of the assault.
RAY SUAREZ (Newshour): Finally tonight: video games, virtual reality and how changes in those technologies may be connected with economic behavior.
NewsHour economics correspondent Paul Solman and Paul's avatar are our guides, part of his ongoing reporting Making Sense of financial news.
And you should know his story contains some video game violence.
MAN: You should feel like you're there.
MAN: Oh, gosh. Oh, my gosh.
PAUL SOLMAN (Newshour): Video games, one of the world's fastest-growing industries, with more than $80 billion a year in revenues now, more than twice that of movies.
MAN: The feeling of dropping is really awesome.
PAUL SOLMAN: And at a recent developers conference in San Francisco, the race was on to try out a breakthrough that could take the industry to an entirely new level.
MAN: This is insane.
PAUL SOLMAN: Though not yet ready for retail -- it's expected to sell for about $300 -- the Oculus Rift is already being hailed as the Holy Grail of gaming, a lightweight, affordable headset to deliver totally immersive virtual reality, or V.R.
SUMMARY: As U.S. and Chinese officials meet this week in Washington to discuss cyber issues -- as well as broader strategic and economic issues -- a number of Congress members and computer security experts say they are fed up with China stealing proprietary data from American companies. Ray Suarez reports.
Teams from eight countries competed in the first round of the challenge to develop a disaster response robot
Except in this game, turning on a garden hose is an enormously difficult task, requiring huge teams of scientists and decades of acquired technology.
Twenty-six teams from eight countries competed on June 17-21 in the Virtual Robotics Challenge, the first round of the DARPA Robotics Challenge, using complex software to direct virtual robots in a cloud-based simulator that looks like a 3-D video game.
The overall challenge for the teams is to develop software that can operate a DARPA-supplied humanoid robot across a low-bandwidth network, which is expected to be the only type of network available to first responders in a disaster scenario.
This first round was a software competition in which teams used software of their own design to have a simulated ATLAS robot navigate a simulated disaster zone that looked something like suburbia gone wrong. For three days, competitors stared into computer screens in their respective far-flung labs and offices, instructing their virtual robots to complete a series of challenges, including driving a vehicle and walking over uneven ground. Robots also had to pick up a hose, connect it to a spigot and turn it on.
“The disaster response scenario is technically very challenging,” said Russ Tedrake, a professor at MIT’s Department of Electrical Engineering and Computer Science. “It requires the robot and human operator to simultaneously perceive and gain an understanding for a complex, new environment, and then use that information to perform difficult manipulation tasks and traverse complex terrains.”
That means that the virtual robot must feed its raw sensor data back to its operating team, which then, with the help of the robot, must interpret its surroundings and enter instructions about where to move or how to manipulate objects. The team then continuously asks the robot to share its plan, adjusting their requests and their suggestions until the robot provides a correct answer, at which point the robot is allowed to go on autonomously.
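That supervisory loop, in which the robot streams sensor data, the operators interpret it and propose an action, the robot reports its plan, and the exchange repeats until the plan is accepted, can be summarized in a short Python sketch. It is purely conceptual: the robot and operators objects and their method names are invented, and no actual team or DARPA API is implied.

def supervise(robot, operators, max_rounds=10):
    # Raw sensor data travels back to the operators over the low-bandwidth link.
    sensor_data = robot.read_sensors()
    world_model = operators.interpret(sensor_data)
    request = operators.propose_action(world_model)

    for _ in range(max_rounds):
        plan = robot.plan(request)           # robot shares how it would carry out the request
        if operators.approve(plan):
            robot.execute(plan)              # approved: the robot proceeds autonomously
            return True
        request = operators.adjust(request, plan)

    return False  # no acceptable plan within the round budget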
The top nine teams received funding and an ATLAS robot to compete in the DARPA Robotics Challenge Trials in December 2013. The trials are the second of three DARPA challenge events and will be the first time that the physical robots will compete.
The overall winner of the first round was The Florida Institute for Human and Machine Cognition, a team of some 22 researchers.
“Getting in the car and driving was our biggest challenge,” said research scientist Jerry Pratt, the Florida Institute’s team leader. “Walking — we had that nailed.”
Other winners included Worcester Polytechnic Institute, MIT, and TRACLabs. The Jet Propulsion Laboratory, which was also among the winning teams, donated its awarded funds to three runner-up teams that DARPA had not originally selected (it had chosen six), bringing the total to nine teams that will compete in the second round.
I make no guarantee that any advice given on these pages will work as expected. There are just too many variables, depending on operating systems and hardware configurations, to give any advice that will always work.
Where possible, I will provide links to my source.
I have over 30 years of experience in electronics, computers, and software, and I have served as an IT technician. I created this blog to pass on my experience with these subjects.
Note that I do not have any certifications or degrees. All I know comes from hands-on experience.
My experience in electronics comes from 22 years in the Navy (retired), working in avionics, including time as an instructor.
Note that I monitor the support forums under "Recommended Links."