Monday, December 29, 2014

WINDOWS 7 - Classic Shell

Just found something that is outstanding for my Windows 7 Pro 64bit rig.

Classic Shell for Windows 7 & 8

Here are screenshots of just two examples:

This shows the style I'm using for the [Start] menu.

This is the file Explorer classic style.

There are more styles for other Windows UIs.

Why do I like this utility?  See my [Start] menu:

Monday, November 24, 2014

CYBER ATTACKS - Outdated Internet Browsers

"Your outdated Internet browser is a gateway for cyber attacks" PBS NewsHour 11/18/2014


JUDY WOODRUFF (NewsHour):  Major U.S. government agencies have been the target of cyber-attacks of late.  The State Department is the latest.  During the past week, officials had to temporarily shut down an unclassified e-mail system after a suspected hacking.  In recent months, the White House, the Postal Service and the National Weather Service all have been targeted.

Meanwhile, as the holiday season approaches, retailers and the business world are on the lookout for breaches.

A new book breaks down the pervasiveness of what’s happening.

Jeffrey Brown has our conversation.

JEFFREY BROWN (NewsHour):  Hardly a week goes by anymore without a report of some major cyber-breach, whether it’s targeting retailers, the government, or any and all of us.  The attacks are generated in a new netherworld of crime, some of it individualized, even chaotic, other parts of it extremely well-organized.

Writer and journalist Brian Krebs has uncovered some major breaches, including the one on Target that compromised the credit card data of tens of millions of people.  He writes about all of this on his blog Krebs on Security and now in his new book, “Spam Nation.”

And welcome to you.

BRIAN KREBS, Author, “Spam Nation”:  Thank you.

JEFFREY BROWN:  You are peering into a world of cyber-crime that few of us ever see.  What does it look like?

BRIAN KREBS:  It’s a pretty dark place.


BRIAN KREBS:  Yes, absolutely.

But it’s not as dark as you might imagine.  If you’re somebody who doesn’t know their way around, there are plenty of people willing to show you the way.  They might take a cut of the action to help you do that, but it’s not as dark…

Wednesday, November 5, 2014

SOFTWARE - Hardware Monitor on My Windows 7 64bit Rig

Quite a while back I posted an article about CPUID's Hardware Monitor (aka HWMonitor).

Well, above is what it is showing for my custom built Windows 7 64bit 'Super Rig.'

I was surprised by the display of my UPS (Uninterruptible Power Supply) info.

This was not displayed on my dead-and-buried WinXP rig.  Maybe it's because on my new rig the UPS is connected via USB.

Note that HWMonitor comes in a free non-Pro version.  HWMonitor Pro ($) allows you to create graphs.

Wednesday, October 15, 2014

WINDOWS - WinXP vs Win7

As I said in my previous post, I was forced to go to Windows 7.

I have found that Microdunce has 'broken' features in Win7:

[Send to]:  This is the first broken feature I ran into.  In WinXP you can put any shortcut in your [SendTo] folder and it will work when using the Context Menu [Send to] option.  NOT in Win7: normal shortcuts in your [SendTo] folder simply don't work.

POINTERS:  In WinXP you can set custom pointers sourced from anywhere, any CUR file.  In Win7 ALL pointers must be in C:\Windows\Cursors.  This means you have to copy cursors/pointers from your other sources to that folder for any pointer customization to hold on the next boot.  Also, you should save the DEFAULT cursor theme first.
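A rough sketch of that workaround in Python, for anyone who wants to script it.  The source path is made up for illustration; the path math is done without touching the filesystem, and the actual copy (which needs admin rights on a real Windows box) is left commented out:

```python
# Win7 only keeps pointer customizations that live in C:\Windows\Cursors,
# so custom .cur/.ani files must be copied there first. Paths here are
# computed with PureWindowsPath so this runs on any OS; uncomment the
# shutil line to perform the real copy on a Windows machine.
from pathlib import PureWindowsPath
# import shutil

CURSOR_DIR = PureWindowsPath(r"C:\Windows\Cursors")

def destination(source: str) -> PureWindowsPath:
    # Keep the file name, swap the directory for the one Win7 requires.
    return CURSOR_DIR / PureWindowsPath(source).name

src = r"D:\Downloads\MyCursors\hand.cur"   # hypothetical source file
print(destination(src))                    # C:\Windows\Cursors\hand.cur
# shutil.copy2(src, str(destination(src)))  # the actual copy, Windows only
```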

SOUNDS:  In Win7 there is no "Start Windows" sound listed.  "Exit Windows" is listed.  Luckily I found a utility to change the "Start Windows" sound.  Now tell me, what is the logic of NOT having "Start Windows" listed?

I consider features 'broken' if any change makes it HARDER to use Windows.

I will add more 'broken features' here as I find them.

Saturday, September 13, 2014

HARDWARE - My New Super-PC (updated)

Well.... after 20+ years my old WinXP desktop PC died, gave up the ghost.

So I got a new custom-built PC, and went BIG.

  • Windows 7 Pro  64bit
  • CPU:  Intel Core i5-4690 @ 3.50GHz (aka Quad Core)
  • Memory:  8 GB
  • Hard Drive:  4 terabytes, hybrid (solid state + SATA)
  • Video Card:  GeForce GTX 770 (CUDA cores), 2 GB memory

What does the hybrid drive do?  Think of the SSD as a super-cache.  The drive copies the most-used programs to the SSD, which is flash memory and works much faster.
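The caching idea can be sketched in a few lines of Python.  This is a toy model, not how the drive's firmware actually works, and the file names, slot count, and access pattern are all invented:

```python
# Toy model of a hybrid (SSHD) drive: a small SSD portion acts as a
# cache in front of the large spinning disk, and the most frequently
# accessed files get promoted to it.
from collections import Counter

class HybridDrive:
    def __init__(self, ssd_slots=2):
        self.ssd_slots = ssd_slots   # how many "hot" files fit on the SSD
        self.hits = Counter()        # access counts per file
        self.ssd = set()             # files currently promoted to the SSD

    def read(self, name):
        self.hits[name] += 1
        # Promote the most frequently used files to the SSD portion.
        self.ssd = {f for f, _ in self.hits.most_common(self.ssd_slots)}
        return "fast (SSD)" if name in self.ssd else "slow (HDD)"

drive = HybridDrive(ssd_slots=2)
for _ in range(5):
    drive.read("browser.exe")        # used often -> ends up on the SSD
drive.read("rarely_used.doc")
print(drive.read("browser.exe"))     # fast (SSD)
```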

NOTE:  The original build was with Windows 7 Home Premium.  I used Windows Anytime Upgrade to change to Windows 7 Pro.  The upgrade was flawless and took under 15 minutes.

AND..... I upgraded to a broadband network (AT&T U-verse; I really had no choice, they're dumping DSL).  Speed test below.

via Speed Test NET

Wednesday, August 27, 2014

SECURITY - NSA's Secret 'Google'

"The Surveillance Engine:  How the NSA Built Its Own Secret Google" by Ryan Gallagher, The Intercept 8/25/2014


The National Security Agency is secretly providing data to nearly two dozen U.S. government agencies with a “Google-like” search engine built to share more than 850 billion records about phone calls, emails, cellphone locations, and internet chats, according to classified documents obtained by The Intercept.

The documents provide the first definitive evidence that the NSA has for years made massive amounts of surveillance data directly accessible to domestic law enforcement agencies.  Planning documents for ICREACH, as the search engine is called, cite the Federal Bureau of Investigation and the Drug Enforcement Administration as key participants.

ICREACH contains information on the private communications of foreigners and, it appears, millions of records on American citizens who have not been accused of any wrongdoing.  Details about its existence are contained in the archive of materials provided to The Intercept by NSA whistleblower Edward Snowden.

Earlier revelations sourced to the Snowden documents have exposed a multitude of NSA programs for collecting large volumes of communications.  The NSA has acknowledged that it shares some of its collected data with domestic agencies like the FBI, but details about the method and scope of its sharing have remained shrouded in secrecy.

ICREACH has been accessible to more than 1,000 analysts at 23 U.S. government agencies that perform intelligence work, according to a 2010 memo.  A planning document from 2007 lists the DEA, FBI, Central Intelligence Agency, and the Defense Intelligence Agency as core members.  Information shared through ICREACH can be used to track people’s movements, map out their networks of associates, help predict future actions, and potentially reveal religious affiliations or political beliefs.

The creation of ICREACH represented a landmark moment in the history of classified U.S. government surveillance, according to the NSA documents.

“The ICREACH team delivered the first-ever wholesale sharing of communications metadata within the U.S. Intelligence Community,” noted a top-secret memo dated December 2007.  “This team began over two years ago with a basic concept compelled by the IC’s increasing need for communications metadata and NSA’s ability to collect, process and store vast amounts of communications metadata related to worldwide intelligence targets.”

The search tool was designed to be the largest system for internally sharing secret surveillance records in the United States, capable of handling two to five billion new records every day, including more than 30 different kinds of metadata on emails, phone calls, faxes, internet chats, and text messages, as well as location information collected from cellphones.  Metadata reveals information about a communication — such as the “to” and “from” parts of an email, and the time and date it was sent, or the phone numbers someone called and when they called — but not the content of the message or audio of the call.
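The metadata-vs-content distinction the article draws can be made concrete with Python's standard `email` module.  The message below is made up; the point is that the envelope fields are exactly the kind of data a system like ICREACH reportedly shares, while the body is not:

```python
# Parse an email and keep only the envelope fields (metadata),
# discarding the body text (content). The message is invented.
from email import message_from_string

raw = """\
From: alice@example.com
To: bob@example.org
Date: Mon, 11 Aug 2014 09:30:00 -0400
Subject: lunch?

Meet at noon? This body text is the *content*, not metadata.
"""

msg = message_from_string(raw)
metadata = {h: msg[h] for h in ("From", "To", "Date")}
print(metadata["From"])   # alice@example.com
```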

Monday, August 11, 2014

INTERNET - Criminals Steal 1.2 Billion Web Credentials

"After criminals steal 1.2 billion web credentials, how to protect personal info from data breaches" PBS NewsHour 8/6/2014


GWEN IFILL (NewsHour):  Computer hacking and the breaches of privacy that come with them are becoming a regular and unwelcome feature of our wired world.

Now The New York Times and a security firm based in the Midwest are reporting a massive one that includes the collection of more than a billion username and password combinations and more than 500 million e-mail addresses.  What’s more, the perpetrators appear to be a shadowy Russian crime ring.

Details, including the names of the victims, are hard to come by.  But the news has raised eyebrows around the world.  So, how serious is it?

For that, we turn to Dmitri Alperovitch, co-founder and chief technology officer of CrowdStrike, a Web security firm.

Mr. Alperovitch, tell us just in context of all these other breaches we have had in the past year, say, how — relative to those, how big is this?

DMITRI ALPEROVITCH, CrowdStrike:  Well, the number is certainly striking; 1.2 billion credentials is a lot.  In the past, we have seen some big breaches that numbered in the hundreds of millions.

But this is certainly the biggest one that I — that I can remember.

LINUX - More Cities and Nations Ditch Microsoft

"Turin to Be First Italian City to Adopt Ubuntu, Unshackle from the 'Tyranny of Proprietary Software'" by Silviu Stahie, SoftPedia 8/8/2014

Turin wants to be the first city in Italy to switch completely to open source and Ubuntu and entirely ditch all the Microsoft products.

The number of local authorities that decide to switch to open source to match the IT needs of a city is slowly increasing and now it looks like the city of Turin in Italy is also doing the same thing.

One of the main tools that are available for the local governments to decrease the public spending is to make some changes when it comes to upgrading the proprietary software.  Usually, this procedure costs a lot of money and the only way that you can save funds is to adopt open source solutions.

In the case of Turin, that can be done by adopting Ubuntu, which is a Linux distribution developed by Canonical and which has complete support for the Italian language.  Ubuntu is a free operating system and it's supported for a period of five years.  Even when the support ends, the IT department only has to upgrade to the next release.

According to a report, Turin wants to become the first city in Italy to move completely to open source for its 8,300 PCs used by the local authorities.

“The transition will begin this fall and will take a year and a half to complete.  It will become the first Italian open source city, and we'll get savings on computer expenses of 20 to 40 percent compared to today,” says one of the managers of the project, Gianmarco Montanari.

“If we abandon proprietary software we will save €6 million ($8 million) in five years.  The initial investment is low but, once the programs are installed and employees are taught how to use them, the system will go ahead on its own feet, allowing the city to lower costs even more,” notes the director of Information Systems, Sandro Golzio.

The complete price of migrating the PCs from one version of Windows to another, together with the Office suite, would cost the city €22 million ($29.5 million) over a five-year span, but with the adoption of Ubuntu, that price will go down to €16 million ($21.4 million).

A flurry of cities in Europe are doing similar things.  In Germany, the city of Munich has already finished the transition to their own Linux distribution, and in Toulouse, France, the process is ongoing and it will be over in a couple of years.

Tuesday, July 22, 2014

INTERNET - The Impossible to Block Tracking Device

"Meet the Online Tracking Device That is Virtually Impossible to Block" by Julia Angwin, ProPublica 7/21/2014

Update: A spokesperson said that the website was "completely unaware that AddThis contained tracking software that had the potential to jeopardize the privacy of our users."  After this article was published, YouPorn removed AddThis technology from its website.

A new, extremely persistent type of online tracking is shadowing visitors to thousands of top websites, from WhiteHouse.gov to YouPorn.com.

First documented in a forthcoming paper by researchers at Princeton University and KU Leuven University in Belgium, this type of tracking, called canvas fingerprinting, works by instructing the visitor’s Web browser to draw a hidden image.  Because each computer draws the image slightly differently, the images can be used to assign each user’s device a number that uniquely identifies it.
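The core mechanism can be sketched schematically in Python.  Real canvas fingerprinting runs as JavaScript in the browser; here the device-dependent pixel output is faked with placeholder bytes, since the point is only that hashing the rendered pixels yields a stable per-device identifier:

```python
# Schematic of canvas fingerprinting: the same drawing instructions
# produce slightly different pixels on different machines, and hashing
# those pixels yields a stable identifier. The "rendered" bytes below
# are invented stand-ins for real canvas output.
import hashlib

def fingerprint(rendered_pixels: bytes) -> str:
    # Same hardware/fonts/drivers -> same pixels -> same hash.
    return hashlib.sha256(rendered_pixels).hexdigest()[:16]

machine_a = b"...antialiased pixels as produced by GPU/driver A..."
machine_b = b"...antialiased pixels as produced by GPU/driver B..."

print(fingerprint(machine_a) == fingerprint(machine_a))  # True: stable ID
print(fingerprint(machine_a) == fingerprint(machine_b))  # False: distinct devices
```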

Like other tracking tools, canvas fingerprints are used to build profiles of users based on the websites they visit — profiles that shape which ads, news articles, or other types of content are displayed to them.

But fingerprints are unusually hard to block.  They can’t be prevented by using standard Web browser privacy settings or using anti-tracking tools such as AdBlock Plus.

The researchers found canvas fingerprinting computer code, primarily written by a company called AddThis, on 5 percent of the top 100,000 websites.  Most of the code was on websites that use AddThis’ social media sharing tools.  Other fingerprinters include the German digital marketer Ligatus and the Canadian dating site Plentyoffish. (A list of all the websites on which researchers found the code is here).

Rich Harris, chief executive of AddThis, said that the company began testing canvas fingerprinting earlier this year as a possible way to replace “cookies,” the traditional way that users are tracked, via text files installed on their computers.

“We’re looking for a cookie alternative,” Harris said in an interview.

Harris said the company considered the privacy implications of canvas fingerprinting before launching the test, but decided “this is well within the rules and regulations and laws and policies that we have.”

He added that the company has only used the data collected from canvas fingerprints for internal research and development.  The company won’t use the data for ad targeting or personalization if users install the AddThis opt-out cookie on their computers, he said.

Arvind Narayanan, the computer science professor who led the Princeton research team, countered that forcing users to take AddThis at its word about how their data will be used, is “not the best privacy assurance.”

Device fingerprints rely on the fact that every computer is slightly different: Each contains different fonts, different software, different clock settings and other distinctive features. Computers automatically broadcast some of their attributes when they connect to another computer over the Internet.
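A classic (non-canvas) device fingerprint of the kind described above amounts to hashing a canonical serialization of those broadcast attributes.  A minimal sketch, with invented attribute values:

```python
# Sketch of an attribute-based device fingerprint: serialize the
# exposed attributes in a canonical order, then hash. Values below
# are invented for illustration.
import hashlib
import json

def device_id(attrs: dict) -> str:
    canonical = json.dumps(attrs, sort_keys=True)   # stable key ordering
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

device = {"fonts": ["Arial", "Consolas"], "tz_offset": -300,
          "user_agent": "ExampleBrowser/1.0"}
print(device_id(device) == device_id(dict(device)))  # True: same attrs, same ID
```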

Tracking companies have long sought to use those differences to uniquely identify devices for online advertising purposes, particularly as Web users are increasingly using ad-blocking software and deleting cookies.

In May 2012, researchers at the University of California, San Diego, noticed that a Web programming feature called “canvas” could allow for a new type of fingerprint — by pulling in different attributes than a typical device fingerprint.

In June, the Tor Project added a feature to its privacy-protecting Web browser to notify users when a website attempts to use the canvas feature and sends a blank canvas image.  But other Web browsers did not add notifications for canvas fingerprinting.

A year later, Russian programmer Valentin Vasilyev noticed the study and added a canvas feature to freely available fingerprint code that he had posted on the Internet.  The code was immediately popular.

But Vasilyev said that the company he was working for at the time decided against using the fingerprint technology.  “We collected several million fingerprints but we decided against using them because accuracy was 90 percent,” he said, “and many of our customers were on mobile and the fingerprinting doesn’t work well on mobile.”

Vasilyev added that he wasn’t worried about the privacy concerns of fingerprinting.  “The fingerprint itself is a number which in no way is related to a personality,” he said.

AddThis improved upon Vasilyev’s code by adding new tests and using the canvas to draw a pangram “Cwm fjordbank glyphs vext quiz” — a sentence that uses every letter of the alphabet at least once.  This allows the company to capture slight variations in how each letter is displayed.
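The pangram claim is easy to verify; a few lines of Python confirm that the quoted sentence does use every letter of the alphabet (exactly once, in fact):

```python
# Check that AddThis's pangram covers the full alphabet.
import string

pangram = "Cwm fjordbank glyphs vext quiz"
letters = [c for c in pangram.lower() if c.isalpha()]

print(set(letters) == set(string.ascii_lowercase))  # True: all 26 letters
print(len(letters))                                 # 26: each letter exactly once
```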

AddThis said it rolled out the feature to a small portion of the 13 million websites on which its technology appears, but is considering ending its test soon.  “It’s not uniquely identifying enough,” Harris said.

AddThis did not notify the websites on which the code was placed because “we conduct R&D projects in live environments to get the best results from testing,” according to a spokeswoman.

She added that the company does not use any of the data it collects — whether from canvas fingerprints or traditional cookie-based tracking — from government websites, including WhiteHouse.gov, for ad targeting or personalization.

The company offered no such assurances about data it routinely collects from visitors to other sites, such as YouPorn.com.  YouPorn did not respond to inquiries from ProPublica about whether it was aware of AddThis’ test of canvas fingerprinting on its website.

Thursday, July 10, 2014

NSA - How to Ensure You Are On the Watch List

"Here’s One Way to Land on the NSA’s Watch List" by Julia Angwin and Mike Tigas, ProPublica 7/9/2014

Last week, German journalists revealed that the National Security Agency has a program to collect information about people who use privacy-protecting services, including popular anonymizing software called Tor.  But it's not clear how many users have been affected.

So we did a little sleuthing, and found that the NSA's targeting list corresponds with the list of directory servers used by Tor between December 2010 and February 2012 – including two servers at the Massachusetts Institute of Technology.  Tor users connect to the directory servers when they first launch the Tor service.
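The sleuthing described here boils down to set matching: intersect the IPs in the NSA targeting list with the known Tor directory-server IPs, then flag clients that contacted them.  A sketch, using invented placeholder addresses rather than the real directory servers:

```python
# Sketch of the reporters' matching step: flag any logged client
# connection that went to a known Tor directory server. All IPs
# below are documentation-range placeholders, not real servers.
directory_servers = {"203.0.113.7", "198.51.100.22"}   # hypothetical
connection_log = ["192.0.2.10", "203.0.113.7", "192.0.2.44"]

flagged = [ip for ip in connection_log if ip in directory_servers]
print(flagged)   # ['203.0.113.7']
```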

That means that if you downloaded Tor during 2011, the NSA may have scooped up your computer's IP address and flagged you for further monitoring.  The Tor Project is a nonprofit that receives significant funding from the U.S. government.

The revelations were among the first evidence of specific spy targets inside the United States.  And they have been followed by yet more evidence.  The Intercept revealed this week that the government monitored email of five prominent Muslim-Americans, including a former Bush Administration official.

It's not clear if, or how extensively, the NSA spied on the users of Tor and other privacy services.

After the news, one of Tor's original developers, Roger Dingledine, reassured users that they most likely remained anonymous while using the service:  "Tor is designed to be robust to somebody watching traffic at one point in the network – even a directory authority."  It is more likely that users could have been spied on when they were not using Tor.

For its part, the NSA says it only collects information for valid foreign intelligence purposes and that it "minimizes" information it collects about U.S. residents.  In other words, NSA may have discarded any information it obtained about U.S. residents who downloaded Tor.

However, according to a recent report by the Privacy and Civil Liberties Oversight Board, the NSA's minimization procedures vary by program.  Under Prism, for example, the NSA shares unminimized data with the FBI and CIA.

In addition, the NSA can also later search the communications of those it has inadvertently caught in its Prism dragnet, a tactic some have called a "backdoor" search.  It's not clear if similar backdoors exist for other types of data such as IP addresses.

In response to the Tor news, the NSA said it is following President Obama's January directive to not conduct surveillance for the purpose of "suppressing or burdening criticism or dissent, or for disadvantaging persons based on their ethnicity, race, gender, sexual orientation, or religion."

[Disclosure:  Mike Tigas is the developer of an app that uses Tor, called the Onion Browser.]

Monday, July 7, 2014

LINUX - Rules on Supercomputers

"Where Linux rules:  Supercomputers" by Steven J. Vaughan-Nichols, ZDNet 11/25/2013

Summary:  Linux is everywhere, except on traditional PCs.  But when it comes to total platform domination, you can't beat Linux on supercomputers.

The latest Top 500 Supercomputer list is out.  At the very tip-top, you'll find Tianhe-2.  This supercomputer, developed by China’s National University of Defense Technology, is once more the world’s fastest supercomputer with a performance of 33.86 petaflop/s (quadrillions of calculations per second) on the Linpack benchmark.  Also on top, as it has been for more than a decade now, you'll find Linux.

When it comes to supercomputers, Linux is the operating system of choice and it has been since 2004.  The latest round-up of the world's fastest computers underlines just how dominant Linux is in supercomputers.

In the November 2013 listing, 482 of the world's top supercomputers run Linux.  The free, open-source operating system is followed by Unix, with eleven; four systems running a mix of operating systems, two with Windows and a single system running BSD Unix.  That's an advantage of 96.4 percent for Linux to 3.6 percent for everyone else, if you're keeping score at home.
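The article's percentages check out; here is the arithmetic behind them:

```python
# Reproduce the November 2013 Top500 operating-system breakdown.
total = 500
linux = 482
others = {"Unix": 11, "Mixed": 4, "Windows": 2, "BSD": 1}

assert linux + sum(others.values()) == total   # the counts cover all 500

print(round(100 * linux / total, 1))           # 96.4  (Linux share)
print(round(100 * (total - linux) / total, 1)) # 3.6   (everyone else)
```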

The vast majority of these Linux hot-rod computers use cluster architectures with 86.4 percent.  Only 15.4 percent use a massively parallel processor (MPP) design.

A related development, behind the high-tide of Linux, is that most of these supercomputers use AMD and Intel chips.  To be exact, 82 percent use Intel Xeon chips with the Xeon E5 SandyBridge processor leading the way.  9 percent use AMD Opteron and 8 percent use IBM Power processors.  All of these chips can, and do, run Linux on supercomputers.

Just over 10 percent of supercomputers, 53 systems, use accelerator/co-processor technology.  Of these, 38 use NVIDIA chips, 13 systems with Intel's Xeon Phi and two use ATI Radeon.

Looking ahead, the supercomputer testers are well aware that the Linpack benchmark is dated.  Jack Dongarra, distinguished professor of computer science at the University of Tennessee, creator of the TOP500 and Linpack's inventor, is working on a new supercomputer benchmark:  the High Performance Conjugate Gradient.

We don't have a date yet for when the HPCG will appear.  We can, however, be certain that whenever it appears, Linux will still be the top supercomputer operating system.

Thursday, June 19, 2014

LINUX - Opinion, 7 Suggested Improvements

"7 Improvements The Linux Desktop Needs" by Bruce Byfield, Datamation 6/7/2014

In the last fifteen years, the Linux desktop has gone from a collection of marginally adequate solutions to an unparalleled source of innovation and choice.  Many of its standard features are either unavailable in Windows, or else available only as a proprietary extension.  As a result, using Linux is increasingly not only a matter of principle, but of preference as well.

Yet, despite this progress, gaps remain.  Some are missing features, others incomplete implementations, and still others pie-in-the-sky extras that could be easily implemented to extend the desktop metaphor without straining users' tolerance of change.

For instance, here are 7 improvements that would benefit the Linux desktop:

7.  Easy Email Encryption

These days, every email reader from Alpine to Thunderbird to KMail includes email encryption.  However, documentation is often either non-existent or poor.

But, even if you understand the theory, the practice is difficult.  Controls are generally scattered throughout the configuration menus and tabs, requiring a thorough search for all the settings that you require or want.  Should you fail to set up encryption properly, usually you receive no feedback about why.

The closest to an easy process is Enigmail, a Thunderbird extension that includes a setup wizard aimed at beginners.  But you have to know about Enigmail to use it, and the menu it adds to the composition window buries the encryption option one level down and places it with other options guaranteed to mystify everyday users.

No matter what the desktop, the assumption is that, if you want encrypted email, you already understand it.  Today, though, the constant media references to security and privacy have ensured that such an assumption no longer applies.

6.  Thumbnails for Virtual Workspaces

Virtual workspaces offer more desktop space without requiring additional monitors.  Yet, despite their usefulness, management of virtual workspaces hasn't changed in over a decade.  On most desktops, you control them through a pager in which each workspace is represented by an unadorned rectangle that gives few indications of what might be on it except for its name or number -- or, in the case of Ubuntu's Unity, which workspace is currently active.

True, GNOME and Cinnamon do offer better views, but their usefulness is limited by the fact that they require a change of screens.  KDE's written list of contents is no better, jarring as it is in a primarily graphics-oriented desktop.

A less distracting solution might be mouseover thumbnails large enough for those with normal vision to see exactly what is on each workspace.

5.  A Workable Menu

The modern desktop long ago outgrew the classic menu with its sub-menus cascading across the screen.  Today, the average computer simply has too many applications to fit comfortably into such a format.

The trouble is, neither of the major alternatives is as convenient as the classic menu.  Confining the menu into a single window is less than ideal, because you either have to endure truncated sub-menus or else continually resize the window with the mouse.

Yet the alternative of a full-screen menu is even worse.  It means changing screens before you even begin to work, and relying on a search field that is only useful if you already know what applications are available -- in which case you are almost better off launching from the command line.

Frankly, I don't know what the solution might be.  Maybe spinner racks, like those in OS X?  All I can say for certain is that all alternatives for a modern menu make a carefully constructed set of icons on the desktop seem a more reasonable alternative.

4.  A Professional, Affordable Video Editor

Over the years, Linux has slowly filled the gaps in productivity software.  However, one category in which it is still lacking is in reasonably priced software for editing videos.

The problem is not that such software is non-existent.  After all, Maya is one of the industry standards for animation.  The problem is that the software costs several thousand dollars.

At the opposite end of the spectrum are apps like Pitivi or Blender, whose functionality -- despite brave efforts by their developers -- remains basic.  Progress happens, but far more slowly than anyone hopes for.

Although I have heard of indie directors using native Linux video editors, the reason I have heard of their efforts is usually because of their complaints.  Others prefer to minimize the struggle and edit on other operating systems instead.

3.  A Document Processor

At one extreme are users whose need for word processing is satisfied by Google Docs.  At the other extreme are layout experts for whom Scribus is the only feasible app.

In-between are those like publishers and technical writers who produce long, text-oriented documents.  This category of users is served by Adobe FrameMaker on Windows, and to some extent by LibreOffice Writer on Linux.

Unfortunately, these users are apparently not a priority in LibreOffice, Calligra Words, AbiWord, or any other office suite.  Features that would provide for these users include:

  • Separate bibliographic databases for each file
  • Tables that are treated like styles in the same way that paragraphs and characters are
  • Page styles with persistent content other than headers or footers that would appear each time the style is used
  • Storable formats for cross-references, so that the structure doesn't need to be recreated manually each time that it is needed

Whether LibreOffice or another application provides these features is irrelevant compared to whether they are available at all.  Without them, the Linux desktop is an imperfect place for a large class of potential users.

2.  Color-Coded Title Bars

Browser extensions have taught me how useful color-coded tabs can be for workspaces.  The titles of open tabs disappear when more than eight or nine are open, so the color is often the quickest visual guide to the relation between tabs.

The same system could be just as useful on the desktop.  Better yet, the color coding might be preserved between sessions, allowing users to open all the apps needed for a specific task at the same time.  So far, I know of no desktop with such a feature.

1.  Icon Fences

For years, Stardock Systems has been selling a Windows extension called Fences, which lets icons be grouped.  You can name each group and move the icons in it together.  In addition, you can assign which fence different types of files are automatically added to, and hide and arrange fences as needed.
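The rule-based part of what Fences does — routing files to a named group by type — can be sketched in a few lines.  The fence names and extension rules below are invented for illustration:

```python
# Minimal sketch of Fences-style grouping: assign each file to a
# named "fence" based on its extension. Rules are hypothetical.
FENCE_RULES = {".doc": "Documents", ".png": "Images", ".exe": "Programs"}

def assign_fences(filenames):
    fences = {}
    for name in filenames:
        ext = name[name.rfind("."):].lower() if "." in name else ""
        fence = FENCE_RULES.get(ext, "Unsorted")   # unknown types go to a catch-all
        fences.setdefault(fence, []).append(name)
    return fences

print(assign_fences(["a.doc", "b.png", "c.exe", "d.doc"]))
```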

In other words, fences automate the sort of arrangements that users make on their desktop all the time.  Yet aside from one or two minor functions they share with KDE's Folder Views, fences remain completely unknown on Linux desktops.  Perhaps the reason is that designers are focused on mobile devices as the source of ideas, and fences are decidedly a feature of the traditional workstation desktop.

Personalized Lists

As I made this list, what struck me was how few of the improvements were general.  Several of these improvements would appeal largely to specific audiences, and only one even implies the porting of a proprietary application.  At least one is cosmetic rather than functional.

What this observation suggests is that, for the general user, Linux has very little left to add.  As an all-purpose desktop, Linux arrived some years ago, and has been diversifying ever since, until today users can choose from over half a dozen major desktops.

None of that means, of course, that specialists wouldn't have other suggestions.  In addition, changing needs can make improvements desirable that nobody once cared about.  But it does mean that many items on a list of desirable improvements will be highly personal.

All of which raises the question:  What other improvements do you think would benefit the desktop?

Tuesday, June 10, 2014

INTERNET - Internet Giants vs Spy Agencies

"Internet Giants Erect Barriers to Spy Agencies" by DAVID E. SANGER and NICOLE PERLROTH, New York Times 6/6/2014

Just down the road from Google’s main campus here, engineers for the company are accelerating what has become the newest arms race in modern technology:  They are making it far more difficult — and far more expensive — for the National Security Agency and the intelligence arms of other governments around the world to pierce their systems.

As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited.  It is encrypting more data as it moves among its servers and helping customers encode their own emails.  Facebook, Microsoft and Yahoo are taking similar steps.

After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow.  The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers.

Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.

A year after Mr. Snowden’s revelations, the era of quiet cooperation is over.  Telecommunications companies say they are denying requests to volunteer data not covered by existing law.  A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.

But governments are fighting back, harder than ever.  The cellphone giant Vodafone reported on Friday that a “small number” of governments around the world have demanded the ability to tap directly into its communication networks, a level of surveillance that elicited outrage from privacy advocates.

Vodafone refused to name the nations on Friday for fear of putting its business and employees at risk there.  But in an accounting of the number of legal demands for information that it receives from 14 companies, it noted that some countries did not issue warrants to obtain phone, email or web-searching traffic, because “the relevant agencies and authorities already have permanent access to customer communications via their own direct link.”

The company also said it had to acquiesce to some governments’ requests for data to comply with national laws.  Otherwise, it said, it faced losing its license to operate in certain countries.

Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race.

“I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts.  “But signals intercept is totally off the table,” he said, referring to national intelligence gathering.

“No hard feelings, but my job is to make their job hard,” he added.

In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided.

Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.

“Just as there are technological gaps, there are legal gaps,” he said, speaking at the Wilson Center in Washington, “that leave a lot of gray area” governing what companies could turn over.

In the past, he said, “we have been very successful” in getting that data.  But he acknowledged that for now, those days are over, and he predicted that “sooner or later there will be some intelligence failure and people will wonder why the intelligence agencies were not able to protect the nation.”

Companies respond that if that happens, it is the government’s own fault and that intelligence agencies, in their quest for broad data collection, have undermined web security for all.

Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers.  Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies.  It was created by exploiting a previously unknown flaw in Microsoft’s operating systems.  Companies argue that others could have later taken advantage of this defect.

Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack.  The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.

Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code.  That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies.  The first such center is being set up in Brussels.

Microsoft has also pushed back harder in court.  In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials.  Microsoft challenged the gag order as violating the First Amendment.  The government backed down.

Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year.  The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A.  The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.

Most American companies said they never knowingly let the N.S.A. weaken their systems, or install back doors.  But Mr. Snowden’s documents showed how the agency found a way.

In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form.  Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”

Google was already suspicious that its internal traffic could be read, and had started a program to encrypt the links among its internal data centers, “the last chink in our armor,” Mr. Grosse said.  But the slide gave the company proof that it was a regular target of the N.S.A.  “It was useful to have proof, in terms of accelerating a project already underway,” he said.

Facebook and Yahoo have also been encrypting traffic among their internal servers.  And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications.
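As a rough illustration of what Perfect Forward Secrecy looks like in practice: it shows up as the TLS cipher suites that use ephemeral Diffie-Hellman key exchange (the `DHE`/`ECDHE` families), so a stolen long-term key cannot retroactively decrypt recorded traffic.  This sketch just inspects whatever OpenSSL build Python links against, so the exact cipher names and counts will vary by system:

```python
import ssl

# List the cipher suites in the default TLS context that provide forward
# secrecy via ephemeral key exchange (ECDHE or DHE).
ctx = ssl.create_default_context()
fs_ciphers = [c["name"] for c in ctx.get_ciphers()
              if "ECDHE" in c["name"] or c["name"].startswith("DHE")]
print(len(fs_ciphers), "forward-secret suites available")
```

(Note that TLS 1.3 suites always use ephemeral key exchange even though their names no longer say "ECDHE".)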

One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously.  Now they want the same.

At Facebook, Joe Sullivan, the company’s chief security officer, said it had been fending off those demands and heightened expectations.

Until last year, technology companies were forbidden from acknowledging demands from the United States government under the Foreign Intelligence Surveillance Act.  But in January, Google, Facebook, Yahoo and Microsoft brokered a deal with the Obama administration to disclose the number of such orders they receive in increments of 1,000.

As part of the agreement, the companies agreed to dismiss their lawsuits before the Foreign Intelligence Surveillance Court.

“We’re not running and hiding,” Mr. Sullivan said.  “We think it should be a transparent process so that people can judge the appropriate ways to handle these kinds of things.”

The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool.  The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited.

But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide.  The code included the phrase: “ssl-added-and-removed-here-;-)”

Monday, June 2, 2014

SECURITY - Warning, Big Data Brokers

"FTC report warns consumers about big data brokers" PBS NewsHour 5/31/2014


HARI SREENIVASAN (NewsHour):  Earlier this week, the Federal Trade Commission issued a report that contained consumer protection recommendations concerning what’s referred to as “big data” – the companies that collect and sell billions of bits of information about all aspects of our online lives.  Information that includes purchases, income, political affiliations – even religion. As FTC Chairwoman Edith Ramirez put it:

“It’s time to bring transparency and accountability to bear on this industry on behalf of consumers, many of whom are unaware that data brokers even exist.”

For some insight, we turn to Amy Schatz who covers tech policy issues for Re/code.

So, what were the things that this report uncovered that might surprise consumers?

AMY SCHATZ, Re/code:  I think most of the things in the report would surprise consumers, although this isn’t necessarily a new issue – this has been going around for a couple of years – but most people don’t know that there are a bunch of data collectors out there who are collecting data about you.  Whether it’s who you voted for or your political beliefs.  Whether it’s your zip code or what you purchased at the store last week or what you’re looking at online.  There are these profiles that are being created online of most Americans now and that information is being traded and shared in a way that a lot of consumers might find a little troubling.

Friday, May 23, 2014

MICROSOFT - Successfully Challenges FBI

"Microsoft Successfully Challenges FBI Order For User Info" by Amy Lee, Cruxial 5/23/2014

Documents related to Microsoft's successful challenge of a governmental request for information about one of the company's customers have been unsealed.

The order, a Federal Bureau of Investigation National Security Letter, sought "basic subscriber information" about one of Microsoft's enterprise customers, according to a post by Microsoft’s general counsel and executive VP of Legal and Corporate Affairs, Brad Smith, on the company's TechNet blog.

A federal court in Seattle unsealed the documents on May 22, 2014.

"This marks an important and successful step to protect Microsoft's enterprise customers regarding government surveillance," Smith wrote.

Microsoft challenged the nondisclosure provision of the Letter in June 2013, arguing that it would violate the First Amendment.

"It did so by hindering our practice of notifying enterprise customers when we receive legal orders related to their data," Smith wrote.

After the petition was filed, the FBI withdrew the Letter.  According to Smith, governmental requests for information related to enterprise customers are "extremely rare."

In the previous cases where similar requests for information occurred, Microsoft was able to obtain permission from the customer in question, or to ask directly.  In this case, the FBI was able to get the info from the customer, according to the notice of withdrawal.

Microsoft, along with major tech firms like Apple, Facebook and Google, have ramped up their efforts to gain greater abilities to disclose the government's requests for data to their customers.

"As more users migrate from locally installed software and locally stored data to cloud-based computing platforms, Microsoft increasingly is entrusted to store its customers' data safely and securely," the petition states.

In December 2013, Smith wrote a post on the TechNet blog reaffirming Microsoft's commitment to protecting customer data, and promising to inform customers of any legal orders Microsoft receives or to challenge any gag orders prohibiting them from doing so.

Smith also stated the company’s belief that when seeking information, government agencies should go directly to customers except in exceptional circumstances, "just as they did before customers moved to the cloud."

With cloud services, such as Microsoft's Office 365, customer data is stored in Microsoft data centers, rather than on the customer's own systems.

Earlier this month, Glenn Greenwald, who has been a key part of disclosures related to how the National Security Agency collects information, published documents including details of Microsoft's relationship with the agency.

Greenwald had previously claimed in July 2013 that Microsoft had worked with the NSA to circumvent encryption on Outlook and had also worked with the FBI to help them better collect information from OneDrive.  Microsoft responded to the allegations shortly after, with the basic message that the company complies with data requests only when legally necessary.

"This new capability will result in a much more complete and timely collection response [...] for our enterprise customers.  This success is the result of the FBI working for many months with Microsoft to get this tasking and collection solution established," the recently revealed document states.


"Net Neutrality Fans Aren't Going To Like This Chart" by Gerry Smith, Huffington Post 5/22/2014

When news broke in February that streaming giant Netflix would pay Comcast for direct access to the cable company's broadband network, some experts said it marked the beginning of the end of net neutrality.

Yet a new report says that such deals are far more widespread than many realized at the time.

Many large tech companies -- including Google, Microsoft, Apple, Amazon and Facebook -- have quietly brokered deals with Internet providers to ensure their content is not slowed as it travels through their networks, according to a blog post published Wednesday by telecom analyst Dan Rayburn.

It's unclear whether these deals were brokered before or after a federal court in January struck down rules that maintained net neutrality, which is the principle that all Internet traffic should be equally accessible to consumers.  But Rayburn, an analyst at the research firm Frost & Sullivan, said such arrangements between web companies and Internet providers are nothing new.

"There are a lot of these deals in the market and have been for many, many years," he wrote on his blog.

Google, Microsoft, Apple, Amazon and Facebook did not return requests for comment.

Netflix, for example, accounts for roughly 30 percent of all web traffic.  Because data-heavy videos can create traffic jams on broadband networks, the company is paying Comcast to ensure its videos are streamed to customers more smoothly.

Such deals pertain to how Internet traffic flows between your Internet provider and third-party middlemen who operate the backbone of the web.

Those deals are technically beyond the scope of the Federal Communications Commission's recent proposal to allow Internet providers to charge web companies more to deliver their content via a "fast lane."  The FCC's proposed fast lanes only relate to the so-called last mile of online traffic that flows directly to customers' homes.

On Tuesday, FCC Chairman Tom Wheeler told a congressional panel that the FCC would start looking more closely at the type of deals that Rayburn highlighted.

In his blog post, Rayburn said the deals are fair.  If companies like Netflix didn't pay extra to ensure their content was delivered smoothly, Internet providers would be forced to raise prices on customers by passing on the extra cost of handling the increased traffic from all of Netflix's streaming videos.

This chart from Rayburn's blog indicates deals between tech companies and Internet providers:

Monday, May 19, 2014

SUPERCOMPUTERS - Canonical and China Collaboration in Cloud Computing

"NUDT and Canonical bring OpenStack to world’s fastest supercomputer" by Canonical 5/4/2014

China’s National University of Defense Technology, NUDT, developers of the Tianhe-2 supercomputer, and Canonical, the organization behind Ubuntu, today announce a collaboration to bring OpenStack to the world’s fastest supercomputer for high performance cloud environments.

The new collaboration with Canonical will enable Ubuntu Server, Ubuntu OpenStack and Ubuntu’s orchestration tool, Juju, to run on Tianhe-2.  Today, Ubuntu OpenStack is running on 256 high performance nodes and this will grow to over 6400 nodes in the coming months.  The nodes will be available to Government departments in Guangdong province as well as other NUDT partners for analysis, census, and eGovernment applications.

Both OpenStack and Ubuntu’s orchestration tool, Juju, will run on Tianhe-2 to enable NUDT partners and affiliates to rapidly deploy and manage very high performance cloud environments.  The Juju orchestration tool makes it easy to design, deploy, scale and manage cloud workloads in OpenStack (cloud) environments.  Workloads running on Tianhe-2 will enjoy higher interconnect bandwidth and computing power for floating-point-heavy or memory-intensive applications.

Professor QingBo Wu at NUDT comments:  “NUDT is a pioneer of technology, especially in the area of high performance.  Tianhe-2, the world’s fastest supercomputer, runs on Ubuntu Kylin and now with OpenStack and Ubuntu Juju, we are able to deliver high performance OpenStack.”

“To see the fastest supercomputer running OpenStack is already a beautiful thing,” said Mark Shuttleworth, founder of Ubuntu.  “To see it running OpenStack with workloads orchestrated by Ubuntu Juju is incredibly powerful.  We can’t wait to see it rolled out further.”

NUDT designed and built Tianhe-2, which runs on its own Kylin Cloud Linux operating system and has held the record for the world’s fastest supercomputer since 2013, having recorded a Linpack Performance (Rmax) of 33,862.7 TFlop/s.  The servers use Intel Xeon processors, Intel Xeon Phi co-processors and a 160Gb per second interconnect for super-fast data transfer between nodes.

Wednesday, May 14, 2014

SECURITY - Comic Book With a Cybersecurity Theme as a Teaching Tool

"‘Cynja’ battles botnets and other cyber-scourges" by Larisa Epatko, PBS NewsHour 5/12/2014

The PBS NewsHour’s Hari Sreenivasan speaks to co-authors Chase Cunningham and Heather Dahl about their new comic book, “The Cynja”.

Fictional character Grant Wiley, 11, is a wiz on computers.  One day, his favorite teacher disappears, leaving nothing at his desk but a smoldering USB stick.  Suspecting his help might be needed, Grant grabs the stick and plugs it into his computer at home.

He’s instantly sucked into the Internet and thus begins his adventures as a newly trained “cynja” fighting computer worms, hackers, malware … and worse.

Authored by Chase Cunningham and Heather Dahl, who both work for cybersecurity consulting firms, “The Cynja: Volume 1” aims to introduce children to the world of cybersecurity and teach them how to protect their computers.

Dahl said in an interview with the PBS NewsHour’s Hari Sreenivasan that she was motivated to take on the project when she tried to find a book to teach her young nephew about “the bad guys who live in our computers.”  But “I couldn’t find anything that showed the world I work in,” so she approached Cunningham to help fill the void.

Cunningham said his goal was to create a relatable character that could explain to children what people are doing to protect their cyber-future.  “I’ve worked in this industry for a long time and a lot of the guys that are out there doing what they can to protect the Internet and keep us safe, they don’t have badges and they’re not policemen or firemen or something like that.”

His hope is readers will better understand the role of these protectors and think, “You’re a cynja, you’re a cyberspace ninja — that’s cool.”

Monday, May 5, 2014

LINUX - Ubuntu 14.04 LTS

The latest Linux Distribution of Ubuntu is 14.04 (Ubuntu 14.04 LTS).  Note LTS = Long Term Support, which means five years of security and maintenance updates, guaranteed.

I upgraded in place (via Ubuntu Software Update) from Ubuntu 13.10 with no problems; only one minor utility stopped working, which is no loss (it is unsupported software).

My new laptop (which came with Ubuntu 13.10) is 64bit, which means my Ubuntu 14.04 is the 64bit version.

My desktop:

Some features seen on desktop:
  • The orange Ubuntu icon on the top-right Title Bar is the treed Classic Menu add-on (I hate using search to find applications)
  • Note, you can get Steam for Ubuntu
  • The blue icon with the arrow in the Unity Bar (left side of desktop) is Krusader split-panel file manager, which has an option to run in Root Mode (Root is the equivalent of Windows Administrator mode)
  • As you can see Ubuntu comes with Firefox WEB browser
  • And my favorite Desktop Calendar "Rainlendar Lite" (free version) which I also have on my WinXP Desktop rig


Rainlendar is NOT included in the Ubuntu Software Center but can be downloaded from their site.

Rainlendar Home - Rainlendar all version download
The install package is a .deb file; I used the 64bit Debian/Ubuntu version.

WARNING:  Do NOT use Ubuntu Software Center (the default installer) to install!

Use the GDebi Package Installer, which comes with Ubuntu 14.04.  I recommend this installer for any Debian software NOT found in the Ubuntu Software Center or Synaptic Package Manager.

Friday, April 25, 2014

INTERNET - FCC Goes For Non-Neutrality

Consumers, bend over and spread cheeks.

"F.C.C., in a Shift, Backs Fast Lanes for Web Traffic" by EDWARD WYATT, New York Times 4/23/2014


The principle that all Internet content should be treated equally as it flows through cables and pipes to consumers looks all but dead.

The Federal Communications Commission said on Wednesday that it would propose new rules that allow companies like Disney, Google or Netflix to pay Internet service providers like Comcast and Verizon for special, faster lanes to send video and other content to their customers.

The proposed changes would affect what is known as net neutrality — the idea that no providers of legal Internet content should face discrimination in providing offerings to consumers, and that users should have equal access to see any legal content they choose.

The proposal comes three months after a federal appeals court struck down, for the second time, agency rules intended to guarantee a free and open Internet.

Tom Wheeler, the F.C.C. chairman, defended the agency’s plans late Wednesday, saying speculation that the F.C.C. was “gutting the open Internet rule” is “flat out wrong.”  Rather, he said, the new rules will provide for net neutrality along the lines of the appeals court’s decision.

Still, the regulations could radically reshape how Internet content is delivered to consumers.  For example, if a gaming company cannot afford the fast track to players, customers could lose interest and its product could fail.

The rules are also likely to eventually raise prices as the likes of Disney and Netflix pass on to customers whatever they pay for the speedier lanes, which are the digital equivalent of an uncongested car pool lane on a busy freeway.

Consumer groups immediately attacked the proposal, saying that not only would costs rise, but also that big, rich companies with the money to pay large fees to Internet service providers would be favored over small start-ups with innovative business models — stifling the birth of the next Facebook or Twitter.

“If it goes forward, this capitulation will represent Washington at its worst,” said Todd O’Boyle, program director of Common Cause’s Media and Democracy Reform Initiative.  “Americans were promised, and deserve, an Internet that is free of toll roads, fast lanes and censorship — corporate or governmental.”

If the new rules deliver anything less, he added, “that would be a betrayal.”

Mr. Wheeler rebuffed such criticism.  “There is no ‘turnaround in policy,’ ” he said in a statement.  “The same rules will apply to all Internet content.  As with the original open Internet rules, and consistent with the court’s decision, behavior that harms consumers or competition will not be permitted.”

Broadband companies have pushed for the right to build special lanes.  Verizon said during appeals court arguments that if it could make those kinds of deals, it would.

Under the proposal, broadband providers would have to disclose how they treat all Internet traffic and on what terms they offer more rapid lanes, and would be required to act “in a commercially reasonable manner,” agency officials said.  That standard would be fleshed out as the agency seeks public comment.

"Consumer groups warn dismantling net neutrality could stymie startup innovation" PBS NewsHour 4/24/2014


SUMMARY:  The Federal Communications Commission is on the brink of changing the longstanding net neutrality principle, which allows consumers unfettered access to web content, and limits the ability of Internet service providers to block or filter material.  New guidelines would allow some companies to charge more (to the content provider, like YouTube) for faster service.  Gwen Ifill talks to Cecilia Kang of The Washington Post about what’s at stake.

Monday, April 21, 2014

INTERNET - Comments as Venues For Rudeness or Insults

"Taming the ‘Wild West’ of online comments" PBS NewsHour 4/20/2014


SUMMARY:  More and more websites are including online commenting as a feature for their visitors.  But sometimes the comment boards become venues for rudeness and insults.  These comments can influence how a reader perceives the story.  Hari Sreenivasan speaks with web experts who help manage online communities and comments in different ways.

Thursday, April 10, 2014

SECURITY - Heartbleed Hacks SSL Security Servers

Heartbleed exploits a flaw in OpenSSL, the software behind the SSL protocol that protects HTTPS sites.

"Security bug Heartbleed could have provided key that unlocks personal online data" PBS NewsHour 4/9/2014


GWEN IFILL (NewsHour):  You may have heard headlines today about a major lapse in Internet security and the possibility that millions of passwords, credit card numbers, bank information, and commonly used Web sites could have been exposed.

It involves a bug or security leak called Heartbleed, which can be used to read encrypted information.

Hari Sreenivasan gets a breakdown on what you need to know.

HARI SREENIVASAN (NewsHour):  Essentially, Heartbleed can be used to read the memory of computer servers, the places behind a Web site that store your information, including the lock and key system which protects your usernames and passwords.

You probably see this encryption in the form of a green lock when you conduct a transaction and exchange information.  The breach was revealed this week, but apparently has existed for a long time.
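A toy sketch of how the bug works (this is NOT OpenSSL's actual code; the names and the "memory" layout are invented for illustration):  the server echoes back as many bytes as the client *claims* it sent, without checking the claim, so over-asking leaks whatever happens to sit next to the request in memory.

```python
# Simplified model of the Heartbleed bug.  Real OpenSSL read past the end
# of a heap buffer; here adjacent "process memory" is simulated by
# concatenating secrets after the request payload.
SERVER_MEMORY = b"secret-password-123;session-key=abcd"

def heartbeat(payload: bytes, claimed_len: int) -> bytes:
    buf = payload + SERVER_MEMORY   # the request sits next to secrets
    return buf[:claimed_len]        # BUG: claimed_len is never validated

print(heartbeat(b"ping", 4))    # honest request: echoes back b'ping'
print(heartbeat(b"ping", 24))   # over-asks, leaking bytes beyond the payload
```

The fix, both here and in OpenSSL, is simply to reject any request where the claimed length exceeds the actual payload length.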

Russell Brandom of The Verge, an online site covering tech news, is here to help explain.

Wednesday, April 9, 2014

WINDOWS XP - The Enhanced Mitigation Experience Toolkit (EMET)

Now that SECURITY support for ordinary users of Windows XP is ended, here's an alternative way to protect WinXP.

Note that Microsoft Update (which you should be using instead of Windows Update) will still update some Microsoft software, like the "Malicious Software Removal Tool."  What stops is security updates to WinXP itself.

The alternative protection is Microsoft's Enhanced Mitigation Experience Toolkit (EMET).

WARNING:  The EMET is NOT for amateurs.  If used incorrectly it can cause problems with WinXP.  But if you use Recommended Settings on installation, and the Quick Profile Name [Recommended Security settings], it should be safe.

Note that EMET is for all versions of Windows, but some features are not available in WinXP.

Here's a screenshot of my EMET GUI:

With WinXP, SEHOP & ASLR are not available.

There are Software Profiles you can [Import].  I imported Popular Software.

From the support page in above link:

What is the Enhanced Mitigation Experience Toolkit?

The Enhanced Mitigation Experience Toolkit (EMET) is a utility that helps prevent vulnerabilities in software from being successfully exploited.  EMET achieves this goal by using security mitigation technologies.  These technologies function as special protections and obstacles that an exploit author must defeat to exploit software vulnerabilities.  These security mitigation technologies do not guarantee that vulnerabilities cannot be exploited.  However, they work to make exploitation as difficult as possible to perform.

EMET 4.0 and newer versions also provide a configurable SSL/TLS certificate pinning feature that is called Certificate Trust.  This feature is intended to detect man-in-the-middle attacks that are leveraging the public key infrastructure (PKI).
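At its core, certificate pinning of this kind boils down to comparing a hash of the certificate received during the TLS handshake against a value you already trust.  Here is a minimal sketch in Python, with made-up bytes standing in for a real DER-encoded certificate (a real check would hash the blob from, e.g., `ssl.SSLSocket.getpeercert(binary_form=True)`):

```python
import base64
import hashlib

def matches_pin(cert_der: bytes, expected_pin: str) -> bool:
    """Return True if the certificate's SHA-256 matches the stored base64 pin."""
    digest = hashlib.sha256(cert_der).digest()
    return base64.b64encode(digest).decode("ascii") == expected_pin

# Hypothetical certificate bytes, for illustration only.
cert = b"\x30\x82\x01\x0a-dummy-der-bytes"
pin = base64.b64encode(hashlib.sha256(cert).digest()).decode("ascii")

print(matches_pin(cert, pin))              # the pinned cert is accepted
print(matches_pin(b"attacker-cert", pin))  # a man-in-the-middle cert is rejected
```

A mismatch is exactly what signals the man-in-the-middle attacks Certificate Trust is designed to detect: the attacker can present a certificate the PKI will accept, but not one whose hash matches the stored pin.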

Are there restrictions as to the software that EMET can protect?

EMET can work together with any software, regardless of when it was written or by whom it was written.  This includes software that is developed by Microsoft and software that is developed by other vendors.  However, you should be aware that some software may not be compatible with EMET.  For more information about compatibility, see the "Are there any risks in using EMET?" section.

What are the requirements for using EMET?

EMET 3.0 requires the Microsoft .NET Framework 2.0.
EMET 4.0 and 4.1 require the Microsoft .NET Framework 4.0. Additionally, for EMET to work with Internet Explorer 10 on Windows 8, KB2790907 must be installed.

The Microsoft Download page for EMET.  You should download both the Setup and Guide.

Note that EMET is just a GUI that makes setting various Windows options easier.

Also, I did try DEP [Always On] (maximum protection settings), but that prevented two of my boot-time apps, like MiniMinder, from running.  So I changed back to the settings you see in my GUI screenshot.

Wednesday, March 26, 2014

"Microsoft makes source code for MS-DOS and Word for Windows available to public" by Roy Levin (Microsoft Research), Official Microsoft Blog 3/25/2014

On Tuesday, we dusted off the source code for early versions of MS-DOS and Word for Windows.  With the help of the Computer History Museum, we are making this code available to the public for the first time.

The museum has done an excellent job of curating some of the most significant historical software programs in computing history.  As part of this ongoing project, the museum will make available two of the most widely used software programs of the 1980’s, MS DOS 1.1 and 2.0 and Microsoft Word for Windows 1.1a, to help future generations of technologists better understand the roots of personal computing.

In 1980, IBM approached Microsoft to work on a project code-named “Chess.”  What followed was a significant milestone in the history of the personal computer.  Microsoft, at the time, provided the BASIC language interpreter for IBM.  However, they had other plans and asked Microsoft to create an operating system.  Without their own on hand, Microsoft licensed an operating system from Seattle Computer Products which would become the foundation for PC-DOS and MS-DOS.

IBM and Microsoft developed a unique relationship that paved the way for advancements in the nascent personal computer industry, and subsequent advancements in personal computing.

Bill Gates was interviewed by David Bunnell just after the launch of the IBM PC in the early 1980s for PC Magazine’s inaugural issue, and provided the backstory:  “For more than a year, 35 of Microsoft's staff of 100 worked fulltime (and plenty of overtime) on the IBM project.  Bulky packages containing computer gear and other goodies were air-expressed almost daily between the Boca Raton [IBM] laboratory and Seattle [Microsoft].  An electronic message system was established and there was almost always someone flying the arduous 4,000 mile commute.”

Following closely on the heels of MS DOS, Microsoft released the first DOS-based version of Microsoft Word in 1983, which was designed to be used with a mouse.  However, it was the 1989 release of Word for Windows that became a blockbuster for the company and within four years it was generating over half the revenue of the worldwide word-processing market.  Word for Windows was a remarkable engineering and marketing achievement, and we are happy to provide its source code to the museum.

It’s mind-boggling to think of the growth from those days when Microsoft had under 100 employees and a Microsoft product (MS-DOS) had less than 300KB (yes, kilobytes) of source code.  From those roots we’ve grown in a few short decades to become a company that has sold more than 200 million licenses of Windows 8 and has over 1 billion people using Microsoft Office.  Great things come from modest beginnings, and the great Microsoft devices and services of the future will probably start small, just as MS-DOS and Word for Windows did.

Thanks to the Computer History Museum, these important pieces of source code will be preserved and made available to the community for historical and technical scholarship.

Tuesday, March 25, 2014

THE WEB - Who Should Oversee It

The title of this article is slightly misleading to non-techies.  NO single entity controls the WEB.  The real issue is who oversees the assignment of Internet Protocol (IP) addresses and the mapping of domain names to those addresses.

"As the U.S. government relinquishes control, who should oversee the Web?" PBS NewsHour 3/24/2014


SUMMARY:  The Commerce Department recently announced it would give up oversight of ICANN, the California nonprofit that manages the unique domains of the world's websites and email servers.  There's been international pressure to make the change, especially in light of revelations about NSA surveillance.  Vint Cerf of Google and Randolph May of the Free State Foundation join Judy Woodruff to offer debate.

JUDY WOODRUFF (NewsHour):  Who controls the World Wide Web, and how is it overseen and governed?  These are the questions that most of us don’t really know the answers to, but the Obama administration announced a change in the role played by the United States, one that’s stirring up concerns about the Internet’s future and freedom from censorship.

FADI CHEHADE, CEO, ICANN:  To become the world’s ICANN, we have to go to the world.

JUDY WOODRUFF:  Change was in the wind as the Internet Corporation for Assigned Names and Numbers, ICANN, kicked off a meeting in Singapore this weekend, its purpose, to start crafting a transition from U.S. control of administration of the Internet.

Since 1998, the California nonprofit has had a federal contract to manage the unique identifiers of the world’s Web sites and e-mail servers, regulating domain names such as dot-com and dot-gov.
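To make concrete what the article means by "managing the unique identifiers" of the Web, here is a toy sketch of the core mapping involved: domain names to IP addresses.  This is not real DNS code, and the records below are illustrative placeholders, not actual registry data.

```python
# A minimal stand-in for a name registry: domain name -> IPv4 address.
# Real resolution is distributed across root, TLD, and authoritative
# name servers; ICANN coordinates the top of that hierarchy.
dns_records = {
    "example.com": "93.184.216.34",
    "example.org": "93.184.216.34",
}

def resolve(domain):
    """Return the IP address registered for a domain, or None if unregistered.
    Domain names are case-insensitive, so we normalize to lowercase."""
    return dns_records.get(domain.lower())

print(resolve("Example.com"))          # prints the registered address
print(resolve("no-such-domain.test"))  # prints None for unregistered names
```

The point of the sketch: somebody has to decide who gets to put entries into this table, and for which top-level domains.  That coordination role, not day-to-day control of Web content, is what the U.S. government has contracted to ICANN since 1998.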

Wednesday, March 19, 2014

WINXP - Updates to Continue for Big Business For a Fee

More proof that Microdunce does not care about peon customers.  They are just another greedy company that cares only about profits, not about serving the customers who bought their product.  I would be willing to pay $50/year for continued WinXP Updates.

This strategy was recently confirmed when several banks made the Updates-For-Fee deal with Microdunce to protect their ATMs running WinXP.

"Microsoft will still patch Windows XP for a select group" by Gregg Keizer, PCWorld 9/1/2013


Just because Microsoft doesn't plan on giving Windows XP patches to the public after April 8, 2014, doesn't mean it's going to stop making those patches.

In fact, Microsoft will be creating security updates for Windows XP for months—years, even—after it halts their delivery to the general public.

Some will pay big for support

Those patches will come from a program called "Custom Support," an after-retirement contract designed for very large customers who have not, for whatever reason, moved on from an older OS.

As part of Custom Support—which according to analysts, costs about $200 per PC for the first year and more each succeeding year—participants receive patches for vulnerabilities rated "critical" by Microsoft.  Bugs ranked as "important," the next step down in Microsoft's four-level threat scoring system, are not automatically patched.  Instead, Custom Support contract holders must pay extra for those.  Flaws pegged as "moderate" or "low" are not patched at all.
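As a rough illustration of the pricing structure just described, here is a back-of-the-envelope cost model.  The ~$200-per-PC first-year figure comes from the analysts quoted in the article; the 50 percent year-over-year increase is purely a made-up assumption for the sketch, since the article says only that the price rises "each succeeding year."

```python
def custom_support_cost(pcs, years, base_per_pc=200.0, annual_increase=0.5):
    """Estimate total Custom Support fees for a fleet of PCs.

    base_per_pc: first-year cost per PC (~$200 per the article).
    annual_increase: assumed fractional price rise each year (hypothetical).
    Note: this covers only 'critical' patches; 'important' ones cost extra.
    """
    total = 0.0
    per_pc = base_per_pc
    for _ in range(years):
        total += pcs * per_pc
        per_pc *= 1 + annual_increase
    return total

# A bank with 1,000 XP machines, first year only:
print(custom_support_cost(1000, 1))  # 200000.0
```

Even under these toy numbers it is easy to see why "some will pay big": a modest fleet runs into six figures in year one, and Microsoft sells the program for up to three years.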

"Legacy products or out-of-support service packs covered under Custom Support will continue to receive security hotfixes for vulnerabilities labeled as 'Critical' by the MSRC [Microsoft Security Response Center]," Microsoft said in a Custom Support data sheet.  "Customers with Custom Support that need security patches defined as 'Important' by MSRC can purchase these for an additional fee.

"These security hotfixes will be issued through a secure process that makes the information available only to customers with Custom Support," the data sheet promised.

Because Microsoft sells Custom Support agreements, it's obligated to come up with patches for critical and important vulnerabilities.  And it may be required to do so for years: The company sells Custom Support for up to three years after it retires an operating system.

Custom Support and the XP security updates that result have been one reason why some experts have held out hope that Microsoft will backtrack from retiring XP next April.  Their reasoning is straightforward: Microsoft will have patches available—its engineers won't have to do any more work than they already committed to doing—so handing them out to all would be a simple matter.

Or not.  Most experts have said that the chance Microsoft will prolong Windows XP's life runs between slim and none.  And giving away patches to everyone risks a revolt by those big customers who have paid millions for Custom Support.

But Microsoft does have options.  Here are our suggestions:

Continue patching for free

If Windows XP remains a major presence, as appears likely, with projections as high as 33.5 percent of all personal computers at the end of April 2014, Microsoft could decide to continue patching the aged OS with free fixes for critical vulnerabilities, maybe even those rated important.

Such a move would be unpalatable to Custom Support customers, but Microsoft could renegotiate the fees—unlikely—or remind those companies of the program's other benefits, which include access to support representatives, as well as to prior patches and hotfixes.

Patch critical vulnerabilities under attack

Microsoft could selectively patch only the critical bugs that are being exploited by hackers.  Presumably, that would be a subset of the complete XP patch collection assembled each month.

Some analysts have picked this option as a possibility.  Last December, Michael Cherry of Directions on Microsoft posed just such a situation.

"Suppose ... a security problem with XP suddenly causes massive problems on the Internet, such as a massive [denial-of-service] problem?" asked Cherry at the time.  "It is not just harming Windows XP users, it is bringing the entire Internet to its knees.  At this time there are still significant numbers of Windows XP in use, and the problem is definitely due to a problem in Windows XP.  In this scenario, I believe Microsoft would have to do the right thing and issue a fix ... without regard to where it is in the support lifecycle."

Charge users for XP patches

Although Microsoft would much rather book revenue from the sale of a newer OS, it may realize that some will refuse to upgrade, and try to make money rather than give away fixes.

It's unlikely that Microsoft would be able to charge $200 annually for post-retirement patches, as it does with Custom Support customers, but it may be able to get away with $50 a year for individuals and small businesses, perhaps with a maximum machine cap at, say, five PCs per customer.
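The consumer scheme floated above is simple enough to sketch directly.  The $50-per-machine price and the five-PC cap are the article's hypothetical numbers, not anything Microsoft announced.

```python
def consumer_patch_fee(pcs, per_pc=50, cap_pcs=5):
    """Hypothetical consumer pricing from the article:
    $50/year per machine, billed on at most five PCs per customer."""
    return min(pcs, cap_pcs) * per_pc

print(consumer_patch_fee(3))  # 150
print(consumer_patch_fee(8))  # 250 -- the cap means PCs beyond five are free
```

The cap keeps the plan from cannibalizing Custom Support: a business with hundreds of machines could not get by on the $250/year consumer ceiling.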

Traditionally, Microsoft has not charged for security patches, but it could cast this as a special situation caused by the longevity of XP, which stemmed from the delay of Vista and, secondarily, that OS's subsequent flop.  In late 2007, when Microsoft extended XP availability to OEMs by several months, it cited Vista's delayed launch for the unusual move.  (It added another extension in 2008 that kept XP alive on new "netbook" PCs, the then-popular class of cheap laptops, until mid-2010.)

And Microsoft has talked up a transformation to a "devices-and-services" company; a pay-for-support plan would mesh nicely with the latter half of that strategy.

Thursday, March 13, 2014

WORLD WIDE WEB - 25th Birthday

"25 years on, still adapting to life tangled up in the Web" PBS NewsHour 3/12/2014


JUDY WOODRUFF (NewsHour):  The World Wide Web turns 25 years old today.  The date marks the publication of a paper that originally laid out the concept, which eventually led to the vast system of Internet sites we now use.

Jeffrey Brown looks at how it’s changed the world we live in.

JEFFREY BROWN (NewsHour):  One way to do that is to look at how individual Americans think about the Internet and its impact on their lives.

The Pew Research Internet Project did that in a survey just out.  Among much else, it finds that 87 percent of American adults now use the Internet, and the number goes up to 97 percent for young adults from 18 to 29.  Ninety percent of Internet users say the Internet has been a good thing for them personally, though the number drops to 76 percent when asked if the Internet has been a good thing for society generally, with 15 percent saying it’s been bad for society.

And 53 percent of Internet users say the Internet would be, at minimum, very hard to give up.

We’re joined by three people who’ve watched the growth of the Internet from different angles.  Xeni Jardin is a journalist and editor at the Web blog Boing Boing, which covers technology and culture.  Catherine Steiner-Adair is a clinical and consulting psychologist at Harvard Medical School, and author of “The Big Disconnect: Protecting Childhood and Family Relationships in the Digital Age.”  And Daniel Weitzner teaches computer science and Internet public policy at MIT.  From 2011 to 2012, he was U.S. deputy chief technology officer in the White House.

And welcome to all of you.

And, Daniel Weitzner, I will start with you, because you worked with Tim Berners-Lee, who — one of the main people that started all this 25 years ago.  What has — what surprises you now, sitting here 25 years later, about where we’re at?

DANIEL WEITZNER, Massachusetts Institute of Technology:  Well, it does surprise me how tremendously the Internet and the Web has grown into every aspect of our lives.

I think that a lot of us who were involved in the early days of the Internet and the Web had hoped that it could really reach the whole world.  And there’s no question that Tim Berners-Lee, who — whose architecture for the World Wide Web really helped it to grow, had the ambition that it in fact cover the whole world — represent everything in the world.  But I think it’s amazing how far we have actually come in that direction.

Tuesday, February 4, 2014

MICROSOFT - Satya Nadella Named New CEO

"Nadella to head Microsoft; Gates leaves chair role" by AP, Washington Post 2/4/2014

Microsoft has named Satya Nadella, an executive in charge of the company’s small but growing business of delivering software and services over the Internet, as its new CEO.  Company founder Bill Gates is leaving the chairman role for a new role as technology adviser.

The software company announced Tuesday that Nadella will replace Steve Ballmer, who said in August that he would leave the company within 12 months.  Nadella will become only the third leader in the software giant’s 38-year history, after Gates and Ballmer.  Board member John Thompson will serve as Microsoft’s new chairman.

Nadella, who is 46 and has worked at Microsoft for 22 years, has been an executive in some of the company’s fastest-growing and most-profitable businesses, including its Office and server and tools business.

For the past seven months, he was the executive vice president who led Microsoft’s cloud computing offerings.  That’s a new area for Microsoft, which has traditionally focused on software installed on personal computers rather than on remote servers connected to the Internet.  Nadella’s group has been growing strongly, although it remains a small part of Microsoft’s current business.

“Satya is a proven leader with hard-core engineering skills, business vision and the ability to bring people together,” Gates said in a statement.  “His vision for how technology will be used and experienced around the world is exactly what Microsoft needs as the company enters its next chapter of expanded product innovation and growth.”

The company said that Gates, in his new role as founder and technology adviser, “will devote more time to the company, supporting Nadella in shaping technology and product direction.”

Gates will also remain a member of Microsoft’s board.

Analysts hope that Nadella can maintain the company’s momentum in the rapidly expanding field of cloud computing while minimizing the negative impact from Microsoft’s unprofitable forays into consumer hardware.  Major rivals in cloud computing include Google Inc. and IBM Corp., among others.

FBR Capital Markets analyst Daniel Ives said he views Nadella as a “safe pick.”

Ives said investors are worried that rivals “from social, enterprise, mobile, and the tablet segments continue to easily speed by the company.”  In a note to investors, he said the company’s main need now is “innovation and a set of fresh new strategies to drive the next leg of growth.”

Microsoft shares rose 8 cents to $36.56 in morning trading Tuesday.

Nadella’s appointment comes at a time of turmoil for Microsoft.

Founded in April 1975 by Gates and Paul Allen, the company has always made software that powered computers made by others — first with its MS-DOS system, then with Windows and its Office productivity suite starting in the late 1980s.  Microsoft’s coffers swelled as more individuals and businesses bought personal computers.

But Microsoft has been late adapting to developments in the technology industry.  It allowed Google to dominate in online search and advertising, and it watched as iPhones, iPads and Android devices grew to siphon sales from the company’s strengths in personal computers.  Its attempt to manufacture its own devices has been littered with problems, from its quickly aborted Kin line of phones to its still-unprofitable line of Surface tablets.

Analysts see hope in some of the businesses Nadella had a key role in creating.

Microsoft’s cloud computing offering, Azure, and its push to have consumers buy Office software as a $100-a-year Office 365 subscription are seen as the biggest drivers of Microsoft’s growth in the next couple of years.  Both businesses saw the number of customers more than double in the last three months of the year, compared with a year earlier.

Those businesses, along with other back-end offerings aimed at corporate customers, are the main reason why investment fund ValueAct Capital invested $1.6 billion in Microsoft shares last year.

Last April, the fund urged investors to ignore the declining PC market — which hurts Microsoft’s Windows business — and to focus on the so-called “plumbing” that Microsoft provides to help companies analyze massive amounts of data and run applications essential to their businesses on Microsoft’s servers or their own.

“Satya was really one of the people who helped build up the commercial muscle,” said Kirk Materne, an analyst with Evercore Partners.  “He has a great understanding of what’s going on in the cloud and the importance of delivering more technology as a service.”

Nadella is a technologist, fulfilling the requirement that Gates set out at the company’s November shareholder meeting, where the Microsoft chairman said the company’s new leader must have “a lot of comfort in leading a highly technical organization.”

Born in Hyderabad, India, in 1967, Nadella received a bachelor’s degree in electrical engineering from Mangalore University, a master’s degree in computer science from the University of Wisconsin, Milwaukee, and a master’s of business administration from the University of Chicago.

He joined Microsoft in 1992 after being a member of the technology staff at Sun Microsystems.

One of his first tasks will be integrating Nokia’s money-losing phone and services business.  Microsoft agreed in September to buy that and various phone patent rights for 5.4 billion euros ($7.3 billion) in one of Ballmer’s last major acts as CEO.  That deal is expected to be completed by the end of March.

Partly because of Nadella’s insider status and the fact that both Gates and Ballmer will remain Microsoft’s largest shareholders and, for now, company directors, analysts aren’t expecting a quick pivot in the strategy of making its own tablets and mobile devices.

Some hope, however, that he will make big changes that will help lift Microsoft stock, which has been stuck in the doldrums for more than a decade.  Since Ballmer took office on Jan. 13, 2000, Microsoft shares are down a split-adjusted 32 percent, compared with a 20 percent gain in the S&P 500.

“We do not want to see a continuation of the existing direction for the business, so it will be important that Mr. Nadella be free to make changes,” Nomura analyst Rick Sherlund wrote in a note Friday.