Thursday, August 30, 2012

Telework - the future. And more productive

(Tom paraphrasing/citing here...)

Many firms are uncertain about what home-working policies to adopt. As a result, firms in very similar industries adopt extremely different practices. In the U.S. airline industry, for example, JetBlue allows all regular call-center employees to work from home.

The trade-off between home life and work life has also received increasing attention as the share of US households in which all parents work has increased from 25% in 1968 to 48% by 2008 (Council of Economic Advisers, 2010). These rising work pressures are leading governments in the US and Europe to investigate ways to promote work-life balance. For example, the Council of Economic Advisers (CEA) published a report, launched by Michelle and Barack Obama at the White House in summer 2010, on policies to improve work-life balance. One of the key conclusions in the executive summary concerned the need for research to identify the trade-offs in work-life balance policies, stating:

“A factor hindering a deeper understanding of the benefits and costs of flexibility is a lack of data on the prevalence of workplace flexibility and arrangements, and more research is needed on the mechanisms through which flexibility influences workers’ job satisfaction and firms’ profits to help policy makers and managers alike” (CEA, 2010)


First, the performance of the home workers went up dramatically, increasing by 12.2% over the nine-month experiment. This improvement came mainly from an 8.9% increase in the number of minutes they worked during their shifts (the time they were logged in taking calls), due to a reduction in breaks and sick days taken by the home workers. The remaining 3.3% improvement came from home workers being more productive per minute worked, apparently because of the quieter working conditions at home.

Second, there were no spillovers onto the rest of the group – interestingly, those remaining in the office showed no change in performance.

Third, attrition fell sharply among the home workers, dropping by almost 50% versus the control group. Home workers also reported substantially higher work satisfaction and attitudinal survey outcomes.

Finally, at the end of the experiment the firm was so impressed by the impact of home-working that it decided to roll the option out to the entire firm, allowing the treatment and control groups to re-choose their working arrangements. Almost half of the treatment group changed their minds and returned to the office, while two thirds of the control group (who initially had requested to work from home) decided to stay in the office. This highlights how the impact of these types of management practices is also ex ante unclear to employees.

Read the full report here...


- Posted by Tom/Bluedog

Wednesday, August 29, 2012

If it rains, does Amazon Web Services flood? Lots of people think so

According to a recent survey, many people are confusing the clouds in real life with The Cloud. The survey sponsor, Citrix, points out that the good news is that even those who don't know exactly what the cloud is recognize its economic benefits.

One in five Americans (22 percent) admit that they've pretended to know what the cloud is or how it works -- so keep that in mind when hiring architects to design for your organization.

When asked what "the cloud" is, many responded that it's either an actual cloud (specifically, a "fluffy white thing"), the sky, or something related to the weather (29 percent).

Only 16 percent said they think of a computer network to store, access and share data from Internet-connected devices when thinking about "the cloud."

“This survey clearly shows that the cloud phenomenon is taking root in our mainstream culture, yet there is still a wide gap between the perceptions and realities of cloud computing,” said Kim DeCarlis, vice president of corporate marketing at Citrix. “While significant market changes like this take time, the transition from the PC era to the cloud era is happening at a remarkable pace. The most important takeaway from this survey is that the cloud is viewed favorably by the majority of Americans, and when people learn more about the cloud they understand it can vastly improve the balance between their work and personal lives.”


- Posted by Tom/Bluedog

Tuesday, August 28, 2012

Deconstructing the Xerox - Apple Myth

In the aftermath of the Apple patent lawsuit, there's a lot of misinformation making its way around the inter-web. This detailed deconstruction of the "Steve Jobs Stole Xerox's Secrets" myth may help readers understand the complexities, which span decades. A salient chunk:

Here is the first complicating fact about the Jobs visit. In the legend of Xerox PARC, Jobs stole the personal computer from Xerox. But the striking thing about Jobs's instructions to Hovey is that he didn't want to reproduce what he saw at PARC. "You know, there were disputes around the number of buttons—three buttons, two buttons, one-button mouse," Hovey went on. "The mouse at Xerox had three buttons. But we came around to the fact that learning to mouse is a feat in and of itself, and to make it as simple as possible, with just one button, was pretty important."

So was what Jobs took from Xerox the idea of the mouse? Not quite, because Xerox never owned the idea of the mouse. The PARC researchers got it from the computer scientist Douglas Engelbart, at Stanford Research Institute, fifteen minutes away on the other side of the university campus. Engelbart dreamed up the idea of moving the cursor around the screen with a stand-alone mechanical "animal" back in the mid-nineteen-sixties. His mouse was a bulky, rectangular affair, with what looked like steel roller-skate wheels. If you lined up Engelbart's mouse, Xerox's mouse, and Apple's mouse, you would not see the serial reproduction of an object. You would see the evolution of a concept.

The same is true of the graphical user interface that so captured Jobs's imagination. Xerox PARC's innovation had been to replace the traditional computer command line with onscreen icons. But when you clicked on an icon you got a pop-up menu: this was the intermediary between the user's intention and the computer's response. Jobs's software team took the graphical interface a giant step further. It emphasized "direct manipulation." If you wanted to make a window bigger, you just pulled on its corner and made it bigger; if you wanted to move a window across the screen, you just grabbed it and moved it. The Apple designers also invented the menu bar, the pull-down menu, and the trash can—all features that radically simplified the original Xerox PARC idea.

The difference between direct and indirect manipulation—between three buttons and one button, three hundred dollars and fifteen dollars, and a roller ball supported by ball bearings and a free-rolling ball—is not trivial. It is the difference between something intended for experts, which is what Xerox PARC had in mind, and something that's appropriate for a mass audience, which is what Apple had in mind. PARC was building a personal computer. Apple wanted to build a popular computer.




- Posted by Tom/Bluedog

Sunday, August 26, 2012

What's on the horizon for cyber-attacks

Organizations around the Pacific Rim are having to stave off more security breaches than others, without enough staff to shore up the bulwarks.

Businesses in the Asia-Pacific region are having to fight off more security breaches than the rest of the world, according to the ISACA 2012 Governance of Enterprise IT survey, which looked at over 3,000 businesses, including ones from Australia and New Zealand.





Saturday, August 25, 2012

The Last Ninja

It seems a shame, but this fellow seems to be the last ninja.

Jinichi Kawakami is a 63-year-old former engineer who may not fit the typical image of a dark-clad assassin with deadly weapons who can disappear into a cloud of smoke -- but he is reputedly Japan's last ninja.

Kawakami first encountered the secretive world of ninjas at the age of just six, but has only vague memories of first meeting his master, Masazo Ishida, a man who dressed as a Buddhist monk.

"I kept practising without knowing what I was actually doing. It was much later that I realised I was practising ninjutsu," he says.

It is difficult to pin down the emergence of the first ninja, more properly called shinobi. As long as there has been political intrigue, there have been spies and assassins.

Japanese folklore tells us that the ninja descended from a demon that was half man and half crow. The ninja evolved as an opposing force to their upper-class contemporaries, the samurai, in early feudal Japan.


Wednesday, August 22, 2012

The Fed is getting something right

Since the end of Bretton Woods, the reality that money is intrinsically useless has been widely understood (currency is used only as a medium of exchange). The value of money is set by the supply and demand for money and the supply and demand for other goods (including gold) and services in the global economy.

At the start of the Great Depression, investors began trading in currencies and commodities. Gold prices rose as more people demanded the element to replace dollars. When banks couldn't meet the demand, they began failing. The US central bank kept raising interest rates, trying to make dollars more valuable and dissuade people from further depleting gold reserves.

Later, when the Bretton Woods agreement fixed the price of gold to control fluctuations in currency exchange rates, the dollar became the de facto replacement. The strong dollar led to inflation and a large balance-of-payments deficit in the U.S., which in turn helped to create stagflation. The U.S. started to devalue the dollar in terms of gold to curb double-digit inflation. In 1971, gold was repriced to $38 per ounce, then again to $42 per ounce in 1973. As the dollar devalued, people were anxious to sell U.S. dollars for gold. In late 1973, the U.S. government decoupled the value of the dollar from gold altogether, and the price of gold quickly shot up to $120 per ounce in the free market.

At present we see gold at an astronomical price -- buying gold is a standard fall-back during a recession, so no surprise. But a collapse in gold prices could signal further turmoil. Daily reports of countries bracing for the end of the Euro don't help.

Make Stuff, Don't Waste Energy on Fighting

The San Francisco Chronicle says it best: stop wasting energy (and lots of money. Lots.) on court cases. Invest in the future.
The problem is that neither company seems to have any idea what to do with its riches. The very act of cash hoarding suggests as much. If either company had projects deemed worthy by corporate execs holding the purse strings, the money would be spent.

The author continues,
Rather than invest in technology that might be commercially viable a decade or two down the road, Apple seems content for now to amass an ocean of cash, defend its 50 percent profit margins with an army of lawyers, and focus on incrementally adjusting - and protecting with a bulwark of broad patents - its current product designs. Sleek and user-friendly designs to be sure, but society-changing technological innovations comparable to the transistor or disk drive they are not.
- Posted by Tom/Bluedog

Monday, August 20, 2012

Use of metadata is driving the growth in effectiveness of the web as a tool for knowledge workers. In my role as cloud architect, I've been charged with re-designing enterprise content management systems. Designing a reasonable hierarchical classification structure -- a taxonomy -- for an organization's intellectual capital (see what I did there?) is central to the evolution of information systems. Without a comprehensive taxonomy, effort will undoubtedly be wasted searching for content "the old-fashioned way." The lack of information architecture alone should drive stakeholders to implement a robust, standard vocabulary for labeling or tagging business-critical content across their organizations. I've used Dublin Core as the basis for many a project.

The word "metadata" means "data about data". Metadata articulates a context for objects of interest -- "resources" such as MP3 files, library books, or satellite images -- in the form of "resource descriptions". As a tradition, resource description dates back to the earliest archives and library catalogs. The modern "metadata" field that gave rise to Dublin Core and other recent standards emerged with the Web revolution of the mid-1990s.


Creating a concise taxonomy will also improve overall communications within the organization and enable staff to better implement procedures and practices. Thus, creating an intuitive taxonomy can be both a team-building exercise and an investment that pays additional dividends. Data quality is driven by a common set (and common understanding) of data standards, domain standards, and business rules. When data can be reliably located, wherever it is stored, duplication and the need to re-create it are reduced or eliminated. Finally, by using a metadata repository and enforcing common standards, reports and dashboards will reflect the correct data.
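
A small sketch of what "enforcing common standards" can look like in practice: tags are checked against a controlled vocabulary before content is accepted. The taxonomy terms and the validate_tags helper below are hypothetical stand-ins for whatever actually lives in the organization's metadata repository.

# A minimal sketch of controlled-vocabulary enforcement. The facets and
# allowed terms are invented; in practice they would come from the
# metadata repository.
TAXONOMY = {
    "subject": {"finance", "human-resources", "engineering", "marketing"},
    "type": {"policy", "report", "contract", "presentation"},
}

def validate_tags(tags: dict) -> list:
    """Return a list of problems; an empty list means the tags conform."""
    problems = []
    for facet, allowed in TAXONOMY.items():
        value = tags.get(facet)
        if value is None:
            problems.append(f"missing required facet: {facet}")
        elif value not in allowed:
            problems.append(f"{facet}={value!r} is not in the controlled vocabulary")
    return problems

print(validate_tags({"subject": "finance", "type": "memo"}))
# ["type='memo' is not in the controlled vocabulary"]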

Metadata is key to building intelligent, high-performing enterprise solutions. The benefits cascade to many facets of the organization, including business process management, business intelligence, and IT management. Better business performance, staff efficiency, and stakeholder satisfaction are compelling enough reasons to weave metadata development into IT strategies.

Friday, August 17, 2012

Being connected - people really like it

Regular readers know that I place a high value on communication, the essence of which is the transfer of information. Throughout history this has been a key element in the evolution of civilized societies, and is a requirement of organization and planning.
The advancement of communication technology has addressed unique aspects of information transfer:
- the speed and distance at which information can be transmitted,
- the permanency of the information, and
- the volume of information that can be transmitted.
Throughout history, technological innovations have allowed for the steady improvement of all three aspects. In the last three decades, however, technological advances and globalization have blurred these once-distinct qualities, as the factors that once limited them began to disappear.
In the results of this Time Magazine poll, we can see that people are attached to their mobiles. Of course, humans evolved and came to dominate this planet because of our ability to communicate, coordinate, and negotiate.
Some interesting tidbits: around one in four people check their mobile phone at least once every half-hour. People ages 25-29 sleep with their phones. Seventy-six percent of Americans think that being constantly connected is a mostly good thing. This contrasts with attitudes elsewhere in the world; in South Korea, for example, more than sixty percent consider mobiles a distraction from responsibility.

Wednesday, August 15, 2012

Harry Harrison - prolific, imaginative. Vaya con Dios

I'm sorry to see one of my favorite sci fi authors has died... Harry Harrison was an awesome writer. Read a decent obituary here...


- Posted by Tom/Bluedog

Monday, August 13, 2012

Smaller devices that do more, because of software-based radio

Free up radio communication from hardware-specific needs with software -- that's the dream of many an electrical engineer.
A software-defined radio (SDR) is a radio device in which components that have typically been implemented in hardware (tuner, filters, amplifiers, modulators/demodulators, etc.) are instead implemented in software.
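
To illustrate the idea, here is a toy Python/NumPy sketch in which the demodulator really is just software: the same sampled data could be run through an FM or an AM demodulator with a one-line change. The signal parameters are made up for illustration; a real SDR front end would supply the I/Q samples.

# A minimal sketch of the "software-defined" part of an SDR: demodulation
# is just math on sampled data, so swapping modulation schemes is a code
# change, not a hardware change. All parameters below are invented.
import numpy as np

fs = 48_000                        # sample rate of the digitized baseband signal
t = np.arange(fs) / fs             # one second of samples
message = np.sin(2 * np.pi * 3 * t)            # toy 3 Hz "audio" message

# FM modulate: phase accumulates with the message (deviation chosen arbitrarily)
deviation = 200.0
phase = 2 * np.pi * deviation * np.cumsum(message) / fs
iq = np.exp(1j * phase)            # complex I/Q samples a radio front end would deliver

# FM demodulate in software: the phase difference between consecutive samples
# is proportional to the instantaneous frequency, i.e. the original message.
demodulated = np.angle(iq[1:] * np.conj(iq[:-1]))

# AM demodulation would be a one-line change: np.abs(iq). Same hardware, new "radio".
print(np.corrcoef(demodulated, message[1:])[0, 1])   # ~1.0: message recovered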
Think of a mobile phone that could support multiple cellular technologies and frequencies at the same time and can be modified in the future without any hardware changes. Apple's upcoming iPhone will likely support this model -- reducing the need to build separate CDMA and GSM versions. When mass-customization meets remote software update, we see Apple's profits shoot up (lower unit production cost = higher margins), and the flexibility of devices explode.
Near-field communications, for example, is a useful tool to enable consumers to make credit card purchases with their mobile phones, based on a chip embedded in the phone chassis. But what if you could use many vendors' chip readers, without supporting the actual hardware antenna and other components? That's the beauty of an SDR.
Around the globe, SDRs could give mobile operators in less-developed countries a boost. The flexibility to combine different grades of hardware and software to strike the right balance between cost and network resiliency would mean more opportunities for more people to be connected, at a lower cost.
The U.S. Federal Communications Commission loves SDR -- such software-defined radios enable sharing of limited spectrum and help prevent interference.

Friday, August 10, 2012

What is the cloud? It is...

...a lot of things. As Chenda Ngak writes over at TechTalk,
Think of the cloud as a disk drive that is owned by a company like Google or Apple, which stores all of your files in a remote location - typically at a server farm...

But the cloud is more than that. The idea of the "cloud" may have been spawned by network diagrams in which the wider internet (a so-called "wide area network") was drawn as a cloud rather than as all the routers, nodes, and connecting pathways. These days, consumers think of the cloud as the place where your photos go to live (iCloud), where you connect with your friends (Facebook), or where you check your email (Google's Gmail).
But the cloud can also be the unseen infrastructure that hosts these kinds of services. Amazon and Google famously have such platforms, and software developers build solutions on top of them. The underlying working bits are invisible to the user, but there is a significant shift -- your local computer, phone, or iPad no longer has to do all the heavy lifting of running applications. The network of computers that makes up the cloud handles the processing. Hardware and software demands on the user's side decrease; all you need is an on-ramp to the info superhighway, which can be as simple as a web browser. The cloud takes care of the rest.
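
As a thumbnail of that division of labor, here is a hypothetical Python sketch of a thin client: the local machine sends a small request and gets back a link to the finished result, while the cloud service does the processing. The endpoint URL, the JSON fields, and the resize_photo_in_the_cloud helper are all invented for illustration.

# A minimal sketch of the thin-client idea: the local machine just asks,
# the cloud does the work. The service and its API are hypothetical.
import json
import urllib.request

def resize_photo_in_the_cloud(photo_url, width):
    """Ask a (hypothetical) cloud image service to resize a photo and
    return the URL of the processed result."""
    payload = json.dumps({"source": photo_url, "width": width}).encode("utf-8")
    req = urllib.request.Request(
        "https://api.example-cloud.test/v1/images/resize",   # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:                 # processing happens server-side
        return json.load(resp)["result_url"]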

Thursday, August 9, 2012

The 21st Century Employee Model - distributed workforce

Dan Pink chronicled the growing ranks of people who work for themselves in his 2001 book, Free Agent Nation. Many people have started their own shops to join the freelancer nation, but even larger organizations can use freelance workers and others in alternative employment arrangements to help meet some of their staffing needs.

In the multimedia/video/film world, experts come together to create, write, shoot, edit, and distribute products. Such ad hoc teams form the basis of knowledge work that is creative and profit-focused.

Such an ad hoc project may involve work that requires less than a full complement of staff to complete and that delivers immediate benefits. The work may involve frequent discussions between the client (requester) and the team. One of the difficulties of having unaffiliated experts work in groups on a project (software development, a video, a research paper, etc.) is that there are always a few who do not get fully involved. This is an easy trap to fall into. Usually, one or two of the team members know much more about the topic, tools, or subject matter, and so end up putting most of the project, document, proposal, or whatever, together. Others are more adept at creative work, spreadsheets, running the software tools, etc., so these people end up doing a few specific tasks. Problems arise when one or two do all of the work. The rest of the group, with a lack of activities to keep them busy, often end up left out, underutilized, and even put off. Soon the stakeholders get frustrated with the disorganization and the 80/20 rule kicks in: one or two people do the lion's share. For most, this lessens the benefit of such ad hoc cooperatives.

This can be easily avoided.

Starting out with a strong plan, and getting participant buy-in upfront, makes the way forward clear. Providing an easy-to-understand project plan (perhaps with a Gantt chart or other visualization) and schedule keeps everyone focused on the goals. Assigning and tracking tasks, with everyone's input, results in measurable progress. If you have chosen your team members well, self-motivation will be evident.

It is not uncommon for many of us to work regularly with colleagues based in different buildings, cities, countries, and even continents. Members of the work group may be in different time zones, speak different languages, and come from different cultures. Providing feedback and encouraging communication -- in real time or off-line via comments, discussion forums, email, or even texting -- promotes bonding and ensures team members know they are valued.

Monday, August 6, 2012

Save Steve!

I am worried... Very concerned... The Woz is in trouble, he's on the Cinnabon Express Bus headed for heart attack city. He needs an intervention.
His views on the Cloud are, well, a little simplistic.



It doesn't seem like the Woz can tell the difference between consumer-oriented cloud application services (iCloud, Dropbox, Gmail) and infrastructure services such as Amazon AWS and Google App Engine, which actually provide economies-of-scale benefits over owning and managing one's own servers. This is particularly true if the consumer is a small business with no IT department.


Woz, scale back on the sugar, trans fats, and hype. The cloud is only as evil as the companies that build and provide the services.