Sunday, July 31, 2011

Avoiding 'groupthink' -- decision making for teams

There are three main components to consider in consulting a group for decision making:

• Sharing knowledge and related content
• Discussing and overlaying knowledge on the data at hand
• Collectively deciding the best course of action from the choices that are presented.

There are several alternatives to consider:
• Should the leader make the decision?
• Should the leader delegate the decision to some other member of the group?
• Will the group make the decision through some form of majority vote?
• Should all decisions involving the group be made by consensus?
Even when a consensus approach is used, there is a danger that the decision may represent a false consensus.  A false consensus occurs when members of a group appear to accept a proposed course of action but actually have private reservations which, for whatever reason, they choose not to share with the other members of the group.
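To make the false-consensus idea concrete, here is a minimal sketch (with invented members and votes) that compares what a group says publicly with what its members privately believe:

```python
# Minimal sketch: detecting the gap between a public vote and private
# reservations -- a "false consensus". Member names and data are invented.

def tally(votes):
    """Count yes/no votes from a {member: bool} mapping."""
    yes = sum(1 for v in votes.values() if v)
    return yes, len(votes) - yes

# What each member says in the meeting...
public_votes = {"ana": True, "ben": True, "carla": True, "dev": True}
# ...and what they privately believe.
private_views = {"ana": True, "ben": False, "carla": True, "dev": False}

public_yes, _ = tally(public_votes)
private_yes, _ = tally(private_views)

unanimous_in_public = public_yes == len(public_votes)
false_consensus = unanimous_in_public and private_yes < public_yes

# The group "agrees" publicly, but half the room has reservations.
print(false_consensus)
```

The point of the sketch: you can only detect a false consensus if private views are surfaced somehow -- anonymous polling, devil's-advocate roles, or one-on-one follow-ups.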


Thursday, July 28, 2011

LEED ratings and Green

As an example of applying entrepreneurial thinking to small-scale construction, I've been thinking about my efforts to earn a high LEED rating for our house and its data center. I've participated in various discussions where the cost-benefit analysis is done for LEED points. I would imagine the conversations at Bluedog's customers might be along the lines of, "We want to be rated platinum. Figure out how to make it cost effective." Very different from my approach: what is the highest rating I can garner with the resources at hand?

Benefits of a LEED Home from U.S. Green Building Council on Vimeo.

">intro to LEED in new home construction.

Wednesday, July 27, 2011

Taking a punch - what to do when your SaaS gets DoSed

Even Bluedog has come under fire, albeit years ago and by a so-called 'script kiddie' (luckily, not a Denial-of-Service (DoS) attack), but malicious attacks can afflict anyone with a presence on the WWW. Luckily, we have hardened our perimeter, have fail-overs in place and multi-faceted, multi-layered security, and we monitor our exposed systems. Here's an excellent white paper on absorbing a cyber-attack -- a means of triaging when your systems are under assault...

Software services that are essential for mission success must not only withstand normal wear and tear, stresses and accidental failures, they also must endure the stresses and failures caused by malicious activities and continue to remain usable.

The concept of entrapping / encasing the intruder is not new -- think of the castle barbican murder-hole set-up: the enemy breaches the gate, portcullis, etc., and runs into the narrow hall leading to the castle courtyard. Except, bowmen are waiting to shower arrows upon them from narrow slits high up on the walls.

Get the white paper here...

Tuesday, July 26, 2011

Zombie popularity in pop culture, part 1

Zombies are everywhere... a totally different threat from the ones we were afraid of in the '80s and '90s.

Of course, Walking Dead is an awesome comic and tv show, but even the creators acknowledge there's a popular meme driving the public's interest in fighting off the unstoppable menace.

Zombies are the perfect embodiment of the troubles of our times... they come from nowhere, are relentless in their encroaching march, and eventually will overwhelm, no matter what we do.

But there is always hope, because humans in general, and western civilization in particular, are resilient. So don't fret over the current state of affairs, but remember, it pays to think ahead.

Saturday, July 23, 2011

Five Men and a Little Shady

So anger must be managed.  And if you can't manage it yourself, someone will step in to manage it for you.  For instance, say you're just super-pissed.  And you punch a wall.  No worries -- that's why we have sheetrock, drywall or gypsum.  Right?  But if someone *sees* you punch the wall, and you're standing anywhere near your spouse, you're looking at a few weeks of anger management training.

Wednesday, July 20, 2011

What's on my radar?

There's a host of technologies I am watching... 'cause that is how you stay ahead of the curve.


I/O virtualization

While server and even data center virtualization are moving mainstream, other aspects of infrastructure management are now evolving. Specifically, I/O virtualization is an essential complement to server virtualization. When running many VMs (virtual machines) on a server, each needs its own input/output, but if you satisfy that need with dedicated hardware, you eat up space for network and storage interfaces fast.

Semantic web

Web 2.0 is about collaboration. The next evolution of the WWW is already happening -- the semantic web is about building information *about* information: metadata that describes web sites and other aspects of the "dark" web (data buried in databases). Unlocking the full potential of interlinked information would result in a Global Electronic Library that could combine all the available knowledge on the planet -- all books, periodicals, newsletters, journals, newspapers, web pages, video, audio broadcasts, spoken word, and more -- into a single, searchable resource available to everyone.
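As a toy illustration of metadata about information, here is a sketch of the RDF-style triple model underlying the semantic web; the URIs and facts are invented examples, not real vocabularies:

```python
# Toy semantic-web sketch: knowledge as (subject, predicate, object)
# triples, as in RDF. All URIs and data below are invented examples.

triples = [
    ("ex:GlobalLibrary", "rdf:type", "ex:Library"),
    ("ex:Article42", "dc:title", "Avoiding groupthink"),
    ("ex:Article42", "dc:creator", "ex:SomeAuthor"),
    ("ex:Article42", "ex:partOf", "ex:GlobalLibrary"),
]

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Everything we know about Article42:
about_article = query(s="ex:Article42")
print(about_article)
```

Because every fact has the same three-part shape, triples from independent sources can be merged and queried together -- which is exactly what a global, interlinked library would require.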

Solid state drives

A solid-state drive (SSD) is a data storage device that uses solid-state memory to store persistent data, with the intention of providing access in the same manner as a traditional block-I/O hard disk drive. SSDs are distinguished from traditional hard disk drives (HDDs), which are electromechanical devices containing spinning disks and movable read/write heads. SSDs, in contrast, retain data in non-volatile memory chips and contain no moving parts. SSDs are typically less susceptible to physical shock, quieter, and lower in access time and latency; they use the same interface as hard disk drives, so they can easily replace them in most applications.

Super-fast access speeds, low power consumption, and increased capacity (with correspondingly smaller space requirements) point to the wide adoption of SSDs. But the computer forensics implications include the inability to retrieve erased data and other issues not encountered with current hard drive technology.

Cloud computing

Cloud computing is a style of computing in which virtualized and standard resources are provided as a service over the Internet. It combines Service-Oriented Architecture (SOA) and virtualization to provide business application services online that are accessed from a web browser, while the software and data are stored in one or more data centers at locations unknown to the end user.

Computing hardware has already become heavily commoditized, but up to a few years ago software was still the realm of high priests. With the emergence of cloud computing delivery models -- IaaS, PaaS and SaaS -- information technology professionals now have value-based vehicles to choose from to support their various computing needs, from do-it-yourself (infrastructure and platform as a service) to do-it-for-you (software-as-a-service).

Mobile apps

The realization of Global Positioning in new applications can be seen in mobile apps -- for smart phones, tablets and other venues. The proliferation of these platforms, and of vibrant app ecosystems such as the ones around iPhone/iPad and Android devices, is revolutionizing the internet. This shift will arrive through a new wave of innovation that links cloud-based services, smart computing, and app-enabled devices, including cars, appliances, and entertainment systems. Location-aware means more data, resulting in further changes in how we operate in our world. Focused-use applications, aimed at addressing discrete computing needs and built around user experience, will extend the reach of these powerful platforms.

DC hardware power management

One solution to power system optimization is DC power. Since utility AC power must ultimately be converted to DC power for use by all silicon chip-based IT equipment and because stored energy systems (batteries, flywheels, etc.) provide DC power for backup, a DC power architecture requires fewer total conversions from grid to chip, creating the opportunity to reduce costs and increase efficiency. It also eliminates the need to de-rate usable capacity due to unbalanced loads, eliminating the concept of stranded power and allowing full utilization of power infrastructure. At the moment, power-over-ethernet is one application of DC power utilization on a small scale; converting entire data center infrastructure to Direct Current as part of a "green data center" movement (including cooling and energy efficiency and reclamation) would result in significant environmental, utility and other money savings.
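To see why fewer conversions matter, here is a back-of-the-envelope efficiency comparison; the per-stage efficiency figures are illustrative assumptions, not vendor numbers:

```python
# Rough grid-to-chip comparison: a traditional AC distribution chain vs.
# a DC chain with fewer conversion stages. The per-stage efficiencies
# below are illustrative assumptions, not measured values.

def end_to_end(stages):
    """Multiply per-stage efficiencies to get overall efficiency."""
    eff = 1.0
    for e in stages:
        eff *= e
    return eff

# AC path: UPS (AC->DC->AC), PDU transformer, server PSU (AC->DC), VRMs
ac_chain = [0.92, 0.97, 0.90, 0.87]
# DC path: one facility-level rectification stage, distribution, VRMs
dc_chain = [0.96, 0.92, 0.87]

ac_eff = end_to_end(ac_chain)
dc_eff = end_to_end(dc_chain)
print(f"AC chain: {ac_eff:.1%}, DC chain: {dc_eff:.1%}")
```

Even with made-up numbers, the arithmetic shows the lever: every conversion stage you remove multiplies straight into the grid-to-chip efficiency.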

"NoSQL" databases

Since the 1980s the relational database (RDBMS) has been the dominant model for database management. But, today, non-relational, “cloud,” or “NoSQL” databases are gaining ground as an alternative model for persistent data management. These type of data stores permit elastic scaling, handle large volumes of data with aplomb, require less administration (and help move business logic into the application tier, where it belongs), and support flexible data models.
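A minimal sketch of the flexible-data-model idea, using an in-memory dict as a stand-in for a real document store:

```python
# Sketch of a document store's flexible data model: records in the same
# collection need not share a schema. An in-memory dict stands in for a
# real NoSQL database here; names and documents are invented.

store = {}

def put(collection, key, doc):
    """Insert or replace a document under (collection, key)."""
    store.setdefault(collection, {})[key] = doc

def get(collection, key):
    """Fetch a document, or None if absent."""
    return store.get(collection, {}).get(key)

# Two "user" documents with different shapes -- no ALTER TABLE needed.
put("users", "u1", {"name": "Ada", "email": "ada@example.com"})
put("users", "u2", {"name": "Grace", "roles": ["admin"], "last_login": "2011-07-20"})

print(get("users", "u2")["roles"])
```

Contrast this with an RDBMS, where adding the `roles` field would mean a schema migration across every row -- the flexibility (and the burden of validating document shapes) moves into the application tier.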

Large-scale Business Intelligence

MapReduce is a patented software framework introduced by Google to support distributed computing on large data sets on clusters of computers. MapReduce enables something entirely new: the ability to crunch petabytes of data in a fraction of the time it would normally take -- on commodity hardware, no less. Apache Hadoop, now available via Amazon Web Services in the form of Amazon Elastic MapReduce, is the best-known implementation, but MapReduce is also being incorporated into mainstream solutions from IBM, Oracle, and others. As a framework that allows developers to write functions that process data, MapReduce enables fast analysis of large data sets.
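The dataflow can be sketched with the canonical word-count example in plain Python; a real framework (Hadoop, Elastic MapReduce) would distribute the map and reduce tasks across a cluster, but the shape of the computation is the same:

```python
# The canonical MapReduce example -- word count -- as plain functions.
# A real framework runs many map and reduce tasks in parallel; this
# sketch shows only the map / shuffle / reduce dataflow.

from collections import defaultdict

def map_phase(document):
    """Emit (word, 1) pairs for every word in a document."""
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    """Group intermediate values by key, as the framework would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Sum the counts for a single word."""
    return key, sum(values)

documents = ["the quick brown fox", "the lazy dog", "the fox"]
intermediate = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts["the"], counts["fox"])  # -> 3 2
```

Because each map call touches one document and each reduce call touches one key, both phases parallelize trivially -- which is why the same pattern scales from three strings to petabytes.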

Biofeedback or thought-control of electronics

A number of companies and research institutions have shown how brain waves, captured using sensors on a skull cap or head-set, can be used to control computer systems. The applications are medical (giving communication and control of the environment to heavily disabled people), military and, increasingly, consumer, in computer game control interfaces. This may seem like science fiction, but the thought-control human-computer interface is here now.

Printed electronics

The possibility of the rapid printing of multiple conductive, insulating and semiconductive layers to form electronic circuits holds out the prospect of much lower cost ICs than those prepared by conventional fabrication methods. Printing semiconductors usually implies the use of organic materials (although see below) with very different performance to silicon. It also implies much larger minimum geometries than can be attained in silicon. But there are applications that can benefit from modest performance on flexible substrates at low cost; the RFID tag is one and the active-matrix backplane for displays is another.

Energy harvesting

Energy harvesting is not a new idea. We have had the motion-powered wristwatch for many years. But as electronic circuits move from consuming milliwatts to consuming microwatts an interesting thing happens. It becomes possible to contemplate drawing power for those circuits, not from the electricity grid or from a battery but from a variety of ambient phenomena. And this is expected to have far-reaching impact.
One of the early applications is vibration-powered wireless sensors placed on machinery and in vehicles. The battery-less aspect of such sensors removes the need for maintenance.
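Some rough arithmetic shows why the milliwatt-to-microwatt shift matters; the battery and load figures below are illustrative assumptions:

```python
# Back-of-the-envelope: how long a small battery lasts under a milliwatt
# load vs. a microwatt load that an ambient harvester could plausibly
# supply. All figures are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365

def battery_life_years(capacity_mah, voltage_v, load_uw):
    """Years a battery of the given capacity lasts under a constant load."""
    energy_mwh = capacity_mah * voltage_v      # milliwatt-hours stored
    load_mw = load_uw / 1000.0                 # microwatts -> milliwatts
    return energy_mwh / load_mw / HOURS_PER_YEAR

# A coin cell (~225 mAh at 3 V) driving a 10 mW sensor node...
legacy = battery_life_years(225, 3.0, 10_000)
# ...versus a 50 uW node, within reach of a small vibration harvester.
harvestable = battery_life_years(225, 3.0, 50)

print(f"10 mW node: {legacy:.3f} years; 50 uW node: {harvestable:.1f} years")
```

At milliwatt loads the battery dies in days; at microwatt loads it lasts for years -- and at that point a vibration or thermal harvester can replace the battery entirely.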

Resistive RAM or the memristor

Using conductive metal oxide (CMOx) technology to build two-terminal devices that display a memory effect in their resistance characteristic would effectively implement research championed by Hewlett-Packard Labs on the memristor, often described as the fourth passive circuit element, after resistors, capacitors and inductors.

Battery technologies

Nickel- and lithium-based battery chemistries, such as nickel oxyhydroxide, olivine-type lithium iron phosphate and nanowires, are gunning to displace the venerable but problematic alkaline-manganese dioxide formulations.

Hydrogen-based economy

By shifting to a hydrogen economy, we will simultaneously solve a long list of problems tied to the oil economy (pollution, limited resources, global warming, etc.) while creating new opportunities with hydrogen (clean, renewable, plentiful energy).

Applications for hydrogen are widespread: automotive (hydrogen powered fuel cell vehicles), industrial (hydrogen powered factories), municipal (powering cities with large-scale hydrogen power plants) and residential (home-based hydrogen power plants that convert natural gas to electricity).

Augmented reality

Augmented reality layers information on top of video on mobile platforms, combining related data in ways geographic information systems (GIS) do for two-dimensional information. Applying geo-tagged information enhances what the user is looking at. For example, using GPS and a smart phone’s compass, the app can guess what the user is seeing and then provides information about points of interest in the line of sight that are then overlaid on the camera screen.
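The "what is the user looking at?" step can be sketched as a bearing calculation from the GPS fix to a point of interest, checked against the compass heading; the coordinates and field-of-view figure are example values:

```python
# Sketch of the AR line-of-sight check: compute the compass bearing from
# the phone's GPS fix to a point of interest, then test it against the
# compass heading and camera field of view. Coordinates are examples.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def in_view(heading, bearing, fov=60):
    """Is the POI within the camera's horizontal field of view?"""
    diff = abs((bearing - heading + 180) % 360 - 180)
    return diff <= fov / 2

# User near the Washington Monument looking toward the Lincoln Memorial.
b = bearing_deg(38.8895, -77.0353, 38.8893, -77.0502)
print(round(b), in_view(heading=270, bearing=b))
```

A real AR app repeats this for every nearby POI and projects the in-view ones onto the camera image; the hard part in practice is compass noise, not the geometry.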

Nanotech-based medical diagnostics, targeted drug delivery

Applications in drug delivery, in vitro and in vivo diagnostics and implant technology are currently being exploited with nanotechnology. A promising area is using nanofibers obtained through electrospinning. These fibers may serve as stable or biodegradable scaffolds in bioreactors, as reinforcing structures for blood vessels, biodegradable compounds for wound healing or as a coverage for otherwise bioincompatible materials such as stainless steel for drug-eluting stents.

Miniature drug delivery that targets specific tumors is another area, or the use of micro-robots that swim in a patient's blood stream, transmitting real time data to monitoring medical devices.

Quantum computing

The evolution of miniaturized computing continues with quantum computing, opening the door to a new realm of computational power. Relatedly, scientists have developed a new way to manipulate atoms inside diamond crystals so that they store information long enough to function as quantum memory, which encodes information not as the 0s and 1s crunched by conventional computers but in states that are both 0 and 1 at the same time. Physicists use such quantum data to send information securely, and hope to eventually build quantum computers capable of solving problems beyond the reach of today’s technology.
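A toy numerical sketch of "both 0 and 1 at the same time": a single qubit as a two-component state vector, put into equal superposition by a Hadamard gate, with measurement probabilities given by the squared amplitudes:

```python
# Toy single-qubit simulation: a state vector, the Hadamard gate, and
# the Born rule (measurement probabilities = squared amplitudes).

import math

# Computational basis states |0> and |1>
ket0 = [1.0, 0.0]
ket1 = [0.0, 1.0]

def hadamard(state):
    """Apply the Hadamard gate H to a single-qubit state vector."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitudes."""
    return [amp ** 2 for amp in state]

plus = hadamard(ket0)           # (|0> + |1>) / sqrt(2): "both at once"
p0, p1 = probabilities(plus)
print(round(p0, 3), round(p1, 3))   # a 50/50 chance of measuring 0 or 1

back = hadamard(plus)           # H is its own inverse: back to |0>
```

The sketch uses real amplitudes only; a full simulator needs complex numbers and 2^n-component vectors for n qubits, which is exactly why classical simulation of quantum machines blows up so quickly.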

Friday, July 1, 2011

Cellular capacity expanded... or not?

Of course, as soon as I put a stake in the sand, one of my associates comes up with a different perspective. Vince D'Onofrio made some excellent points, so I am re-posting here what he wrote to me directly -- I think his commentary warrants a wider audience...
I have seen all kinds of distributed radio technologies and do not believe the result of DIDO technology is really possible the way it has been reported in this article -- especially the bit about overcoming Shannon's Law. I believe it is possible for DIDO to add additional capacity in the right environment, but it still must follow Mother Nature's laws. So it has to have some limitations. Each person getting full cell capacity regardless of the number of users is never going to happen. There are always choke points in the network unless there is unlimited money to spend on infrastructure and unlimited radio spectrum available. Undesired signals from other in-band users will be perceived as noise by the radio receiver, and noise chokes throughput, thus limiting data rates. There is no easy way to test the impact of unrestrained capacity without a fully loaded network. Neither Steve Perlman nor his cohorts have the capital to do that. Anything can be proven on paper with enough math. But there are reasons why most theoretical patents never get implemented in the real world.

Even if we were able to fix radio link capacity problems by distributing the transfer nodes, the switching centers are going to hit a wall. The only way I see this technology working is if all of the content comes from other local mobiles being used as relay stations. This also means distributing backhaul data traffic (PSTN and Internet traffic) to local fixed connection points or miniature base stations. Otherwise no real-time transmissions will be possible due to interconnection delays in the network.

Though, it is possible that new technology could address the needs of certain high bandwidth users better than what is out on the street today. Since 99% of users would be happy with only 1% of what a base station has to offer, there might be enough surplus capacity for the bandwidth hungry gamer after all, despite the lack of high throughput for all users. We may approach this capacity anyway with LTE Advanced technology, which is well on its way to being implemented globally.

Today, most users are not actually annoyed with the speed of their wireless networks. What they perceive as a slow network is the delay associated with provisioning and setup time, which can be horribly long (one minute or more when the network is congested). This delay is not a radio link problem. You can experience these delays even when there is enough channel capacity. LTE takes care of this problem with new protocols that create an Instant On mobile Internet experience. Users will perceive faster network speeds because the overhead delays will be greatly minimized.

My guess is that DIDO is best used for small local networks like college campuses where everyone is talking or gaming with their neighbors and there are relatively few high bandwidth consumers. In a dense urban environment, with 1,000+ simultaneous users possible in a few city blocks, the technology will likely fail to deliver enough throughput for all because of the combination of excessive load and noise.

Though, hats off to Steve Perlman if he can prove me wrong. I would love to see something truly disruptive change the industry landscape.

-Vince D'Onofrio
Wireless Telecom Consultant
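Vince's point that noise chokes throughput can be made concrete with Shannon's channel-capacity formula, C = B * log2(1 + SNR); the bandwidth and SNR figures below are illustrative, not measurements of any real network:

```python
# "Noise chokes throughput" in one formula: Shannon's channel capacity
# C = B * log2(1 + SNR). Bandwidth and SNR values are illustrative.

import math

def capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon capacity for a given bandwidth and signal-to-noise ratio."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# The same 20 MHz channel in quiet vs. interference-heavy conditions:
quiet = capacity_mbps(20, 30)    # 30 dB SNR, few in-band interferers
noisy = capacity_mbps(20, 5)     # 5 dB SNR, many undesired signals

print(f"quiet: {quiet:.0f} Mbps, noisy: {noisy:.0f} Mbps")
```

Dropping the SNR from 30 dB to 5 dB cuts the ceiling by nearly a factor of five with no change in spectrum -- which is exactly the mechanism by which other in-band users limit everyone's data rates.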

Are the airwaves saturated?

I've had an ongoing discussion (years, really, since I had one of these -->) with an associate about the limits of wireless bandwidth. Like the popular counterargument to Malthus, my position is, "Humans are smart, we'll come up with ways to expand capacity of a seemingly limited resource..."

It seems this approach might be one way to make sure we can eventually have the wireless cloud so many hope for... Rearden Companies (what a great name, IMHO) suggests a set of techniques and technologies to radically expand the capacity of the wireless spectrum.