In May of 2011, I presented a back-end web services architecture for the U.S. Department of Health and Human Services' Centers for Medicare & Medicaid Services (CMS) to the CTO, Mark Hogle. If the contractors for the public portal had followed the technical reference architecture (TRA) as it was written and approved by the CTO, the current concerns about unsecured back-end data in transit would not be warranted.
Specifically, the reference architecture I developed called for the following:
Where the highest level of practical protection is called for, the architecture should include encryption of all message fields. XML Encryption (and decryption) requires fully parsing the XML transaction and then, for the selected message sections, performing a set of processing-intensive XML and cryptographic operations. Deploying both XML Encryption and XML digital signatures can significantly affect the performance of high-transaction applications because of their resource-intensive nature; this can be mitigated by using hardware (an appliance, for example) rather than a software-based solution.
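To make that concrete: the JDK ships an XML digital signature API (javax.xml.crypto.dsig), so the signing half can be sketched without any third-party library. The sketch below signs a whole document with an enveloped signature; the file name and the freshly generated RSA key pair are placeholders, since in a real deployment the key would live in a keystore or in the hardware appliance mentioned above. (XML Encryption itself isn't in the JDK; Apache Santuario is the usual implementation, so only the signing side is shown here.)

```java
import javax.xml.crypto.dsig.*;
import javax.xml.crypto.dsig.dom.DOMSignContext;
import javax.xml.crypto.dsig.keyinfo.KeyInfo;
import javax.xml.crypto.dsig.keyinfo.KeyInfoFactory;
import javax.xml.crypto.dsig.spec.C14NMethodParameterSpec;
import javax.xml.crypto.dsig.spec.TransformParameterSpec;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.FileInputStream;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Collections;
import org.w3c.dom.Document;

public class SignEnvelope {
    public static void main(String[] args) throws Exception {
        // XML-DSig requires a namespace-aware parse of the transaction.
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder().parse(new FileInputStream("envelope.xml"));

        // Placeholder key pair; production code would load it from a keystore or HSM.
        KeyPair kp = KeyPairGenerator.getInstance("RSA").generateKeyPair();

        XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");

        // Reference the whole document (URI "") with the enveloped transform,
        // so the Signature element itself is excluded from the digest.
        Reference ref = fac.newReference("",
                fac.newDigestMethod(DigestMethod.SHA256, null),
                Collections.singletonList(
                        fac.newTransform(Transform.ENVELOPED, (TransformParameterSpec) null)),
                null, null);

        SignedInfo si = fac.newSignedInfo(
                fac.newCanonicalizationMethod(CanonicalizationMethod.INCLUSIVE,
                        (C14NMethodParameterSpec) null),
                fac.newSignatureMethod("http://www.w3.org/2001/04/xmldsig-more#rsa-sha256", null),
                Collections.singletonList(ref));

        // Publish the verification key inside the message via KeyInfo.
        KeyInfoFactory kif = fac.getKeyInfoFactory();
        KeyInfo ki = kif.newKeyInfo(Collections.singletonList(kif.newKeyValue(kp.getPublic())));

        // Compute the signature and attach the <Signature> element to the root.
        fac.newXMLSignature(si, ki).sign(new DOMSignContext(kp.getPrivate(), doc.getDocumentElement()));
    }
}
```

This is exactly the processing-intensive parse-canonicalize-sign pipeline the TRA language describes, which is why offloading it to an appliance pays off at high transaction volumes.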
What are the implications of ignoring this (common-sense?) policy? The classic one is the man-in-the-middle attack, a form of active eavesdropping in which the attacker makes independent connections with each target and relays messages between them, making the targets believe they are talking directly to each other over a private connection. Along the way, data can be read, modified, or stolen outright.
Another problem is repudiation: where did this message originate? Without non-repudiation, a provider cannot guarantee that a party to an SLA is unable to deny the authenticity of their signature on a document, or to deny having sent a message that they originated. Non-repudiation makes electronic signatures trustworthy by ensuring that a person cannot later deny having furnished the signature. Any financial transaction needs this.
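Non-repudiation is exactly what the XML digital signature above provides, and checking it is cheap relative to producing it. Here is a minimal verification sketch using the same JDK XML-DSig API; the method name isAuthentic is mine, and in practice the sender's public key would come out of band, e.g., from a certificate exchanged when the SLA was set up.

```java
import javax.xml.crypto.dsig.XMLSignature;
import javax.xml.crypto.dsig.XMLSignatureFactory;
import javax.xml.crypto.dsig.dom.DOMValidateContext;
import java.security.PublicKey;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class VerifyEnvelope {
    // doc: the parsed, namespace-aware message; senderKey: the sender's public
    // key, obtained out of band (e.g., from their X.509 certificate).
    static boolean isAuthentic(Document doc, PublicKey senderKey) throws Exception {
        NodeList nl = doc.getElementsByTagNameNS(XMLSignature.XMLNS, "Signature");
        if (nl.getLength() == 0) {
            return false; // unsigned message: origin cannot be proven
        }
        DOMValidateContext ctx = new DOMValidateContext(senderKey, nl.item(0));
        XMLSignature sig = XMLSignatureFactory.getInstance("DOM").unmarshalXMLSignature(ctx);
        // validate() checks the signature value and every reference digest;
        // a true result binds the message content to the holder of the private key.
        return sig.validate(ctx);
    }
}
```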
Plus, there's a bonus! The TRA specified performance testing! The poor-performance issues (those, of course, that are not client-specific, such as poor HTML coding) would never have made it from the test lab to deployment. In the TRA, CMS mandates Web Services testing and performance engineering. Specifically, these processes should use a systematic, quantitative approach to building Web Services that meet both business and performance objectives. While crafting software to meet business objectives is the developer's primary focus, performance engineering should also map the critical use cases to explicit performance objectives: response time, throughput, resource utilization, and workload.
Web Services testing should focus on regression testing and benchmarking against the stated performance goals for individual services; the UDDI directory should be used to document those goals. The purpose of such testing is to demonstrate that a service meets its performance criteria, so it should include both load and stress testing.
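As a sketch of what benchmarking against a stated goal can look like, here is a small harness using the JDK HTTP client. The endpoint URL, request count, and the 500 ms p95 goal are all hypothetical; real goals would come from the service's documented SLA, which the TRA says belongs in the UDDI directory.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ServiceBenchmark {
    // Hypothetical endpoint and goal; real values come from the service's SLA.
    static final String ENDPOINT = "https://example.gov/eligibility/check";
    static final Duration P95_GOAL = Duration.ofMillis(500);
    static final int REQUESTS = 200;

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(ENDPOINT))
                .timeout(Duration.ofSeconds(10)).GET().build();

        List<Long> latenciesMs = new ArrayList<>();
        long start = System.nanoTime();
        for (int i = 0; i < REQUESTS; i++) {
            long t0 = System.nanoTime();
            client.send(request, HttpResponse.BodyHandlers.discarding());
            latenciesMs.add((System.nanoTime() - t0) / 1_000_000);
        }
        double elapsedSec = (System.nanoTime() - start) / 1e9;

        // Report throughput and the 95th-percentile response time.
        Collections.sort(latenciesMs);
        long p95 = latenciesMs.get((int) (latenciesMs.size() * 0.95) - 1);
        System.out.printf("throughput: %.1f req/s, p95: %d ms%n",
                REQUESTS / elapsedSec, p95);

        // Regression gate: fail the run if the stated goal is missed.
        if (p95 > P95_GOAL.toMillis()) {
            throw new AssertionError("p95 " + p95 + " ms exceeds goal "
                    + P95_GOAL.toMillis() + " ms");
        }
    }
}
```

Run as a gate in the test lab, a harness like this is precisely what keeps a service that misses its goals from ever reaching deployment.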
The rationale for measuring Web service performance is threefold:
• Consumers need to know the response times and throughput they can expect from a service's APIs.
• Capacity planners need to know a service's resource demands under different workloads.
• SLAs and other contractual obligations rely on measured performance as a key term.
Nobody wants problems with the President's attempt at reforming the health care insurance marketplace in this country. But just by applying the existing design constraints at the outset, HHS/CMS would have plugged another hole in the leaky dike that the Health Care Exchange has become, before any drips started.