In April, Minnesota students were in the middle of their newly rolled-out Common Core assessments when the servers crashed. The state's test contractor's servers jammed, and 15,000 students were unable to either start or finish their tests. In Oklahoma, the servers of its contractor crashed and student work was lost.
There are serious questions about whether school districts have the technology necessary for students to take the tests. My town's school network is so antiquated that a single elementary school class caused the entire district's network to crash. The teacher and students were conversing via Skype, and it was too much for the network. Note: it was the entire district's network, not just the school's, that went down.
We depend on computers and the internet for much of our lives, but in the rollout of large-scale technology programs, small problems often become major ones. When the federal government is involved, the rollout becomes even more complicated. As Sean Gallagher, Ars Technica's IT editor, writes:
The rocky launch of the Department of Health and Human Services’ HealthCare.gov is the most visible evidence at the moment of how hard it is for the federal government to execute major technology projects. But the troubled “Obamacare” IT system—which uses systems that aren’t connected in any way to the federal IT infrastructure—is just the tip of the iceberg when it comes to the government’s IT problems.
Despite efforts to make government IT systems more modern and efficient, many agencies are stuck in a technology time warp that affects how projects like the healthcare exchange portal are built. Long procurement cycles for even minor government technology projects, the slow speed of approval to operate new technologies, and the vast installed base of systems that government IT managers have to deal with all contribute to the glacial adoption of new technology. With the faces at the top of agency IT organizations changing every few years, each bringing some marquee project to burnish their résumés, it can take a decade to effect changes that last.
That inertia shows on agency networks. The government lags far behind current technology outside the islands of modernization created by high-profile projects. In 2012, according to documents obtained by MuckRock, the Drug Enforcement Agency’s standard server platform was still Windows Server 2003.
Magnifying the problem is the government’s decades-long increase in dependency on contractors to provide even the most basic technical capabilities. While the Obama administration has talked of insourcing more IT work, it has been mostly talk, and agencies’ internal IT management and procurement workforce has continued to get older and smaller.
Over 50 percent of the federal workforce is over 48 years old—and nearly a quarter is within five years of retirement age. And the move to reliance on contractors for much of IT has drained the government of a younger generation of internal IT talent that might have a fresher eye toward what works in IT.
But even the most fresh and creative minds might go numb at the scale, scope, and structure forced on government IT projects by the way the government buys and builds things in accordance with “the FAR”—Federal Acquisition Regulations. If it isn’t a “program of record,” government culture dictates, it seems it’s not worth doing.
The Republican Party has done everything possible to derail the Affordable Care Act, but the problems associated with healthcare.gov are a matter of technology, and they already affect our entire federal government. The private sector takes updated networks and software for granted. The federal government cannot. Consider the Defense Finance and Accounting Service, the payroll department of the American military. Reuters has been reporting on the mistakes in the paychecks of servicemen and women.
Precise totals on the extent and cost of these mistakes are impossible to come by, and for the very reason the errors plague the military in the first place: the Defense Department’s jury-rigged network of mostly incompatible computer systems for payroll and accounting, many of them decades old, long obsolete, and unable to communicate with each other. The DFAS accounting system still uses a half-century-old computer language that is largely unable to communicate with the equally outmoded personnel management systems employed by each of the military services.
The military has systems that use COBOL.
The main reason is rooted in the Pentagon’s continuing reliance on a tangle of thousands of disparate, obsolete, largely incompatible accounting and business-management systems. Many of these systems were built in the 1970s and use outmoded computer languages such as COBOL on old mainframes. They use antiquated file systems that make it difficult or impossible to search for data. Much of their data is corrupted and erroneous.
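To make concrete why "antiquated file systems" are so hard to search, here is a minimal sketch of the kind of fixed-width flat-file record a COBOL-era payroll system typically produces. The field names, offsets, and sample data below are hypothetical, not the Pentagon's actual record layout; the point is that without the original record definition, the data is just an undifferentiated string of characters.

```python
# Illustrative sketch: legacy mainframe data is often stored as fixed-width
# records defined by a COBOL "copybook" rather than in a queryable database.
# All field names and column offsets here are invented for illustration.

RECORD_LAYOUT = [
    ("service_id", 0, 9),            # columns 1-9
    ("last_name", 9, 24),            # columns 10-24, space-padded
    ("pay_grade", 24, 28),           # columns 25-28
    ("monthly_pay_cents", 28, 37),   # amounts as zero-padded integers, no decimal point
]

def parse_record(line):
    """Slice one fixed-width record into a dict of stripped fields."""
    return {name: line[start:end].strip() for name, start, end in RECORD_LAYOUT}

record = "123456789DOE            E-5 000325000"
parsed = parse_record(record)
print(parsed["last_name"])                      # DOE
print(int(parsed["monthly_pay_cents"]) / 100)   # 3250.0
```

Every program that touches such a file must agree exactly on these offsets; change one field width and every downstream system breaks, which is one reason these formats persist for decades.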
It would be nice to think we could turn all this over to the private sector, but privatization isn't the answer. Some of the problems are endemic to federal bureaucracy: "Long procurement cycles for even minor government technology projects, the slow speed of approval to operate new technologies, and the vast installed base of systems that government IT managers have to deal with all contribute to the glacial adoption of new technology." Other problems are due to the perennial lack of money available for mundane projects. I can't find any statistics on the increased use of contractors in the federal government, but if I recall correctly, the Bush Administration tried to contract out as much work as it could. Even if it were cheaper to hire a contractor (hardly a given), we don't have enough people working for the government to develop the expertise. We need skilled, knowledgeable people in the government who can operate the systems they purchase from the private sector.
The Obama Administration certainly deserves some of the blame for healthcare.gov's failure to roll out efficiently. At the very least, there was a failure to lead on the technological aspects of the project. It doesn't take a genius to realize that launching a project involving multiple databases, millions of users, and new technology requires rigorous testing beforehand. But given the state of the federal government's IT infrastructure, the failure is hardly a surprise. If Obama hadn't fought for the ACA, we wouldn't even be aware of the sad state of our government's technology. Perhaps now is a good time to look at the ultimate reason for healthcare.gov's shortcomings.
mike_cote says
COBOL was one of my best languages at university. Both COBOL and Pascal (and occasionally Fortran and LISP) were the cutting edge before C and C++ ruined the fourth-generation language progression.
Mark L. Bail says
I remember my friends taking courses in Pascal. It’s amazing, however, that it’s still being used.
SomervilleTom says
C and C++ distracted attention for a while, but the industry has moved far beyond COBOL and Pascal.
The web runs on a technology stack that takes for granted many of the ideas that were new when COBOL and Pascal were in use. “Object oriented”, in particular, is simply the way things are.
JavaScript, Python, Perl, PHP, Java, and C++ are ubiquitous (except, apparently, in the military).
It is duplicitous to spend decades slashing government and "privatizing" government IT and software engineering, and then complain when we end up with a government that has such limited ability to provide modern information support.
Anybody who’s been around private industry very long knows that “enterprise-scale” projects (where failure threatens the very existence of the enterprise) are hoary, hairy, and ENORMOUSLY expensive. Even the best of those efforts fail with some regularity. A lie that needs to be called out as a lie is, therefore, the assertion that “privatizing” is some silver bullet that solves government information system ills.
We live in an information era. The NSA scandal demonstrates that the government is able to do some things exceedingly well — not surprisingly, generally things that the government views as being very much in its own self-interest.
We need to make across-the-board excellence in information technology central to the self-interest of our government. We need to invest in the resources (human and physical) to make that happen. We need to increase tax revenues enough to fund it.
A good place to start looking for those increased tax revenues is the group of people who have collected immense — staggering — wealth from privatizing government information services.
David says
The best comment on the whole healthcare.gov business was a response on Twitter to a report that President Obama had told CNN that information technology was one of the areas in which the private sector had far outpaced the public sector. Someone responded, "the NSA seems to be doing pretty well with their Commodore 64s." Made me LOL.
jconway says
It's still being taught. A friend now working at BBN/Raytheon studied it at WPI.
mike_cote says
COBOL has been reported as a dead language many times over.
johnk says
Google ran a doodle game over the weekend to celebrate the 50th anniversary. You can see the game here:
https://www.google.co.nz/
mike_cote says
I can’t believe this weekend is finally here!
danfromwaltham says
Obama is the one who wouldn’t budge on the start date. And nobody has been held accountable. Even worse, it isn’t secure.
$600 million wasted. If you talk to experts in this field, a site like this should cost only a fraction of that amount.
http://m.washingtonpost.com/politics/private-consultants-warned-of-risks-before-healthcaregovs-oct-1-launch/2013/11/18/9d2db5f4-5096-11e3-9fe0-fd2ca728e67c_story.html
jcsinclair says
As someone who works in the federal IT area, the $600M figure has always seemed ridiculously high to me, so I did a little searching and found the press release announcing the awarding of the contract. $93.7M over 5 years, with probably only $56M actually released to the contractor to date. Still pretty high, but more believable than $600M.
My understanding is the $600M figure included work performed by the contractor that wasn’t related to healthcare.gov.