A new study suggests that regulating websites may actually make them less trustworthy.
Jamal's research involved an analysis of the top 100 business websites in the United States before recently passed laws regulating the tracking and selling of user information took effect. Working with a colleague, Dr. Norman Bowie of the University of Minnesota, Jamal then compared the U.S. results with an analysis of the top business sites in the U.K., where a large government bureaucracy regulates and monitors the actions of online operators.
Using specialized "web-crawler" software, they navigated through the sites to learn whether the site operators were tracking their activities. Then, over a period of six months, the researchers monitored how each company used the information gathered from their online activities. Finally, they checked each site for disclosure information and cross-referenced what they found (or didn't find) with the results of their software analysis.
"We found that the unregulated websites in the U.S. function just as securely and privately as the regulated sites in the U.K. Most of the web operators in the U.S. and the U.K. were open and honest and did a good job of protecting user information," Jamal said.
"However, we also found that a small percentage of operators in both countries were cheaters, and the worst of the worst of the sites operated in the U.K,"
I'm not sure one can conclude from this study that regulation makes things worse. The fact that the worst offenders were in the U.K. could be attributed to other factors the study ignored. But the study does show that regulation doesn't make things better, and that is the point I would emphasize.
I recently finished a book called Why Most Things Fail. It was interesting, but a bit too long; only the last two chapters were really useful. Paul Ormerod, the author, compares industries to ecosystems and companies to organisms. When biological extinctions are plotted alongside corporate ones, he points out, the rates of failure and "extinction events" over time look strikingly similar. One of his major conclusions is that we don't know as much as we think we do about economics and corporate strategy, and that it is nearly impossible to consistently and accurately predict the future. Most companies, of course, mistakenly believe that it's simply a matter of getting the right strategy in place and everything else will follow. Ormerod would argue instead that you just need to start somewhere and then build feedback loops that pull in new knowledge about markets and customers and turn it into genuinely useful information. Business becomes a matter of survival plus experimentation, which, combined with good execution and a little luck, can turn out very well. But I'm off topic now…
The point in bringing up Ormerod's book is that he lambasts governments for pretending they can actually predict the outcomes of their policies. That wouldn't be a big deal if new programs could be launched and then shut down when they don't work, or at least modified, but we know from experience that government has massive inertia in the other direction. Once a government program exists, it will most likely continue to exist even after it becomes functionally obsolete. So, back to regulation on the web (and business regulation in general): government interference with business rarely makes a significant difference. I still believe what I have always believed: government regulation makes the most sense when there are significant externalities that are not factored into market prices, and even then the government's role should be to enhance the flow of information to the market, not to dictate how the market should work.
Maybe if I were as wise and omnipotent as our representatives in Washington, I would feel differently. (Yes, that was sarcasm.)