Web 2.0 Blog – Discovering Innovation Opportunities using Social Media


The future of the internet will involve more authentication than it does today, but here is a potential interim solution to provide some level of authentication for a Gov 2.0 presence on online social networks such as Facebook and Twitter. A standard policy of having a reciprocal link back to a Facebook fan page or Twitter account on the .Gov/.Mil website which that social network page points to could be a simple interim solution. I call it Reciprocal Link Authentication.

Government 2.0 includes a government presence on non-government websites such as online social networks (OSNs), think Facebook fan pages and Twitter accounts, so that citizens can encounter government guidance and assistance where they ‘live’ in cyberspace. But how can citizens be certain that the government account or representative is authentic? If you run into someone on the street and they say they are working for the government, how do you know for certain? They provide you with a badge or ID right at the beginning of the conversation.

If we encounter government workers as official government representatives in non-government cyberspace, should we also be able to see some sort of identification? Since a cyberidentity is often easier to assume than an alias in real life (especially on social networks), shouldn’t there be a way to verify the authenticity of someone claiming to represent a government? Often, government presences on OSNs, such as agency fan pages on Facebook or informational Twitter accounts, display an official seal or emblem. The problem is that it is trivial and relatively low-risk to copy or create an image of a seal or official-looking emblem and put it on an anonymous OSN account, compared to duplicating a paper credential which someone might show you in person.

The commercial solution for authentication won’t work on social network pages. Here’s why.

Commercial websites sometimes provide SSL-encrypted links to independent authentication websites (Verisign, GoDaddy, among others) to prove their authenticity. The problem with the government using this method is that it would add paperwork and costs to implement SSL badges, or require changes to existing online social networks’ profile options. Also, I don’t think there are yet products which work with OSNs and the authenticators to verify anyone on social networks. Perhaps more importantly, the government would then be depending on a commercial company to prove its authenticity. Basically it’s a non-starter if you want to actually achieve a Government 2.0 presence online in the near future, for several reasons ranging from practicality to policy to politics to costs.

But wait, there may be a much easier and better way. .Gov and .Mil websites are already monitored and checked for authenticity, unlike .com and .org sites. So you don’t need an independent cyber authenticator such as Verisign, because any .Gov or .Mil site can serve as that authenticator.

Reciprocal Link Authentication.

Why not have a simple policy that any online social network account or other non-.Gov/.Mil online presence must link to a .Gov/.Mil webpage which then links back to that same OSN account? So if someone wanted to verify a government Twitter account, they could simply click on the URL provided and easily find a link back to that same Twitter account on the .Gov/.Mil webpage they landed on. If the account is hijacked, a notice of the problem could be put up until the account identity is secured again. If this is done on all federal OSN accounts, the cybercommunity will quickly become accustomed to the authentication method, and if a hijacker removed the authentication link, visitors would know to dismiss the account. And if they see something which sounds a bit off, they can instantly verify it by following the link from the .Gov/.Mil page back to the OSN account. It would not mean much work, since online government representatives at non-.Gov/.Mil sites almost always have some .Gov/.Mil landscape under their control.
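To make the idea concrete, here is a rough sketch of what an automated check could look like: fetch the .Gov/.Mil page that a social network account points to and confirm it links back to that same account. This is only an illustration of the idea; the URLs, function names and naive link parsing are my own assumptions, not any official tool.

```python
# Rough sketch of a Reciprocal Link Authentication check (illustrative only):
# fetch the .Gov/.Mil page an OSN account points to and confirm it links
# back to that same account. URLs and the regex parse are assumptions.
import re
import urllib.request

def fetch_hrefs(url):
    """Return the set of href targets found on a page (naive regex parse)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return set(re.findall(r'href="([^"]+)"', html))

def reciprocal_link_ok(osn_account_url, gov_page_url):
    """True if the claimed .Gov/.Mil page links back to the OSN account."""
    domain = gov_page_url.lower().split("//")[-1].split("/")[0]
    if not domain.endswith((".gov", ".mil")):
        return False  # the page being pointed to must itself be .Gov/.Mil
    return osn_account_url in fetch_hrefs(gov_page_url)

# Hypothetical example:
# reciprocal_link_ok("https://twitter.com/SomeAgency",
#                    "https://www.example.gov/social-media")
```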

Reciprocal Link Authentication seems easy and low cost, and it instantly provides a universal method to authenticate any online government representation without much effort. Sure, it’s not perfect from a cybersecurity point of view, but it goes a long way toward addressing several important concerns about government representation on non-government websites.


This post is in beta. I am looking for help in better understanding the connection between policy and effort so we can discuss it at the upcoming Gov 2.0 Camp. I am not an expert by any means in this area, but I am struggling to understand the problem from a data perspective. The semantic web initiatives, and in general the goal of a collaborative government, drove me to seek this understanding of how policy is connected to effort.

One of the three things which the NAPA paper Enabling Collaboration: Three Priorities for the New Administration identifies as a barrier to a more collaborative government is ‘An inability to relate to information, and information to decision making.’ This hints at a critical problem in creating new initiatives: not having enough information to plan a path to implement them. I believe the solution is to map the connections between policy, responsibility, effort and procedure as critical pieces of data to inform decision making. This has the potential to speed progress in creating a more agile, innovative and collaborative government, just as mapping the genome has sped progress in genetics.

Specifically, I see missing connections between the policy, responsibility, procedure and effort required to create new initiatives. Let’s call it PREP (Policy-Responsibility-Effort-Procedure) data, since everyone loves an acronym. PREP is essentially a line connecting four points from a policy to the person trying to create a new initiative: from a policy at a high level, to the offices which have responsibility to ensure the policy is followed, to the procedures created by those offices, and to the effort to follow those procedures. (I am sure in reality it’s more complicated than that, but let’s keep it simple for argument’s sake.) Of course each initiative has multiple policies it must comply with, so there are multiple lines between the effort and the policies. The procedures are often interdependent, yet created independently by separate offices, often in isolation from what other offices do. In the end you have a thick mesh to get through, one that must be rediscovered for each new initiative. I have come to the conclusion that mapping PREP data is critical to creating a more collaborative, agile and innovative government.
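To illustrate (and only to illustrate), here is a toy sketch of how that mesh might be represented as data. The policy names, offices and procedures are invented, and the structure is just an assumption about one simple way to store these connections.

```python
# Toy sketch of the PREP mesh: policy -> responsible office -> procedure,
# with the effort sitting at the end of each line. All names are invented.
prep_edges = {
    # policy -> offices responsible for enforcing it
    ("policy", "Security Policy"): ["Security Office"],
    ("policy", "Privacy Policy"):  ["Privacy Office"],
    # office -> procedures that office has created
    ("office", "Security Office"): ["Security review"],
    ("office", "Privacy Office"):  ["Privacy impact assessment"],
}

def procedures_for(policies):
    """Walk each policy's line through the responsible offices to find
    every procedure a new initiative must follow."""
    procedures = []
    for policy in policies:
        for office in prep_edges.get(("policy", policy), []):
            procedures.extend(prep_edges.get(("office", office), []))
    return procedures

# A new initiative touched by both policies rediscovers the whole mesh:
# procedures_for(["Security Policy", "Privacy Policy"])
#   -> ['Security review', 'Privacy impact assessment']
```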

The problem starts with policy being handled as a 19th-century invention, that is, as an isolated document. The document is passed to the various departments with responsibility for making sure the policy is followed, and these departments create procedures to ensure that it is. When someone wants to create a new initiative or project, they need to determine all the procedures from all the policies involved and then put in the effort to follow these often disjointed procedures, which often have hidden interdependencies. This seems to be a primary cause of what we commonly call ‘bureaucracy’.

Many well-intended policies come together to produce unintended, entangled procedures which form a barrier to quickly creating new initiatives. Essentially, this is an emergent property of the many policies which have been implemented over the years, as well as the many offices created to follow them. The result is fewer or slower new initiatives, leading to less innovation and collaboration (since, almost by definition, collaborative efforts will involve new initiatives). A confounding problem is that new technology is forcing procedures to be reconsidered and policies to be reinterpreted, which adds to the complexity.

There is data on policy-to-effort connections, but it does not seem to be centrally accessible or uniformly stored. And the large differences in interpretation by individuals at every node in the PREP data, which can change with every decision, confound the problem of understanding what is really happening.

New policies to instigate new initiatives are now being issued and fast results are expected, but because of this unseen mesh which holds up execution, the top levels are frustrated that the work is not getting done. Meanwhile the people in agencies feel that they are too constrained to get the new initiatives started. Since the mesh is invisible, solutions to change the system become confusing and difficult to follow, because they normally add to the mesh rather than disentangle and streamline it.

Solution: Map the PREP (Policy-Responsibility-Effort-Procedure) data and use this map to create guidance on streamlining the implementation of policies, as well as identifying duplicate or unnecessary procedures. The data should include the ongoing or one-time person-hours involved in the effort to follow a procedure, the average calendar delay caused by the procedure, and any interdependencies with other procedures.
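As a sketch of what one PREP record per procedure might look like, with the fields suggested above (hours, calendar delay, interdependencies), here is some illustrative Python. Every name and number is made up, and the duplicate check at the end is just one example of the kind of question the data could answer.

```python
# Illustrative sketch of one PREP record per procedure, with the fields
# suggested above. All names and numbers are invented.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ProcedureRecord:
    procedure: str
    policy: str                      # policy the procedure implements
    responsible_office: str          # office that owns the procedure
    one_time_hours: float            # one-time effort to follow it
    ongoing_hours_per_year: float    # recurring effort
    avg_calendar_delay_days: float   # average schedule delay it causes
    depends_on: list = field(default_factory=list)  # interdependent procedures

records = [
    ProcedureRecord("Privacy impact assessment", "Privacy Policy",
                    "Privacy Office", 40, 8, 30,
                    depends_on=["Records review"]),
    ProcedureRecord("Records review", "Records Policy", "Records Office",
                    12, 4, 14),
    ProcedureRecord("Records review", "FOIA Policy", "FOIA Office",
                    12, 4, 14),
]

# The same procedure showing up under several policies or offices is an
# obvious first candidate for streamlining or de-duplication:
duplicates = [name for name, count in
              Counter(r.procedure for r in records).items() if count > 1]
# duplicates -> ['Records review']
```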

How? Initially, just collect the data in a standardized and centrally accessible format. It will be almost immediately useful. Use collaborative techniques to collect a lot of data quickly, even if it means lower-quality data at first. Then gradually move the data to a semantic/RDF storage system where it can be queried in many different ways and linked to broader sets of definitions such as law, case history, etc.
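For that later RDF step, a minimal sketch using the rdflib Python library might look like the following. The prep: vocabulary and property names are invented for illustration; a real effort would want an agreed, linked vocabulary instead.

```python
# Minimal sketch of moving one PREP record into RDF with the rdflib library,
# so the data can later be queried (e.g. with SPARQL) and linked to other
# vocabularies. The prep: namespace and property names are invented.
from rdflib import Graph, Literal, Namespace, RDF

PREP = Namespace("http://example.org/prep#")
g = Graph()
g.bind("prep", PREP)

proc = PREP["Privacy_impact_assessment"]
g.add((proc, RDF.type, PREP.Procedure))
g.add((proc, PREP.implementsPolicy, PREP["Privacy_Policy"]))
g.add((proc, PREP.responsibleOffice, PREP["Privacy_Office"]))
g.add((proc, PREP.oneTimeHours, Literal(40)))
g.add((proc, PREP.avgCalendarDelayDays, Literal(30)))
g.add((proc, PREP.dependsOn, PREP["Records_review"]))

print(g.serialize(format="turtle"))
```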

This will be the start of making a more agile, collaborative and data-centric government.

The challenge to this approach: Setting aside data management, which is not too bad initially, many of these hidden paths are not 100% compatible with 100% of interpretations of the policies. So how do we create a collaborative environment without people worrying that the interpretations which allow them to get things done will be ruled incorrect? It seems this needs to be a research project that can’t be looked at for that purpose.