Andrew McLaughlin, deputy chief technology officer for Internet policy at the White House, worked as a top policy expert for Google before joining the administration of President Barack Obama. McLaughlin talked about transparency and democracy in his keynote.
Details of the session:
U.S. government leaders believe that a wide-open Internet promotes growth, innovation and democracy, according to Andrew McLaughlin, the deputy chief technology officer of Internet policy for the White House. He talked about openness, transparency, innovation and democracy during his closing remarks at the IGF-USA conference July 21 at the Georgetown Law Center in Washington, D.C.
He said President Barack Obama and the leaders of the federal government want to keep the Internet transparent and decentralized because they believe openness spurs creativity and discussion online.
“We’ve been trying to advance those policies,” McLaughlin said. “Openness is a normative value, which is to say a good in and of itself, but also an important network value. It helps everyone connected to the network understand what’s going on in the network.”
McLaughlin drew a sharp distinction between the regulatory model developed for telephone services and the policies being established for the Internet, warning that the Internet is not simply a successor to the telephone network. He said the public switched telephone network was a closed system: centralized, tightly controlled, built on proprietary technologies and vertically integrated.
In contrast, he said, the Internet is an open, decentralized network built in layers, where power rests at the edges of the network rather than at its core. McLaughlin said the government needs to find a way to take advantage of this "ever more cheaper, ever more powerful technology" to help promote transparency.
“Transparency can be a loosey-goosey term,” he said. “It can be related to openness in one sense. (It also) means the thing you put in is the same thing that comes out at the other end. I think transparency in the network needs to come with transparency in policymaking.”
McLaughlin said the first memorandum President Obama signed on his first day in office centered on the transparency of government, and one clear example of governmental openness is the digitizing of the Federal Register.
“We took the Federal Register and started publishing it in XML format, and when we did this, within about 24 hours a group of people at Princeton threw up a simple online application that allows you to type in search terms, and you can get e-mail or an RSS feed that pops up in your inbox any time something is published in the Federal Register that you’re interested in,” McLaughlin said. “That’s great because it’s 70,000 pages a year. It’s inscrutable. Now it’s all freely available.”
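The kind of alerting service McLaughlin described can be sketched in a few lines: parse the XML, filter entries by a search term, and notify the subscriber of matches. A minimal illustration in Python follows; the feed structure and entry titles here are invented for the example, and the real Federal Register schema differs:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample feed for illustration only; the real
# Federal Register XML uses a different, richer schema.
SAMPLE_FEED = """
<entries>
  <entry><title>Notice of proposed rulemaking on broadband</title></entry>
  <entry><title>Agricultural import schedule update</title></entry>
  <entry><title>Broadband deployment grant program</title></entry>
</entries>
"""

def matching_titles(feed_xml: str, term: str) -> list[str]:
    """Return titles of entries containing the search term (case-insensitive)."""
    root = ET.fromstring(feed_xml)
    return [
        e.findtext("title")
        for e in root.iter("entry")
        if term.lower() in (e.findtext("title") or "").lower()
    ]

print(matching_titles(SAMPLE_FEED, "broadband"))
```

A real alert service would run a query like this against each day's published XML and push the matches out by e-mail or RSS, which is essentially what the Princeton application did.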
So yes, the Internet inherently spurs innovation, creation, growth and global dialogues. But it can’t be a staid resource. McLaughlin said its continued positive evolution is integral to its future success.
“We all have an interest in keeping the Internet global,” McLaughlin said. “The Internet should be open, and the Internet should be decentralized. It is and should be treated as a layered stack.
“The Internet governance work we are doing needs to recognize that and treat each of those layers differently. The Internet needs to evolve. We need to be open to that kind of evolution and not let the Internet be hardened into its current structure. It’s breathtaking that in my lifetime this communications network has opened possibilities, enabled change and presented encouraging new horizons for the culture and for the practice and performance of democracy.”
-Colin Donohue, http://imaginingtheinternet.org
This panel, moderated by Robert Guerra of Freedom House, focused on critical Internet resources and how to ensure that the underlying principles that have led to the Internet’s success persist in the face of security challenges. These principles include openness (open standards, open technologies), accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. Key to implementing these principles is a broadened understanding of the role of infrastructure providers, such as the global and national Internet service and connectivity providers who build and operate the backbones and edge networks. The panel was also expected to address some of the implications of implementing DNSSEC and IPv6 on a national basis, which contribute to the security and resiliency of critical Internet resources globally.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to increasing security challenges, according to panelists in a critical Internet resources workshop at the IGF-USA conference July 21 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. The major challenge of our generation is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
Panelists at this workshop included:
- Moderator Robert Guerra, Freedom House
- Trent Adams, outreach specialist for the Internet Society
- Matt Larson, vice president of DNS research for VeriSign
- Steve Ryan, counsel to the American Registry for Internet Numbers
- Patrick Jones, senior manager of continuity and risk management for ICANN
- Jeff Brueggeman, vice president for public policy for AT&T
Panelists all expressed a desire to continue to engage in multifaceted talks because a single governmental entity is not the solution; it takes many people working together. As Brueggeman put it, there’s no “silver bullet” for the issue of Internet security.
“What we do on a day-to-day basis is ensure that those conversations take place,” Adams said. “The (critical Internet) resource is not a thing you can touch. You have this mesh of interconnected components that is the critical resource. You can’t pull one of those components out. Everyone must be around that table.”
So what’s the solution? The answer to that question is still a little unclear because Internet service providers and other organizations are often reactive to issues. Brueggeman said it’s time to embrace a forward-thinking approach.
“Things can get complicated when you’re reacting to an attack,” he said. “The best way to deal with these things is to try to think about them up front. How do we detect and prevent rather than react after the fact? How can we have more cooperative information sharing before attacks to try to prevent them and have the best information we can?”
Ryan stressed, though, that not all government is bad. He said citizens and organizations need to think “carefully about what the role of the government is.” But still, there should be a symbiotic relationship.
“There’s become a sense in government policy circles, including in the most sophisticated, that somehow (the Internet) runs on its own and you can’t break it,” he said. “I have news for you: You can break it. We look at government as something that has an increasingly important role because the Internet has an increasingly important role in economies.”
Ryan continued by saying non-governmental organizations have a responsibility to work with governments and to educate the people who work in them. He and the other panelists agreed that an international governmental organization wouldn’t work, though, unless core Internet values are embraced and upheld. They said a set-up that would allow countries around the world to vote on how the Internet is governed would not be a favorable solution.
“Until we get it right,” Ryan said, “I think we’re muddling along rather well.”
DNS issues and DNSSEC
Larson spoke specifically about the security of the Domain Name System because he views the DNS as an absolutely critical Internet resource. “If you don’t have the DNS, you don’t have the Internet,” he noted. He said users can’t truly trust the DNS, though, which is a bit disconcerting because of its necessity.
He supports DNSSEC (Domain Name System Security Extensions), which provides origin authentication and data integrity through digital signatures. “Once you have that, you can validate data and have a higher level of confidence that the data you’re getting back is valid,” Larson said.
(You can read more about DNSSEC here: http://en.wikipedia.org/wiki/Dnssec.)
He also said that DNSSEC makes DNS more trustworthy and critical to users as more applications—not just host names—depend on it. “We’re going to look back and realize it enabled a whole new class of applications to put information in the DNS,” Larson said. “Now you can trust the information coming out of the DNS.”
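One concrete link in the DNSSEC chain of trust Larson described is the DS record: the parent zone publishes a digest of the child zone’s public key, and a validator recomputes that digest from the child’s DNSKEY to confirm the key is the one the parent vouched for. A rough sketch of that digest computation in Python, using toy key bytes rather than a real DNSKEY record (the full procedure, including record canonicalization, is specified in RFC 4034):

```python
import hashlib

def name_to_wire(name: str) -> bytes:
    """Encode a domain name in DNS wire format (length-prefixed labels)."""
    out = b""
    for label in name.rstrip(".").split("."):
        out += bytes([len(label)]) + label.encode("ascii")
    return out + b"\x00"  # terminating root label

def ds_digest(owner_name: str, dnskey_rdata: bytes) -> str:
    """SHA-256 DS digest: hash of the owner name (wire format) + DNSKEY RDATA."""
    return hashlib.sha256(name_to_wire(owner_name.lower()) + dnskey_rdata).hexdigest()

# Toy key material for illustration only -- not a real DNSKEY record.
child_key = b"\x01\x01\x03\x08" + b"example-public-key-bytes"

# The parent zone publishes this digest as a DS record...
published_ds = ds_digest("example.com.", child_key)

# ...and a validator recomputes it from the child's DNSKEY to check the chain.
assert ds_digest("example.com.", child_key) == published_ds
print("DS digest matches:", published_ds[:16], "...")
```

Because any change to the key changes the digest, a resolver that trusts the parent zone can detect a substituted or tampered child key, which is the “higher level of confidence” Larson described.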
Going from IPv4 to a combination with IPv6
Ryan emphasized the importance of Internet Protocol version 6 (IPv6), a new Internet-layer protocol that vastly expands the address space online, allowing what he called a “gazillion numbers.” The pool of addresses remaining under IPv4 is rapidly shrinking. Ryan said the increased flexibility of IPv6 will allow for the continued growth of the Internet, but it won’t be a free-for-all.
“The numbers we have issued are not property,” he said. “We have a legal theory that’s embodied in every contract we’ve issued. They belong to the community. If you’re not using them, you have to give them back. They are in essence an intangible, non-property interest, so over the next couple of years there will be some very interesting legal issues.”
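The scale of the expansion Ryan alluded to is easy to check with Python’s standard ipaddress module: IPv4 provides 2^32 addresses, while IPv6 provides 2^128.

```python
import ipaddress

# Compare the total address space of IPv4 and IPv6.
v4 = ipaddress.ip_network("0.0.0.0/0")  # all of IPv4
v6 = ipaddress.ip_network("::/0")       # all of IPv6

print(f"IPv4 addresses: {v4.num_addresses:,}")    # 2**32, about 4.3 billion
print(f"IPv6 addresses: {v6.num_addresses:.3e}")  # 2**128, about 3.4e38
print(f"IPv6 is {v6.num_addresses // v4.num_addresses:.2e} times larger")
```

The ratio, 2^96, is why exhaustion of the IPv6 pool is not a practical concern the way IPv4 exhaustion was.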
ICANN in action
Jones said ICANN, which recently passed its 10-year milestone, has continued to work collaboratively with the community to take on major initiatives, such as the introduction of internationalized domain names in the root.
“We have taken requests from countries for internationalized country codes and approved 15,” Jones said.
“There’s a huge development in those regions of the world where you can now have domain names and an Internet that reflects their own languages and scripts. That will have an impact as discussion around critical Internet resources continues, especially in the IGF space.”
Physical critical resources
Brueggeman said AT&T takes a broader view of critical Internet resources because the company is responsible for carrying Web traffic and for the underlying infrastructure, not just for issues tied to the DNS. He said the transition to IPv6 is daunting because IPv6 is not backward-compatible with IPv4. His main challenge has been outreach to customers.
“We have to deal with a lot of traffic that’s generated as we’re making changes to DNSSEC and IPv6,” he said. “In some cases, you might create some new security concerns, but overall both are important essential transitions.”
Brueggeman emphasized that multistakeholder discussions will be important in the coming years.
“We really need all of the parties who have the direct stake at the table to be part of the solution,” he said. “We need to have the resources handled in a way that promotes openness and promotes interoperability. There’s a huge policy risk of not managing these resources in a multistakeholder way.”
-by Colin Donohue, http://imaginingtheinternet.org
IGF-USA Scenario Discussion: Panelists, participants discuss future that puts Internet governance in hands of governments worldwide
IGF participants broke into three rooms to discuss three possible future scenarios for the Internet in 2020. In this session, the description given to the discussants was: Most of us assume that the ICT industry, media companies and NGOs will continue to be the leading players on the Internet stage, with governments playing just a supporting role. This scenario describes an alternate future, where citizens and industry worldwide demand that their governments take center stage to clean up an Internet that has become infected with dangerous content and criminal conduct.
Details of the session:
Panelists and gathered participants in a scenario session at IGF-USA 2010 in Washington, D.C., expressed discouragement about an Internet future in which rapidly growing international governmental control would ultimately remove power from the ICT industry, media companies and NGOs, which now remain the Internet’s main stewards.
“A scenario is not a prediction,” said panel moderator Steve DelBianco, the executive director of NetChoice Coalition. “It’s designed to be provocative, but plausible. It’s designed to challenge your assumptions.”
Some members of the audience were skeptical that the scenario, as a whole, is plausible, but all agreed that if it became reality, it would be a frightening prospect. (Read the full description of the scenario here: http://api.ning.com/files/KeHnmv3O-PHbKeh0tKl8RaAjWl7S9siFVN8YEM6lN0ImimLqwuq6B2UlGNDtHBKp7MwNPjexPsur3DKlypEhgQ__/GlobalGovernmentfortheInternet.pdf)
DelBianco presented three converging forces that serve as drivers for the scenario:
- Consumers lose trust in online content and e-commerce.
- Businesses can no longer tolerate losses from fraud and lawsuits.
- Governments have successfully used electronic monitoring to thwart terrorist attacks.
As a result of those three forces, the scenario proposed the following about the Internet in 2020:
- Governments cooperate to oversee online content and e-commerce to a greater degree than ever before.
- Governments and businesses require biometric IDs for online users.
- Online publishers are now liable for user-generated content and conduct.
- You need an “Online License” to use the Internet.
Janice Lachance, the chief executive officer of the Special Libraries Association and an invited panelist for the session, said she is anxious about the scenario’s potential to stem the openness of the Internet.
“I think this scenario gives us all a lot to think about,” Lachance said. “As someone who has an organization that’s concerned with the free flow of information and the access to information, I think that excessive government involvement raises red flags for us. It probably isn’t all bad, but if it’s certainly getting to the point that’s described here, I fear we will have a lot of consequences if you’re trying to do business.”
Walda Roseman, the founder of CompassRose International, said she thinks there’s a “rolling thunder” toward more governmental control because of the increasing security threats facing online users. A member of the audience agreed, saying the scenario is not so unlikely because it’s happening at lower levels already.
“All of these situations are going on, just not at a tipping point,” said the participant. “I don’t think this is necessarily avoidable. I think the focus should be on how to facilitate solutions, rather than to prevent something that currently exists.”
One possible solution, according to Roseman, is to rely on more and better intergovernmental cooperation. She said it’s necessary for countries to find ways to hold more cohesive and inclusive dialogues.
“Can we shape conclusions as a world as opposed to quickly avoid them?” Roseman said. “We’re seeing a lot of collaboration among governments, and the collaboration is not yet 100 percent on the cybersecurity issues, but it’s a different alliance there. We’re wanting intergovernmental organizations to make the policy decisions and a whole lot more than the policy decisions.”
Several audience members said they don’t foresee national governments getting together on the issue of Internet governance in the near future when they can’t even come to concrete conclusions on financial regulation or climate change, for example.
So if “some bizarre world government” isn’t created to handle the issue, as one participant said, then it will fall, most likely, to the local governmental level or the United Nations. Even then, there was some articulated concern that a governmental body simply can’t respond and react in a timely fashion to any problems that may arise.
“I’m concerned about the notion of institutional competence,” said an audience member. “Does the government have the competence to run the Internet? I don’t think they have the expertise or the quickness to react.”
Markus Kummer, executive coordinator of the Internet Governance Forum, said that government shouldn’t shoulder all of the blame for ineffective policies.
“We need to avoid having a black-and-white picture of all government is bad and all the other institutions are good,” Kummer said. “I think it’s a little more complex than that. How do we find a fruitful cooperation among all the actors?”
No matter who might claim Internet governance, panelists and participants expressed concerns about the future of anonymity, security, openness and freedom of information on the Internet. They said it’s up to the people who work together through IGF to continue having conversations that could lead to a positive future. “Citizens and business have lost patience, and they need solutions,” DelBianco said. “If we don’t deliver, entities that discuss may be seen as not fast enough to solve problems. We need to show progress. Lots of organizations will have to start delivering results so we don’t get the result we don’t want. We need to avoid having the Exxon Valdez of Internet security.”
Two U.S. government employees who were part of the audience for the scenario said the United States needs to look carefully and closely at how it views and values the Internet to figure out what it truly wants and needs. “This is moving so much faster than we expected,” said a U.S. State Department participant. “Are we going to lose by maybe trying to be idealistic and assuming that everyone else is going to take on our same model? Maybe we need to get together as U.S. citizens and ask, ‘What do we absolutely want for our Internet, what do we want as a country?’ and get really clear on that so that when we start making foreign policy decisions we’re not compromising our values.”
-Colin Donohue, http://www.imaginingtheinternet.org
Response from Andrew McLaughlin, White House, to Lee Rainie’s ‘What We Don’t Know About the Future of the Internet’
“Let me drill down on one issue,” McLaughlin responded. “I am pleased Lee highlighted these architectural issues because they tend to not get a lot of attention. The Internet we have today started out as a research network that is now being treated – properly so – as critical infrastructure. The basic considerations that led to the construction of the TCP and IP protocols were to solve a set of issues that arise from the kind of data sharing done by universities. There are a lot of components that were not built in that, as Lee outlined, would potentially be quite useful for some of the activities that we would like to take place on the Internet. We are now confronting some fundamental choices about, for example, authentication on the Internet.”
He noted that to secure routing you want to know which places you do want to get packets from and which you don’t, “where, for example, malware or virus attacks might be.” Yet in places where authoritarian structures control the infrastructure, complete authentication of identity and of the origins of information becomes problematic.
One of the great features of the Internet is that it is facilitating a profound flourishing of direct citizen-to-citizen speech in places that don’t have much of a tradition of that, or that have a tradition of centralized control over information. So you would alter that architecture and build in that authentication at great peril. – Andrew McLaughlin
He noted that one of the best results of the IGF and ICANN processes is the ways in which they illuminate discussions about the architecture, protocols and principles of the Internet to a much wider audience.
“It was not exactly the original intention for ICANN,” he noted, “but it has been the effect. The project of inculcating a way of thinking about problems with the Internet architecture is profoundly important,” he said, noting that you have to know the language of the architecture to operate in today’s political and economic environment. “Without that understanding, you can’t talk intelligently about cybersecurity, how to protect privacy, how to facilitate authenticated business and governmental transactions, and so forth – well, you can talk intelligently about it, but you will be missing something.”
He noted that the efforts of multistakeholder organizations in shaping an understanding and knowledge of information networks and the people building and scaling them is important.
The role of the IGF and the value of the ICANN process extends beyond the agendas that are typically before them. – McLaughlin
“The reason these issues are conundrums – the security, authentication, privacy, identity issues,” he said, “is that the Internet is a voluntarily interconnected set of networks. There is no central controlling authority; there is no body, no government that can decree what the technical implementations of the network will be. That fact is part of the fundamental strength of the Internet, part of what made it scale so fast, part of what’s made it so powerful, part of what made it facilitate so much speech and free expression in so many surprising ways in every culture around the world.”
He said in this decentralized environment change must now be accomplished “in terms of nudges, in terms of incentives, in terms of persuasion, rather than by decree.” He added that while the Internet architecture at this point may protect the speech of a “dissident in a repressive society or in our own society,” many Internet transactions are now threatened by spoofing, DNS attacks and other threats that would not occur if we had a better authentication system.
Understanding how to be precise about those balances and how to get them implemented in a voluntarily interconnected set of networks is a central problem that confronts us over the next few years. From the [Obama] administration’s perspective, the goal of an open Internet that supports free expression, that supports the kind of array, vast wave of human creativity and free expression that we see coursing over the nerves and veins of the Internet every day – maintaining that, accelerating that, enabling that is fundamentally important. – McLaughlin
He also noted that moving forward in encouraging the principles of open government “will guide the administration’s efforts to make progress on these problems.”
He noted that the Obama administration is working to make more information accessible to everyone.
“We want to be a more open government and free the data,” he said, “to make the government a platform for citizen innovation, citizen activities, new business models, and so forth that ride over the data the government has and the taxpayers pay for. The federal government sits on a staggering amount of data, and it can be incredibly valuable if it’s made public in machine-readable formats and can be remixed and reused and combined with other kinds of data. That’s a fundamental commitment.”
-Janna Anderson, http://www.imaginingtheinternet.org