Posts Tagged ‘Verisign’
IGF-USA 2012 Case Vignettes: Turning Principles into Practice – Or Not: Internet Governance/ICANN; Consumer Privacy; Cyber Security; Dialogues about Lessons Learned
Brief session description:
Thursday, July 26, 2012 – This workshop was aimed at examining the role principles are playing in framing debates, achieving consensus and influencing change – or not. Proposals for Internet principles are popping up everywhere, from national to regional and global discussions, on a wide range of issues. In 2011, IGF-USA examined a number of principles in a session titled “A Plethora of Principles.” This session follows on that one. Session planners noted that it’s not enough to simply develop a set of principles; the question is: how are principles actually implemented, and how are they inspiring change? Are they new voluntary codes of conduct, new regulations, new laws? Principles can become a baseline for gaining high-level agreements. They may go beyond the expectations possible through legislation or regulation, so some argue that principles should be written to be aspirational. Others argue for legislation, regulation or enforcement mechanisms to ‘hold industry accountable’ to promises made in principles designed as sets of commitments. This workshop examined three case vignettes: 1) how the principles of a white paper were incorporated into ICANN’s formation and what the status of those principles is today within ICANN’s mission and core activities; 2) how consumer privacy principles have fared in global and national settings in terms of these points ‘turning into practice’; and 3) how cybersecurity/botnet principles are faring.
Details of the session:
The moderator for this session was Shane Tews, vice president for global public policy and government relations at Verisign. Panelists included:
- Becky Burr, chief privacy officer, Neustar Inc.: Turning White Paper Principles into actuality in ICANN
- Maneesha Mithal, associate director of the division of privacy and identity protection, Federal Trade Commission: Consumer privacy principles
- Eric Burger, director of the Georgetown University Center for Secure Communications: Cybersecurity and botnets
- Carl Kalapesi, co-author of the World Economic Forum’s report Rethinking Personal Data: Strengthening Trust: the World Economic Forum perspective
Before an informal agreement, policy or formal regulation is adopted, passed or approved, it takes its initial steps as an idea. The trick lies in bringing it from a formative state to something actionable; otherwise it may languish as a suggested goal that no one follows or adheres to.
During the IGF-USA panel titled “Turning Principles into Practice – or Not,” participants shared successful case studies as examples of how to create actionable practices out of ethereal goals. Citing processes ranging from US efforts to counteract botnets to domain name system governance to consumer privacy, three panelists and one respondent drew from their own experiences in discussing ways in which people might successfully bridge the gap between idea and action.
Maneesha Mithal, associate director of the Federal Trade Commission’s Division of Privacy and Identity Protection, weighed in on the efficacy of principles versus regulation by offering a series of methods for acting on a problem.
“It’s not really a binary thing – I think there’s a sliding scale here in how you implement principles and regulation,” she said. She cited corporate self-regulatory codes, the work of international standard-setting bodies, multistakeholder processes, safe harbors and legislation as possible means for action.
Mithal highlighted online privacy policies as an example of the need for a sliding scale. The status quo has been to adhere to the concepts of notice and choice on the part of consumers; this has resulted in corporations’ creation of lengthy, complicated privacy policies that go unread by the consumers they are meant to inform. Recently, pressure has been placed on companies to provide more transparent, effective means of informing customers about privacy policies.
“If it had been in a legislative context, it would have been difficult for us to amend laws,” Mithal said, though she admitted that such flexible agreements are “sometimes not enough when you talk about having rights that are enforceable.”
And Mithal did note that, given the current climate surrounding the discussion of online privacy, it is time for a degree of broad-based privacy legislation in America.
Eric Burger, a professor of computer science at Georgetown University, spoke on the topic of botnets, those dangerous cyber networks that secretly invade and wrest control of computers from consumers, leaving them subservient to the whims of hackers looking for a challenge, or criminals looking for the power to distribute sizable amounts of malware.
Given the sheer number of stakeholders – ISPs concerned about the drain on their profits and the liability problems posed by the illegal information shared through the botnets, individual users concerned over whether their computers have been compromised and government agencies searching for a solution – Burger said that the swift adoption of principles is the ideal response.
Among those principles are sharing responsibility for the response to botnets, admitting that it’s a global problem, reporting and sharing lessons learned from deployed countermeasures, educating users on the problem and the preservation of flexibility to ensure innovation. But Burger did admit the process of arriving at this set of principles wasn’t without its faults. “Very few of the users were involved in this,” he said, citing “heavy government and industry involvement, but very little on the user side,” creating a need to look back in a year or two to examine whether the principles had been met and whether they had been effective in responding to the swarm of botnets.
Becky Burr, chief privacy officer and deputy general counsel at Neustar, previously served as the director of the Office of International Affairs at the National Telecommunications and Information Administration (NTIA), where she had a hands-on role in the US recognition of ICANN. She issued a play-by-play of the lengthy series of efforts to turn ICANN from a series of proposed responses into a legitimate governing entity, an effort largely aided by a single paragraph in a framework issued by President Bill Clinton’s administration in 1997.
Written as a response to the growing need for groundwork on Internet commerce and domain names, the paper called for a global, competitive, market-based system for registering domain names, one that would encourage Internet governance to move from the bottom up. The NTIA subsequently issued the so-called “Green Paper,” which echoed many of the principles of the administration’s framework and drew extensive feedback from around the world, including negative feedback over the suggestion that the US government add up to five gTLDs during the transitional period.
After reflection on the feedback to both the white and green papers, and a series of workshops among multiple stakeholders to flesh out the principles of stability, competition, private-sector leadership, bottom-up governance and realistic representation of the affected communities, ICANN held its first public meeting Nov. 14, 1998. It underwent several reforms in 2002 and, in Burr’s words, “is still the best idea, or at least no one’s figured out a better idea.”
“The bottom line is to iterate, make sure you articulate your principles and try to find some built-in self-correcting model,” Burr said.
While Burr’s play-by-play described how a relatively independent, formal institution was formed to offer DNS governance, Carl Kalapesi, a project manager at the World Economic Forum, offered a more informal approach, relying on the informal obligations tied to agreeing with principles to enforce adherence.
“Legislative approaches by their nature take a very, very long time,” Kalapesi said. He vigorously supported the importance of principles in offering “a common vision of where we want to get to,” which leaders can sign onto in order to get the ball rolling.
He offered the example of the “Principles of Cyber Resilience,” offered to CEOs at last year’s World Economic Forum with the goal of making them more accountable for the protection of their own networks and sites while still allowing them flexibility to combat problems in a way that best suited their own work-flow and supply chains.
Central to Kalapesi’s argument in favor of principle-based solutions is their flexibility.
“Half of the uses of data didn’t exist when the data was collected – we didn’t know what they were going to do with it,” he said, alluding to the concerns over the use of private data by the likes of Google and Facebook, which accelerate and evolve at a rate with which formal legislation could never keep up.
Burr later echoed this point in theorizing that 1998’s Child Online Protection Act might soon be obsolete, but Mithal remained firm that a “government backstop” should be in place to ensure that there’s something other than the vague notion of “market forces” to respond to companies that step back from their agreements.
— Morgan Little
The Internet and the Web are continuing to expand at exponential rates. When the board of the Internet Corporation for Assigned Names and Numbers opened up a whole new world of names for Internet addresses with its historic vote in June 2011, new gTLDs and their implications for users became extremely important. This session explored the Internet users’ experiences that might be expected as the Domain Name System (DNS) is prepared to undergo a massive expansion, adding hundreds or even a thousand new gTLDs to “allow for a greater degree of innovation and choice.”
Details of the session:
Every time an individual pulls up a webpage, the Domain Name System is used. Moderators and industry leaders who met at an IGF-USA 2011 workshop say changes announced by ICANN this summer will bring new challenges and opportunities. Generic top-level domains, also known as gTLDs, were previously quite limited. They included .com, .info, .net and .org. On June 20, 2011, the board of the Internet Corporation for Assigned Names and Numbers (ICANN) voted to allow companies and organizations to choose any fitting suffix for their domain names. The new gTLDs will be operational in 2013. Among the likely names are .sport, .bank and .app.
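In DNS terms, the gTLD is simply the final label of a domain name. A minimal Python sketch of pulling it out (the helper name and example hostnames are illustrative, not from any real registry tooling):

```python
def gtld(hostname: str) -> str:
    """Return the top-level domain: the label after the last dot."""
    # Strip a trailing root dot (fully qualified names end in "."),
    # then split off everything before the final label.
    return hostname.rstrip(".").rsplit(".", 1)[-1].lower()

# A classic gTLD alongside the kind of name expected under the expansion:
print(gtld("example.com"))         # com
print(gtld("news.example.sport"))  # sport
```

Under the 2011 decision, that final label is no longer restricted to a short fixed list; it can be a generic word, a brand or a geographic name.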
The moderator of the event was Frederick Felman, chief marketing officer for Mark Monitor, a major domain management company based in the United States. Panelists included:
- Suzanne Radell, senior policy adviser in the office of international affairs at the U.S. National Telecommunications and Information Administration
- Amber Sterling, senior intellectual property specialist for the Association of American Medical Colleges
- Pat Kane, senior vice president for naming services for Verisign
- Jon Nevett, co-founder and executive vice president of Donuts Inc. and president of Domain Dimensions, LLC, a consultancy on domain name issues
- Brian Winterfeldt, partner at the Washington, D.C., law firm Steptoe & Johnson, where he is a member of the intellectual property group
- Ron Andruff, president and CEO of DotSport, managing the new top-level domain .sport – http://www.dotsportllc.com/about
The panelists speculated that as few as 500 and as many as 2,000 domain names could be added in the near future as ICANN opens its application pool up in January 2012. These new names can range from generic names like .pizza, brand names like .apple or geographic names like .London.
“Sports is one of those unique things,” said Andruff. “Like music, [it] transcends borders, transcends languages, transcends cultures. It is relevant.”
It is important that we allow multiscript applications so we can reach all people of all languages, he said.
ICANN’s decision to open up the applicant pool is still relatively new to the general public, which could lead to confusion, said Felman.
The general population is beginning to join in the conversations, Kane explained. But Radell cautioned that the government is very concerned about the potential for fraud and general user confusion. When something goes wrong, people are going to turn to their government to ask why this was allowed to happen, she said.
Members of the Governmental Advisory Committee (GAC) worked very closely with ICANN to make sure safeguards were put into place to protect the users, Radell added.
One audience member asked how something like .bank would affect his ability to access his bank’s website. He questioned how the URL would be structured, and how Google Chrome users, who don’t use a URL at all, only a search bar, would access the sites. The panelists agreed that expectations for end users are still being developed.
Non-profits are another group that could have some trouble with the new domain names, said Sterling. In the past 15 years, non-profits have seen more donations arrive through the Internet, but they have also seen the Internet abused in the process.
Brand owners are concerned about the fraud that could occur with more domain names, and about what happens if multiple groups apply for the same domain name, said Winterfeldt. There is mediation through ICANN, and brand owners will be notified if their domain is being sought by another company.
Another concern is whether the increase in domain names would lead to another .com bubble and fizzle out. “In essence, whether they survived was not the point,” said Hedlund. “It’s about adding competition and how the market responds.”
– Anna Johnson
IGF participants broke into three different rooms to discuss three possible future scenarios for the Internet in 2025. In this session, the brief description given to the discussants noted that the “Government Prevails” scenario imagines a future affected by man-made and natural challenges and disasters – wars, civil strife, an aging world and interventionist governments. This scenario assumes that while “the ICT industry, media companies and NGOs” are the leading players on the Internet stage today [some people might disagree with this assumption], by 2025 governments and inter-governmental organizations will have come to rule the Internet as a result of actions taken to protect particular interests from negative exposure.
Details of the session:
A small group of Internet stakeholders from various sectors met to discuss the Government Prevails potential-future scenario at the Internet Governance Forum-USA 2011 at Georgetown University Law Center.
This scenario sets up a closed-off future for the Internet.
The potential future drivers of change people were asked to consider included:
- Manmade and natural disasters push governments to exert more control over Internet resources.
- Changes in the Domain Name System force intergovernmental organizations to impose new global regulatory regimes.
- Networked image sensing through devices such as Kinect and GPS is used to identify and track people, with positive and negative effects, but the net result is a global surveillance culture.
- Governments limit bandwidth for video conferencing when they find revenues for hotels, airlines and other travel-related economic entities in sharp decline.
- Lawsuits and other developments cause governments to create blacklists of websites prohibited from Internet access.
- Anonymity on the Internet is brought to an end as a response to viruses, worms and credit card fraud, and user authentication is required.
- Governments take every opportunity to coordinate and consolidate power under various mandates for global solutions and by 2025 governments and law enforcement are deeply embedded in all aspects of the Internet.
NetChoice Executive Director Steve DelBianco began the session by sharing the drivers of this future and what the Internet might look like in 2025.
“The scenario at its key is an attempt to be provocative about a potential future,” said DelBianco, who emphasized this session was supposed to search for what could be plausible and to develop opinions on the possible benefits and disadvantages of a future and what could be done to mitigate its impact.
“Is this the George Orwell scenario where it is a question of not whether but when?” Roseman said.
Although there was a list of questions the leaders intended to discuss, the session quickly turned into a running debate, bouncing from topic to topic as the participants introduced them. Two main themes quickly emerged.
The first was the conflict between security versus privacy.
Carl Szabo cited the situation in London, where hundreds of security cameras were added to city streets with the intention of reducing crime. The result was criminals adapting to the increased surveillance by wearing hooded sweatshirts.
“As we give away these rights and privileges for alleged increased security, it’s not necessarily going to return with security,” he said.
Slava Cherkasov, with the United Nations, brought up the recent case of Brooklyn boy Leiby Kletzky, who was allegedly abducted, murdered and dismembered by a stranger, Levi Aron. In that case, it was a security camera outside a dentist’s office that led to Aron’s arrest, confession and the recovery of the boy’s body within an hour of viewing the footage.
Judith Hellerstein, with the D.C. Internet Society, said that government use of data is acceptable when there is an understanding about privacy and intent.
“You also have to sort of figure out how governments are going to use that technology in hand,” she said.
The scenario introduced an issue, based on real events, in which pictures of protesting crowds were tagged, allowing for the identification of people at the scene of a potential crime.
Elon University student Ronda Ataalla expressed concern over limiting tagging in photographs, because it was a limit on expression.
But David McGuire of 463 Communications reminded the room that civil liberties traditionally don’t poll well.
“Free speech isn’t there to protect the speech we all like,” he said.
DelBianco expanded the tagging issue to raise the issue of “vigilante justice,” people using debatably privacy-violating practices to identify people they consider wrong-doers, and brought up Senate Bill 242 in California, which would have altered the way social networks create default privacy settings for users. The bill was narrowly defeated, 19-17, on June 2.
Chris Martin with the USCIB talked about how not all companies are interested in using their technology for ill or personal gains, listing Google and their withholding of the use of facial recognition technology to protect people’s privacy.
This subject is also related to the second main discussion topic: the government versus industry and the private sector.
Covington questioned Martin about whether he saw governments developing that same facial recognition technology, as described in the scenario, and using it to monitor citizens.
“Some,” was his reply, before adding that all Internet governance was about maximizing good and minimizing evil.
There was then a brief discussion about the Patriot Act and relinquishing civil liberties online in the circumstances of a national emergency. Who decides when the emergency has passed?
Szabo and others questioned if the government was even the right organization to take over in the event of a disaster.
“It’s much easier to say, ‘Let them deal with it so I don’t have to,’ but the question is, ‘Will they do it better?’” he said.
Cherkasov said not necessarily, mentioning that when Haiti was struck by the severe earthquake in January 2010, it took two weeks for government organizations to develop a database to search for missing people, but in Japan in March 2011, it took Google only 90 minutes to come up with the same technology. He then returned to the security camera situation, concluding that citizens were the first line of response and information in a disaster scenario.
“There will always be maybe an ebb and a flow but it’s the power of the people that will ultimately be able to create that balance,” Roseman said. “But it’s going to have to be a proactive effort to get and keep that balance.”
Roseman also said one of the benefits of industry and the private sector was an ability to use funds more freely than the government, which presumably operates on a limited budget.
“When you have governments and the private sector and industry working together, you generate a lot more money and opportunity to drive change,” she said.
McGuire, though, expressed concern that industry and the private sector have some misconceptions about the power of the Internet, believing that it is too powerful for any law or government to cut it down. He said many, including those in the area of Silicon Valley, Calif., think the Internet will always be able to circumvent policy.
Most session participants seemed to agree that the potential scenario was troubling.
“It makes me want to move to somewhere where there are more sheep than humans,” joked Covington.
But Brett Berlin, of George Mason University, said that the Internet, and the choices that are made about governing it, are ultimately people-driven decisions, reminding the rest of the room that technology works for people and not the other way around.
“If we are foolish enough to think that open Internet will fundamentally allow us to be better, we are making a mistake.”
– Rachel Southmayd
Data Retention; privacy; security; geo-location; mobility; government/law enforcement cooperation; transnational location issues: these are among the emerging cloud computing challenges in Internet Governance. Promoted by industry and government alike, “the cloud” seems to be the answer in providing emerging online services – addressing costs; access; diversity of infrastructure; reliability; and security. Yet its extremely distributed nature raises Internet governance questions. This workshop addressed the Internet governance questions facing cloud computing, including the emergence of the mobile cloud.
Details of the session:
Where the cloud’s data is located, who has access to it and what happens if it’s breached took center stage during the cloud computing workshop at the IGF-USA conference July 18 in Washington, D.C.
The moderator for the session was Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum.
Panelists included a range of industry, governmental and civil organization representatives:
- Jeff Brueggeman, vice president of public policy for AT&T
- Danny McPherson, chief security officer for Verisign
- Amie Stepanovich, Electronic Privacy Information Center
- Marc Crandall, product counsel for Google
- John Morris, general counsel and director of Internet standards, Center for Democracy & Technology (CDT)
- Fred Whiteside, director of cybersecurity operations for the U.S. Department of Commerce, and National Institute of Standards and Technology Target Business Use Case Manager
- Jonathan Zuck, president of the Association for Competitive Technology (ACT)
Georgetown University professor Mike Nelson said governments are happy to use the cloud because it would enable government applications to work better, and it would save money. But many insist that the data stay within the host country.
“Tensions over government controls on cross-border data flows are often caused by the desire for more privacy for citizens in their country versus the global cloud,” he said. “How do we get to a global cloud that is actually globalized, where data is allowed to move wherever it wants to and yet have the private assurances we’ve had in the past?”
There are many who believe location equals control, said Marc Crandall of Google. But that is not always the case when entering various servers and using a resource like the cloud.
“So location may not necessarily equal control,” Crandall said. “The thing about the cloud is I tend to feel that location does not necessarily equal secure. Where something is located doesn’t make it any more or less secure.”
Having governments worry about security standardization and privacy would be a better focus, he said.
Jonathan Zuck, president of the Association for Competitive Technology, said people need to begin to focus on international citizenry in regards to the cloud. It’s not about where the cloud is located or whose cloud consumers are using, but about looking at a larger, more competitive group of providers.
And where data are located can raise concerns about who has access to that information. If the data are located in a country with little judicial review or fewer privacy regulations, will users’ information be at risk?
“There should be an emerging global standard,” said Jeff Brueggeman, vice president of public policy for AT&T. “As to privacy, the more we improve international cooperation on cybersecurity and law enforcement so that there is more comfort over legitimate concerns that if the data is not stored can they go after a bad guy. But again we have to deal with real issues as well as setting up the right policies to help distinguish between legitimate concern and government overreaching.”
If there is a breach and private information has been hacked, as has been seen in recent attacks against Google and Sony, what should the companies do to be transparent but also uphold their legal obligations?
If an organization is hacked and information is stolen, but that’s not made known publicly, it could be a violation of fair disclosure, said Danny McPherson, chief security officer of Verisign.
“Lots of folks don’t share that type of information,” he said. “Every state or region or nation or union has different native laws and that is extremely problematic in that perspective.”
There are many times that information may not be classified but is of a private nature, such as trade agreements that would need to stay confidential, said Fred Whiteside, director of cybersecurity operations for the U.S. Department of Commerce. It is complex, he said, and as someone who hears many classified discussions on security breaches, he added that it would trouble him for sensitive information to be made public.
Amie Stepanovich, of the Electronic Privacy Information Center, said businesses and industries should start worrying about encrypting information before it is hacked, rather than worrying about the cost-benefit analysis.
“I think the benefit of data encryption is really worth it,” she said. “It’s been proven again and again. Companies feel somehow they have to touch that burner to see if it’s hot before they move to that.”
Regardless, while the focus has been on the concerns and security issues surrounding the cloud, there are many benefits that should receive their due credit.
“I think the fact we are all here is a testament to the cloud,” she said. “Or else we wouldn’t be so concerned with what the problems are if we didn’t recognize there are so many benefits of the cloud.”
– Anna Johnson
This panel, moderated by Robert Guerra of Freedom House, focused on critical Internet resources and how to ensure that the underlying principles that have led to the Internet’s success persist in the face of security challenges. These principles include openness (open standards, open technologies), accessibility, transparency, bottom-up decision-making, cooperation and multi-stakeholder engagement. Key to implementing these principles is also a broadened understanding of the role of the infrastructure providers, such as global and national Internet services/connectivity providers who build and operate the backbones and edge networks. The panel was also expected to address some of the implications of implementing DNSSEC and IPv6 on a national basis, which contribute to the security and resiliency of critical Internet resources (CIR) on a global basis.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to increasing security challenges, according to panelists in a critical Internet resources workshop at the IGF-USA conference July 21 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. The major challenge of our generation is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
Panelists at this workshop included:
- Moderator Robert Guerra, Freedom House
- Trent Adams, outreach specialist for the Internet Society
- Matt Larson, vice president of DNS research for VeriSign
- Steve Ryan, counsel to the American Registry for Internet Numbers
- Patrick Jones, senior manager of continuity and risk management for ICANN
- Jeff Brueggeman, vice president for public policy for AT&T
Panelists all expressed a desire to continue to engage in multifaceted talks because a single governmental entity is not the solution; it takes many people working together. As Brueggeman put it, there’s no “silver bullet” for the issue of Internet security.
“What we do on a day-to-day basis is ensure that those conversations take place,” Adams said. “The (critical Internet) resource is not a thing you can touch. You have this mesh of interconnected components that is the critical resource. You can’t pull one of those components out. Everyone must be around that table.”
So what’s the solution? The answer to that question is still a little unclear because Internet service providers and other organizations are often reactive to issues. Brueggeman said it’s time to embrace a forward-thinking approach.
“Things can get complicated when you’re reacting to an attack,” he said. “The best way to deal with these things is to try to think about them up front. How do we detect and prevent rather than react after the fact? How can we have more cooperative information sharing before attacks to try to prevent them and have the best information we can?”
Ryan stressed, though, that not all government is bad. He said citizens and organizations need to think “carefully about what the role of the government is.” But still, there should be a symbiotic relationship.
“There’s become a sense in government policy circles, including in the most sophisticated, that somehow (the Internet) runs on its own and you can’t break it,” he said. “I have news for you: You can break it. We look at government as something that has an increasingly important role because the Internet has an increasingly important role in economies.”
Ryan continued by saying non-governmental organizations have a responsibility to work with governments and to educate the people who work in them. He and the other panelists agreed that an international governmental organization wouldn’t work, though, unless core Internet values are embraced and upheld. They said a set-up that would allow countries around the world to vote on how the Internet is governed would not be a favorable solution.
“Until we get it right,” Ryan said, “I think we’re muddling along rather well.”
DNS issues and DNSSEC
Larson spoke specifically about the security of the Domain Name System because he views the DNS as an absolutely critical Internet resource. “If you don’t have the DNS, you don’t have the Internet,” he noted. He said users can’t truly trust the DNS, though, which is a bit disconcerting because of its necessity.
He supports DNSSEC—Domain Name System Security Extensions—which give users digital signatures (origin authentication) and data integrity. “Once you have that, you can validate data and have a higher level of confidence that the data you’re getting back is valid,” Larson said.
(You can read more about DNSSEC here: http://en.wikipedia.org/wiki/Dnssec.)
He also said that DNSSEC makes the DNS more trustworthy and more critical to users as more applications—not just host name lookups—come to depend on it. “We’re going to look back and realize it enabled a whole new class of applications to put information in the DNS,” Larson said. “Now you can trust the information coming out of the DNS.”
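The validate-before-trust pattern Larson describes can be sketched in a few lines of Python. This is a hypothetical illustration only: real DNSSEC signs DNS record sets with public-key RRSIG records chained to a trust anchor at the root, not with a shared-secret HMAC, but the resolver-side logic—verify the signature over the answer before believing it—is the same idea.

```python
import hashlib
import hmac

# Stand-in for a zone's signing key (real DNSSEC uses DNSKEY key pairs).
ZONE_KEY = b"example-zone-signing-key"

def sign_record(name: str, rdata: str) -> str:
    """Produce a signature over a DNS-style record (stand-in for an RRSIG)."""
    msg = f"{name}|{rdata}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def validate_record(name: str, rdata: str, signature: str) -> bool:
    """Check origin authentication and data integrity before trusting data."""
    return hmac.compare_digest(sign_record(name, rdata), signature)

# The authoritative side signs the answer...
sig = sign_record("www.example.com", "192.0.2.1")

# ...and the resolver validates it before trusting the response.
assert validate_record("www.example.com", "192.0.2.1", sig)

# A tampered answer (e.g., cache poisoning) fails validation.
assert not validate_record("www.example.com", "203.0.113.9", sig)
```

Without this check, a resolver has no way to tell a forged response from a legitimate one—which is exactly the gap in the unsigned DNS that Larson points to.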
Going from IPv4 to a combination with IPv6
Ryan emphasized the importance of Internet Protocol version 6 (IPv6), the next-generation Internet layer protocol, which will allow a “gazillion numbers,” vastly expanding the address space online as the pool of unallocated IPv4 addresses rapidly shrinks. Ryan said the increased flexibility of IPv6 will allow for the continued growth of the Internet, but it won’t be a free-for-all.
“The numbers we have issued are not property,” he said. “We have a legal theory that’s embodied in every contract we’ve issued. They belong to the community. If you’re not using them, you have to give them back. They are in essence an intangible, non-property interest, so over the next couple of years there will be some very interesting legal issues.”
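Ryan’s “gazillion numbers” can be made concrete: IPv4 uses 32-bit addresses while IPv6 uses 128-bit addresses. Python’s standard `ipaddress` module can compare the two address spaces directly:

```python
import ipaddress

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses
ipv6_total = ipaddress.ip_network("::/0").num_addresses

print(f"IPv4 addresses: {ipv4_total:,}")    # about 4.3 billion
print(f"IPv6 addresses: {ipv6_total:.2e}")  # about 3.4 x 10^38

# IPv6 multiplies the address space by 2^96.
print(f"Expansion factor: 2^{(ipv6_total // ipv4_total).bit_length() - 1}")
```

The roughly 4.3 billion IPv4 addresses are nearly exhausted; the IPv6 space is so large that exhaustion is not a practical concern, which is why the transition matters despite the two protocols not being interoperable.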
ICANN in action
Jones said ICANN, which recently passed its 10-year milestone, has continued to work collaboratively with the community to take on major initiatives, such as the introduction of internationalized domain names in the root.
“We have taken requests from countries for internationalized country codes and approved 15,” Jones said.
“There’s a huge development in those regions of the world where you can now have domain names and an Internet that reflects their own languages and scripts. That will have an impact as discussion around critical Internet resources continues, especially in the IGF space.”
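The internationalized domain names Jones describes travel through the DNS as ASCII-compatible “Punycode” labels with an `xn--` prefix. Python’s built-in `idna` codec (which implements the older IDNA 2003 rules; modern registries follow IDNA 2008) illustrates the encoding with a hypothetical label:

```python
# A Unicode label is converted to an ASCII Punycode form for the DNS...
label = "münchen"
ascii_form = label.encode("idna")
print(ascii_form)  # b'xn--mnchen-3ya'
assert ascii_form.decode("ascii").startswith("xn--")

# ...and decoding recovers the original Unicode label for display.
assert ascii_form.decode("idna") == "münchen"
```

This mapping is what lets the DNS root carry country-code domains in non-Latin scripts without changing the underlying ASCII-based protocol.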
Physical critical resources
Brueggeman said AT&T has a broader perspective on critical Internet resources because the company is responsible for carrying Web traffic and for the underlying infrastructure, not just for issues tied to the DNS. He said the transition to IPv6 is daunting because it’s not backward-compatible, and his main challenge has been in outreach efforts to customers.
“We have to deal with a lot of traffic that’s generated as we’re making changes to DNSSEC and IPv6,” he said. “In some cases, you might create some new security concerns, but both are essential transitions.”
Brueggeman emphasized that multistakeholder discussions will be important in the coming years.
“We really need all of the parties who have the direct stake at the table to be part of the solution,” he said. “We need to have the resources handled in a way that promotes openness and promotes interoperability. There’s a huge policy risk of not managing these resources in a multistakeholder way.”
-by Colin Donohue, http://imaginingtheinternet.org
IGF-USA Scenario Discussion: Internet Islands – The Rise of Digital Fortresses and the End of the Digital Republic
IGF participants broke into three different rooms to discuss three possible future scenarios for the Internet in 2020. In this session, the brief description given to the discussants was: By 2020 the Internet as we know it in 2010 is no more. Concerns over national security and cybercrime led to calls for “safe zones” on the Net. Governments taxed e-commerce as a way to address budget deficits, and trade barriers were constructed, closing off markets for goods and information. Mega-companies constructed their own walls to keep criminals out and customers in. At the same time, the digital divide grew quickly as poorer nations and smaller companies could not afford to keep up with new security requirements and the entry fees needed to access the secure parts of the Web. Large parts of the world have found themselves “outside the wall” and left to fend for themselves, facing a combination of rapacious criminals, radical groups and bottom-feeding enterprises. For those on an Internet Island, life goes on, but in a more limited way than before.
Details of the session:
A small group of telecommunications leaders and advocates of human rights and privacy met to discuss the Internet Islands potential-future scenario at the Internet Governance Forum-USA 2010 at Georgetown University Law Center. They were led by Garland McCoy, founder of the Technology Policy Institute, Andrew Mack, founder and principal of AMGlobal Consulting, and Iren Borissova, senior manager for international public policy at VeriSign.
This scenario sets up a closed-off future for the Internet. Metaphorical islands have crept in, developed by businesses and governments to limit the flow of outside information while keeping users on the islands secure. You can read the one-page PDF used to launch this discussion here: http://api.ning.com:80/files/OVKwetXFSDRrq4nfkx0duSjNpXJLGlyyKV0S4i2A1FVDA4WwNCN3fHRTtQr5eq7L286HdzHWVJjsf0uynsER71dCuDBn4G8M/InternetIslands.pdf
Scenario facilitators McCoy, Mack and Borissova and other discussants described the Internet of 2010 as a mainland with some islands, with more continuing to bubble to the surface. They proposed that multistakeholder conversations are the way to avoid a more fragmented future and prevent future islands from cutting off the rest of the digital world.
“One of the major antidotes we could take to fight against it is having multistakeholder dialogues like those that we are engaged in now,” said Leslie Martinkovics, director of international public policy and regulatory affairs for Verizon.
The group imagined four island types: totalitarian, cultural, liberal and corporate. Totalitarian islands are created by governments that limit access and regulate what users view; in some cases, government officials require users to identify themselves so that what they view can be monitored.
The liberal islands are well-intentioned, but the countries or groups on them set up virtual trade barriers to raise revenue. Some participants likened this to the fees on rental cars at airports, where visitors are taxed instead of the voters.
A corporate island is one where companies provide a safe haven for their customers while providing additional security measures to prevent criminal breaches. And the cultural islands are created by countries and groups who wish to preserve their culture. The French mandate to resist the incursion of other cultures and focus on local content was used as an example of a cultural island.
But are these really islands, McCoy asked, or are they peninsulas with chokepoints controlling access to the mainland’s information? And Courtney Radsch, senior program officer at Freedom House working on the Global Freedom of Expression Campaign and the Southeast Asia Human Rights Defender Initiative, reminded the group that increased access does not always mean increased information.
The scenario participants agreed that international groups like the IGF must continue to meet and bring experts and interested individuals together to discuss the future of the Internet to prevent these islands from continuing to surface.
-Anna Johnson, http://www.imaginingtheinternet.org