Posts Tagged ‘icann’
Brief session description:
Thursday, July 26, 2012 – The US House of Representatives has passed four cybersecurity bills, and the US Senate has indicated an intent to consider cybersecurity legislation in the current session. The US Department of State is working with its global partners to develop relationships, collaborative action and norms of behavior for cyberspace. The US Department of Commerce has spearheaded a government initiative on botnets and is working with industry on botnet mitigation measures. The Department of Homeland Security is increasing its cybersecurity staffing for strategic and operational concerns. And the White House is transitioning its team on cybersecurity policy with a second cybersecurity adviser to the president. The Stuxnet and Flame attacks have captured international attention. Cybersecurity remains a key theme in discussions in the United Nations, the International Telecommunication Union, the Organization for Economic Cooperation and Development, the Asia-Pacific Economic Cooperation, ICANN and the annual Global Internet Governance Forum. This workshop addressed questions such as: What are businesses, countries, and the technical community doing in this heightened era of cybersecurity concern? What should they be doing? What are the considerations for individual users here in the U.S. and around the world? How can all these pockets of activity help protect – and not hamper the protection of – the very medium that provides for productivity, communications, efficiencies, innovation, and expression?
Details of the session:
The session was moderated by Audrey Plonk, global security and Internet policy specialist at Intel Corporation. Panelists were:
- Tom Dukes, senior advisor, Office of the Coordinator for Cyber Issues, US Department of State
- Jeff Greene, senior policy counsel, Cyber Security and Identity, Symantec
- Kendall Burman, senior national security fellow, Center for Democracy and Technology
- Patrick Jones, senior director of security, ICANN
Panelists from the government and private sectors gathered at IGF-USA’s cybersecurity workshop to discuss how these entities are collaborating to deal with domestic cybersecurity threats and international cybersecurity issues.
This issue is especially pertinent right now. There have been a number of high-level conferences and meetings in Washington and other locales over the summer of 2012 on this topic, and, as moderator Audrey Plonk, global security and Internet policy specialist for the Intel Corporation, put it, “Cybersecurity is the new black.”
Jeff Greene, panelist and senior policy counsel of cybersecurity and identity at Symantec, agreed. “At this time three years ago, cybersecurity was something that was mentioned in passing,” he commented. “Now the interest is exponential.”
Symantec’s business is centered on protecting enterprises from cyberthreats. Greene, who until recently worked with the Department of Homeland Security, said that according to this year’s Symantec Internet Security Threat Report, 75 percent of the enterprises Symantec deals with were threatened with a cyber attack in 2011.
He added that while the incidence of spam decreased in 2011, there has been a shift to web-based attacks. Greene also said the government and private sector are working together to reduce such threats.
“It is remarkable how much of the threat dynamic in both sectors is the same,” Greene said. “We see criminal and other malicious activity largely the same as the government does, so this is all work through government, private and international cooperation.”
Panelist Kendall Burman had a different view on government access to private sector and citizen information in terms of cybersecurity. As a senior national security fellow for the Center for Democracy and Technology, she has spent time exploring security and surveillance from the perspective of a member of a group focused on consumer privacy.
“I think that the tricky area from a civil liberties perspective is when the government is in a position of receiving that information, making sure that that information is limited to cybersecurity threats, and what the government can then do once it receives it,” Burman said.
Panelist Tom Dukes, senior adviser for the Office of the Coordinator for Cyber Issues at the US Department of State, weighed in from a government standpoint on cybersecurity issues, including the important role of the US government in pushing other countries to increase their outreach and share their perspectives on cybersecurity issues.
“Obviously what the US says, the positions we take, are highly influential and they are certainly looked at by a great many other countries,” Dukes said.
“One thing that the US has been trying to do for the last couple years in terms of addressing cyberpolicy issues in general, cybersecurity included, is to try to take sort of a leadership role in helping shape the world debate on how we think about these issues.”
Dukes said that the US has also made progress in leading a global discussion on reaching a consensus about cybersecurity norms. Greene said that while the U.S. would like to set its own cybersecurity policies, this could cause global problems.
“If everyone has a different set of rules, (global policymaking)’s going to be pretty difficult,” Greene said.
Panelist Patrick Jones, senior director of security for ICANN, shared his view that while US policymaking is important in terms of cybersecurity, politicians should be aware of the effects that any laws they make may have globally.
“It’s helpful for policymakers, when they’re coming up with legislation, that they think of the Internet as global and consider that the decisions they make may have technical impacts that they’re not considering that impact the way people are using the Internet today – give those a thorough understanding before decisions are made about a particular legislation,” Jones said.
One of the final points of discussion during the workshop was the differences between cybersecurity and information security.
In the discussion it was noted that cybersecurity, in the US view on Internet governance, deals primarily with protection from Internet threats. Information security, in the Russian and Chinese view, also includes censoring the civic sector and content from many Western media and knowledge organizations.
Dukes said there are two considerations regarding openness and freedom of information that convince most leaders in the world to find common ground with the fairly liberal US position on cybersecurity issues.
First is the basic human rights aspect of the argument; many countries accept that people should, whenever possible within the bounds of public safety, have certain rights of free speech, communication and assembly. Most countries agree that this should apply online.
Dukes’ second point is the economic benefit of keeping the Internet as open and free-flowing as possible. “Many evolving world countries are really desperate to find ways that they can harness the power of the Internet to increase economic opportunity, to increase GDP, to increase development and growth,” he said. “Those arguments seem to be very pragmatic, but it’s hard for countries to disagree with that.”
— Mary Kate Brogan
IGF-USA 2012: Critical Internet Resources (CIRs) – Evolution of the Internet’s Technical Foundations
Brief session description:
Thursday, July 26, 2012 – Since the initiation of the Internet Governance Forum (IGF), Critical Internet Resources (CIR) and the evolution of the Internet’s technical foundations have been a central focus of ongoing Internet governance debates. Varied views can engender misunderstandings that influence the opinions of global stakeholders, and different views exist about how to advance CIRs. International governmental approaches are proposed by some, while others strongly support the present bottom-up, consensus-driven models. Three foundational technological changes – IPv6, secure Domain Name System (DNSsec) and secure routing – framed the discussion in this workshop. Deployment of these new technical and organizational approaches raises significant challenges to stakeholders, operations and governance arrangements.
Details of the session:
The moderator for the session was Walda Roseman, chief operating officer of the Internet Society. Panelists included:
- Steve Crocker, chair of the board of the Internet Corporation for Assigned Names and Numbers
- John Curran, president and CEO of the American Registry of Internet Numbers
- Richard Jimmerson, director for deployment and operationalization, Internet Society
- Vernita Harris, deputy associate administrator in the Office of International Affairs of NTIA, US Department of Commerce
Thursday’s IGF-USA conference at Georgetown Law Center featured an assembled panel of government and corporate experts who addressed the controversial issues concerning the control of critical Internet resources.
Walda Roseman, chief operating officer of the Internet Society (ISOC), chaired the discussion on the implementation and security of CIRs.
CIRs include IP addresses, domain names, routing tables and telecommunications, or what Steve Crocker, CEO and co-founder of Shinkuro Inc., Internet Hall of Fame member and chair of the board of ICANN, called the base of Internet architecture upon which everything else is built.
Moving from Internet Protocol Version 4 to IPv6
One of the most pressing concerns regarding CIRs is the transition of the Internet Protocol (commonly referred to as IP) from version 4, still the dominant protocol for Internet traffic, to version 6.
IPv4 uses 32-bit addresses, allowing for approximately 4.3 billion unique IP addresses, but the growth of the Internet has exceeded those limits. IPv6 uses 128-bit addresses, allowing for about 3.4×10^38 unique addresses. This number is equal to approximately 4.8×10^28 addresses for each of the seven billion people alive in 2012.
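The address-space arithmetic above is easy to verify; here is a minimal sketch (the seven-billion population figure is the article’s 2012 estimate):

```python
# Address capacity of 32-bit IPv4 vs. 128-bit IPv6 addressing.
ipv4_total = 2 ** 32    # 4,294,967,296 addresses (~4.3 billion)
ipv6_total = 2 ** 128   # ~3.4 x 10^38 addresses

# Rough per-person share of the IPv6 space, using the article's
# 2012 world-population estimate of seven billion people.
per_person = ipv6_total / 7_000_000_000

print(f"IPv4 total:      {ipv4_total:,}")
print(f"IPv6 total:      {ipv6_total:.2e}")
print(f"IPv6 per person: {per_person:.2e}")
```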
Because headers on IPv4 packets and IPv6 packets are quite different, the two protocols are not interoperable; thus they are both being run in parallel in what is called a “dual stack.”
However, IPv6 is, in general, seen to be a conservative extension of IPv4. Most transport and application-layer protocols need little or no change to operate over IPv6. The exceptions to this are the application protocols that embed internet-layer addresses, such as FTP and NTPv3. In these, the new address format may cause conflicts with existing protocol syntax.
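One common transition convention can be illustrated with Python’s standard-library `ipaddress` module: an IPv4 address can be represented inside the IPv6 space as an “IPv4-mapped” address (`::ffff:a.b.c.d`), even though the two protocols remain separate on the wire. A minimal sketch (the addresses are documentation examples, not real hosts):

```python
import ipaddress

# A dual-stack host holds a separate address in each protocol;
# the two formats are not interchangeable on the wire.
v4 = ipaddress.ip_address("192.0.2.10")    # 32-bit IPv4
v6 = ipaddress.ip_address("2001:db8::10")  # 128-bit IPv6
print(v4.version, v6.version)  # 4 6

# IPv4-mapped IPv6 addresses embed an IPv4 address in the IPv6
# format, a convention used by dual-stack socket APIs.
mapped = ipaddress.IPv6Address("::ffff:192.0.2.10")
print(mapped.ipv4_mapped)  # 192.0.2.10
```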
Internet service providers, the Internet Society and many large Internet-based enterprises worked to support a World IPv6 Launch on June 6 this year to help accelerate the adoption of IPv6.
John Curran, president and CEO of the American Registry for Internet Numbers, said upgrading to IPv6 is a necessary step for “any enterprise that wants to still be in business in five years,” because it enables them to continue to reach new customers and grow.
When asked about the costs or burdens of upgrading to IPv6 for small businesses, Curran explained that in most cases the burden would fall on the hosting company through which they run their website.
Chris Griffiths, director of high-speed Internet and new business engineering for Comcast, confirmed this, stating his company would have to upgrade to continue to attract new clients.
Security issues always loom large in Internet evolution
The development of the Internet has led to a need for Domain Name System Security, or DNSSEC. Curran explained that DNSSEC maintains the integrity of the Internet by ensuring the information users obtain is from the source they believe they are corresponding with, essentially preventing redirection to fraudulent websites.
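The verification idea Curran described can be sketched in miniature. The toy below uses a shared-secret HMAC purely as a stand-in signature; real DNSSEC instead uses public-key signatures (RRSIG/DNSKEY records) chained to the root of the DNS, and the key material here is invented for illustration:

```python
import hashlib
import hmac

ZONE_KEY = b"example-zone-signing-key"  # hypothetical key material

def sign_record(name: str, rdata: str) -> str:
    """Sign a DNS A record (toy stand-in for an RRSIG)."""
    message = f"{name} A {rdata}".encode()
    return hmac.new(ZONE_KEY, message, hashlib.sha256).hexdigest()

def verify_record(name: str, rdata: str, signature: str) -> bool:
    """Accept an answer only if its signature checks out."""
    return hmac.compare_digest(sign_record(name, rdata), signature)

sig = sign_record("www.example.com", "192.0.2.1")
print(verify_record("www.example.com", "192.0.2.1", sig))    # genuine answer
print(verify_record("www.example.com", "198.51.100.9", sig)) # tampered answer
```

A resolver that checks signatures this way would reject a redirected answer even if the unsigned DNS response had been altered in transit.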
Redirection could come from hackers, hijackers and phishers, but also the US government, should initiatives such as SOPA or PIPA pass.
“My primary interest is keeping the open Internet alive,” said Richard Jimmerson, director of deployment and operationalization for ISOC. “Somebody in this room will want to invent the next Facebook or Yahoo! Today, that is possible, but if we do not pay attention to certain things, that may not be possible anymore.”
Griffiths said Comcast and other Internet technology companies work together through governance processes now in place to address security vulnerabilities before they become future risks, to make adjustments in infrastructure and to deal with other emerging challenges.
Conflicts arise over the management of CIRs
The US government currently maintains the most control globally over CIRs. This is not well received by some critics around the world, who fear that the United States may abuse its power. Some have also said they would like to see a roadmap of the Internet for the next 20 years.
Curran addressed these concerns by stating that the US government has a positive track record regarding the respectful and neutral administration of its responsibility for CIRs, leaving most of the operational details to multistakeholder global governance bodies such as the Internet Engineering Task Force and ICANN. He added that a roadmap would not likely be effective, as there are too many unknowns moving forward.
Vernita Harris, deputy associate administrator of the National Telecommunications and Information Administration, explained that the newest Internet Assigned Numbers Authority (IANA) contract indicates it expects that ICANN and aspects of control over the Internet architecture “will be multi-stakeholder driven, addressing the concerns of all users both domestic and international.”
— Brennan McGovern
IGF-USA 2012 Case Vignettes: Turning Principles into Practice – Or Not: Internet Governance/ICANN; Consumer Privacy; Cyber Security; Dialogues about Lessons Learned
Brief session description:
Thursday, July 26, 2012 – This workshop was aimed at examining the role principles are playing in framing debates, achieving consensus and influencing change – or not. Proposals for Internet principles are popping up everywhere, from national to regional and global discussions, on a wide range of issues. In 2011, IGF-USA examined a number of principles in a session titled “A Plethora of Principles.” This session follows on that one. Session planners noted that it’s not enough to simply develop a set of principles; the question is: how are principles actually implemented, and how are they inspiring change? Are they new voluntary codes of conduct, new regulations, new laws? Principles can become a baseline for gaining high-level agreements. They may go beyond the expectations possible through legislation or regulation, so some argue that principles should be written to be aspirational. Some argue for legislation, regulation or enforcement mechanisms to ‘hold industry accountable’ to promises made in principles designed as sets of commitments. This workshop examined three case vignettes: 1) how the principles of a white paper were incorporated into ICANN’s formation and what the status of those principles is today within ICANN’s mission and core activities; 2) how consumer privacy principles have fared in global and national settings in terms of ‘turning into practice’; and 3) how cybersecurity/botnet principles are faring.
Details of the session:
The moderator for this session was Shane Tews, vice president for global public policy and government relations at Verisign. Panelists included:
- Becky Burr, chief privacy officer, Neustar Inc.: Turning White Paper Principles into actuality in ICANN
- Maneesha Mithal, associate director of the division of privacy and identity protection, Federal Trade Commission: Consumer privacy principles
- Eric Burger, director of the Georgetown University Center for Secure Communications: Cybersecurity and botnets
- Carl Kalapesi, co-author of the World Economic Forum’s report Rethinking Personal Data: Strengthening Trust: the World Economic Forum perspective
Before an informal agreement, policy or formal regulation is adopted, passed or approved, it takes its initial steps as an idea. The trick lies in bringing it from a formative state to something actionable; otherwise it may languish as a suggested goal, followed and adhered to by no one.
During the IGF-USA panel titled “Turning Principles into Practice – or Not” participants shared successful case studies as examples of how to create actionable practices out of ethereal goals. Citing processes ranging from US efforts to counteract botnets to domain name system governance and to consumer privacy, three panelists and one respondent drew from their own experiences in discussing ways in which people might successfully bridge the gap between idea and action.
Maneesha Mithal, associate director of the Federal Trade Commission’s Division of Privacy and Identity Protection, weighed in on the efficacy of principles versus regulation by describing a range of methods for acting on a problem.
“It’s not really a binary thing – I think there’s a sliding scale here in how you implement principles and regulation,” she said. She cited corporate self-regulatory codes, the work of international standard-setting bodies, multistakeholder processes, safe harbors and legislation as possible means for action.
Mithal highlighted online privacy policies as an example of the need for a sliding scale. The status quo has been to adhere to the concepts of notice and choice on the part of consumers; this has resulted in corporations’ creation of lengthy, complicated privacy policies that go unread by the consumers they are meant to inform. Recently, pressure has been placed on companies to provide more transparent, effective means of informing customers about privacy policies.
“If it had been in a legislative context, it would have been difficult for us to amend laws,” Mithal said, though she admitted that such flexible agreements are “sometimes not enough when you talk about having rights that are enforceable.”
And Mithal did note that, given the current climate surrounding the discussion of online privacy, it is time for a degree of broad-based privacy legislation in America.
Eric Burger, a professor of computer science at Georgetown University, spoke on the topic of botnets, those dangerous cyber networks that secretly invade and wrest control of computers from consumers, leaving them subservient to the whims of hackers looking for a challenge, or criminals looking for the power to distribute sizable amounts of malware.
Given the sheer number of stakeholders – ISPs concerned about the drain on their profits and the liability problems posed by the strain of illegal information shared by the botnets, individual users concerned over whether their computers have been compromised, and government agencies searching for a solution – Burger said that the swift adoption of principles is the ideal response.
Among those principles are sharing responsibility for the response to botnets, admitting that it’s a global problem, reporting and sharing lessons learned from deployed countermeasures, educating users on the problem and the preservation of flexibility to ensure innovation. But Burger did admit the process of arriving at this set of principles wasn’t without its faults. “Very few of the users were involved in this,” he said, citing “heavy government and industry involvement, but very little on the user side,” creating a need to look back in a year or two to examine whether the principles had been met and whether they had been effective in responding to the swarm of botnets.
Becky Burr, chief privacy officer and deputy general counsel at Neustar, previously served as the director of the Office of International Affairs at the National Telecommunications and Information Administration (NTIA), where she had a hands-on role in the US recognition of ICANN. She gave a play-by-play of the lengthy series of efforts to turn ICANN from a series of proposed responses into a legitimate governing entity, an effort largely aided by a single paragraph in a framework issued by President Bill Clinton’s administration in 1997.
Written as a response to the growing need for the establishment of groundwork on Internet commerce and domain names, the paper called for a global, competitive, market-based system for registering domain names, which would encourage Internet governance to move from the bottom-up. The next day, the NTIA issued the so-called “Green Paper” which echoed many of the principles of the administration’s framework and drew extensive feedback from around the world, including negative feedback over the suggestion that the US government add up to five gTLDs during the transitional period.
After reflection on the feedback to both the white and green papers, and a series of workshops among multiple stakeholders to flesh out the principles of stability, competition, private-sector leadership, bottom-up governance and realistic representation of the affected communities, ICANN held its first public meeting on Nov. 14, 1998. It underwent several reforms in 2002 and, in Burr’s words, “is still the best idea, or at least no one’s figured out a better idea.”
“The bottom line is to iterate, make sure you articulate your principles and try to find some built-in self-correcting model,” Burr said.
While Burr’s play-by-play described how a relatively independent, formal institution was formed to offer DNS governance, Carl Kalapesi, a project manager at the World Economic Forum, offered a more informal approach, relying on the informal obligations tied to agreeing with principles to enforce adherence.
“Legislative approaches by their nature take a very, very long time,” Kalapesi said. He vigorously supported the importance of principles in offering “a common vision of where we want to get to,” which leaders can sign onto in order to get the ball rolling.
He offered the example of the “Principles of Cyber Resilience,” offered to CEOs at last year’s World Economic Forum with the goal of making them more accountable for the protection of their own networks and sites while still allowing them flexibility to combat problems in a way that best suited their own work-flow and supply chains.
Central to Kalapesi’s argument in favor of principle-based solutions is their flexibility.
“Half of the uses of data didn’t exist when the data was collected – we didn’t know what they were going to do with it,” he said, alluding to the concerns over the use of private data by the likes of Google and Facebook, which accelerate and evolve at a rate with which formal legislation could never keep up.
Burr later echoed this point in theorizing that 1998’s Child Online Protection Act might soon be obsolete, but Mithal remained firm that a “government backstop” should be in place to ensure that there is something other than the vague notion of “market forces” to respond to companies that step back from their agreements.
— Morgan Little
IGF-USA 2012 Workshop: The Changing Landscape of the Domain Name System – New Generic Top Level Domains (gTLDs) and Their Implications for Users
Brief session description:
Thursday, July 26, 2012 – Early in 2012, ICANN launched the process to introduce vast numbers of new generic top-level domains (gTLDs) – allowing, for the first time, the customization of Internet addresses to the right of the dot. Few people understand that there are already 22 existing gTLDs and 242 country code TLDs, with a total of 233 million registered second-level names across all TLDs. In the coming years, these existing TLDs will be joined by numerous new gTLDs, likely resulting in the registration of millions of new second-level domains. Some will use scripts that are unfamiliar to English speakers or readers. How exactly these new gTLDs will impact the world of users and registrants is yet to be determined. Will they add significant new registration space, cause confusion, provide some unique innovations, or, most likely, all of the above to some degree? ICANN received a wide range of applications – including brand names, generic terms, and geographic and regional terms. The workshop was organized to discuss issues and questions including: changes to how domain name registrants and users may organize and search for information online; how defensive registrations may impact existing registrants; whether ICANN gave a sufficient focus to Internationalized Domain Names; how applications from potential registries in developing countries are supported; whether the fraud and abuse that exist in the current gTLD space will migrate easily into the new ‘spaces’ or even be compounded; and how conflicts between applicants from the noncommercial sector will impact the users of the Internet.
Details of the session:
The session was moderated by Ron Andruff, president and CEO of DotSport, LLC. Panelists included:
- Laura Covington, associate general counsel for global brand and trademarks, Yahoo!
- Bobby Flaim, supervisory special agent with the Federal Bureau of Investigation
- Suzanne Radell, senior policy adviser, NTIA, and US Government Advisory Council representative at ICANN
- Elisa Cooper, director of product marketing, MarkMonitor (remote participant)
- Alan Drewsen, executive director of the International Trademark Association
- Andrew Mack, principal and founder of AMGlobal Consulting
- Krista Papac, chief strategy officer for ARI Registry Services
Respondents were Dan Jaffe, executive vice president for government relations of the Association of National Advertisers, and Jeff Neuman, vice president for business affairs of Neustar and Generic Names Supporting Organization councilor at ICANN.
There is a mix of concern and optimism about how the new generic top-level domains (gTLDs) will change the landscape of the Internet, but it is certain that a new era of the Internet is coming.
A diverse panel at IGF-USA Thursday at Georgetown Law Center, offering perspectives ranging from brand owners to trademark security experts, agreed on one thing: The introduction of new gTLDs will open the Internet up to more users, but also to more bad actors and cyber squatters. The panel agreed that the gTLD program will result in a tremendous amount of change, but how it will affect the landscape, and whether that change is good, sparked the most discussion.
This year, there are 2.3 billion users of the Internet and 555 million websites. The numbers are staggering, considering the Internet is only about 14 years old, said moderator Ron Andruff, president and CEO of RNA Partners Inc.
There are 22 existing gTLDs – including .com, .net, .org and .edu – and 242 country code TLDs.
Elisa Cooper, director of product marketing at MarkMonitor, joined the panel remotely to give an analysis and breakdown of new gTLD application statistics.
Of 1,930 applications for a new gTLD, 652 were .Brand applications. Cooper divided the applications into three categories: brand names, community-based and generic. Generic applications come in two flavors, closed and open – the latter makes registries available to the general public with few eligibility requirements. Cooper also revealed:
- There is a relatively low number of Internationalized Domain Names – only 116.
- Geographically, the majority of the applications have come from North America and Europe.
- Of the .Brand applications – which go through the standard application process – the technology, media and financial sectors led the way.
- The most highly contested strings were .APP, .INC, .HOME and .ART.
- The top three applicants were Donuts, Google and Amazon.
Laura Covington, who serves as chief trademark and brand counsel for Yahoo!, joined the panel from a .brand applicant company and offered a brand owner perspective. Yahoo! applied for .yahoo and .flickr.
“I think there are a lot of exciting opportunities from a marketing perspective, even from a security perspective with the new gTLDs and the new .brands in particular,” Covington said. “And I also think that it’s going to have to change the face of how trademark owners, brand owners deal with their enforcement issues, how they approach protecting their marks going forward.”
Yahoo! is viewing the new gTLDs as an amazing new world and new way to reach customers, though Covington admits uncertainty toward what search engines will do once gTLDs are added to the mix of search algorithms. As a brand owner, she has concerns with how to deal with the second-level names because there will be an exponential increase in opportunity for cyber squatters.
Bobby Flaim, FBI special agent, is primarily concerned with the pre-existing problems with domestic and international law enforcement of the Internet and how the problems may worsen as bad actors become more prevalent.
The existing system has some major problems with cyber squatting, said Jaffe, group executive vice president of ANA. He said he didn’t want to be the panel’s doomsayer, but he added that no one should assume the new gTLD program will roll out in a smooth or timely manner.
One hugely positive impact of the new gTLDs Covington sees is an influx of new voices and new participants in the multistakeholder process.
Krista Papac, general manager of ARI Registry Services, agreed.
“I do have faith in the multistakeholder model and hope that we continue to find our way through it and deal with the different issues,” Papac said.
Papac is running some of the registries for the new gTLDs and sees a lot of opportunity to create more secure environments and more opportunities from brands.
Suzanne Radell, senior policy adviser in the Office of International Affairs at NTIA and US GAC Representative, said that more people and more interest in the program will be crucial to ICANN’s evolution.
“We’ve got our fingers crossed that the benefits to consumers, to users are not outweighed by risks and costs,” Radell said. “So we’re looking very much forward to a review of the new gTLD program.”
Alan Drewsen, executive director of INTA, said he expects that the introduction of the new gTLDs will go more slowly and be less successful than hoped.
“ICANN will continue to exist, though I think it’s done everything possible to put its life in jeopardy,” Drewsen said, making the audience and panel laugh.
INTA has been critical of the process that ICANN has led over the last several years in introducing the new gTLDs.
“Given the amount of time and money that the members have invested in this process and the potential consequences that can flow from its failure, INTA will continue to work collaboratively with a lot of these constituencies to get the best possible results,” Drewsen said.
Andrew Mack, principal of AMGlobal Consulting, sees a large concentration of applications in the global North and the English-speaking world. People in the global South won’t be able to participate in a program they don’t know exists. Seventeen gTLD applications are better than none, he said, but the number of applicants from other parts of the globe amounts to a paltry total compared with highly connected regions already experiencing huge economic shifts due to the Internet. Mack said his pessimism is rooted in the fact that Africa and Asia are missing out when they could really benefit.
“And we want them to be part of our Internet,” Mack said.
There is an influx of new participants from existing participants, Neuman of Neustar noted.
The new gTLDs open up a lot of opportunities for business and marketing folks, but each person on the panel defined success in different ways.
“It’s definitely going to be an exciting time,” said Brian Winterfeldt, a partner with Steptoe & Johnson LLP. “I think we really are moving into sort of a new era of the Internet with this expansion and I think it’s going to be very exciting to see how it evolves.”
— Ashley Barnas
Internet Governance Forum-USA, 2011: New Challenges to Critical Internet Resources – Blocking and Tackling: New Risks and Solutions
Brief session description:
The security, stability and resiliency of the Internet are recognized as vital to the continued successful growth of the Internet as a platform for worldwide communication, commerce and innovation. This panel focused on Domain Name System (DNS) blocking and filtering, the implementation of Internet Protocol version 6 (IPv6) and critical Internet resources in general. The panel addressed some of the implications of blocking and filtering; the implementation of DNSSEC and IPv6 on a national basis; and future challenges to the Internet in a mobile age.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to new challenges, according to panelists in a session on critical Internet resources at the IGF-USA conference July 18 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. A major challenge now and in years to come is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
This workshop was conducted in two parts, both moderated by Sally Wentworth, senior manager of public policy for the Internet Society. Panelists included:
- John Curran, president and CEO of the American Registry for Internet Numbers (ARIN)
- Steve Crocker, CEO and co-founder of Shinkuro, and vice chair of the board for ICANN, the Internet Corporation for Assigned Names and Numbers
- George Ou, expert analyst and blogger for High Tech Forum http://www.hightechforum.org/
- Rex Bullinger, senior director for broadband technology for the National Cable and Telecommunications Association (NCTA)
- Paul Brigner, chief technology policy officer, Motion Picture Association of America (MPAA)
- Bobby Flaim, special agent in the Technical Liaison Unit of the Operational Technology Division of the FBI
- David Sohn, senior policy counsel for the Center for Democracy and Technology (CDT)
- Don Blumenthal, senior policy adviser for the Public Interest Registry (PIR)
- Jim Galvin, director of strategic relationships and technical standards for Afilias
Wentworth began the session by referring to it by an alternate title, “Blocking and Tackling.” The alternate title proved appropriate, as the two sets of panelists became more and more passionate in their assertions during the discussions on two topics that affect the future health of the Internet: the implementations of IP version 6, or IPv6, and especially Domain Name System (DNS) blocking and filtering.
The first panel consisted of John Curran, Steve Crocker, Rex Bullinger, Jim Galvin and Bobby Flaim. It centered on a discussion of IPv6 and the difficulty in implementing a system that is viewed as the “broccoli of the Internet” – something that is technologically necessary, but for consumers is less of a priority because a switch is not incentivized.
The technological necessity is an inevitability. IPv4 provides about 4.3 billion unique IP addresses. The central pool of addresses ran dry Feb. 3, 2011, when the last five blocks were handed out to the five regional address registries. Depending on the rate of growth, Curran explained, those addresses may not last very long; in fact, the Asia-Pacific region has already handed out all of its addresses. The world’s 7 billion people can’t fit into 4.3 billion addresses, especially when many people have more than one device, each needing its own address.
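Curran’s numbers can be checked with back-of-the-envelope arithmetic. The short Python sketch below (illustrative only; the population figure is the rough 2011 estimate the panel used) shows why IPv4 cannot cover everyone, while IPv6 leaves room to spare:

```python
# Address-space arithmetic behind the panel's point (illustrative only).
ipv4_total = 2 ** 32   # IPv4 uses 32-bit addresses: ~4.3 billion in total
ipv6_total = 2 ** 128  # IPv6 uses 128-bit addresses

world_population = 7_000_000_000  # rough 2011 world-population estimate

# IPv4 cannot give every person even a single address...
print(ipv4_total)                      # 4294967296
print(ipv4_total < world_population)   # True

# ...while IPv6 offers an astronomically large number per person.
print(ipv6_total // world_population)
```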
“We have to move, the hard part is getting them to move,” Curran said, referring to consumers.
The biggest problem is that IPv6 and IPv4 are two languages that don’t talk to each other.
“You have to enable IPv6 in order to speak to IPv6 Internet users. It’s literally possible for us to break the global Internet if we don’t transition to IPv6,” Curran said.
The dual-stack model was identified by the Internet Engineering Task Force (IETF) as the most effective and efficient way to begin bringing IPv6 into use alongside IPv4.
“What we have is a coexistence process rather than a transition,” Crocker explained. “We’re going to have these two existing in parallel. You’ve got pure IPv4 exchanges and conversations, pure IPv6 exchanges, then you’ll have somewhat more complex interchanges between IPv4 users and IPv6 systems, and those will require some sort of translation.”
“The ISPs,” Curran said, “are stuck in the trap as to whether there’s enough demand to make the switch.”
Currently, most devices, such as laptops, are IPv6 enabled and have been for some time, so they’re coexisting, rather than transitioning directly from IPv4, Curran said.
“Things are not going to be replaced overnight,” Galvin said. “The providers are ready, but the customers aren’t. The laptops are ready, but the interaction is not ready.”
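The incompatibility the panelists describe comes down to address families: an IPv4 address simply is not a valid IPv6 address, which is why the two protocols cannot interoperate without translation. A small sketch using Python’s standard-library `ipaddress` module (documentation-range addresses, purely for illustration) shows the two families, plus the IPv4-mapped form a dual-stack host can use to represent IPv4 peers inside an IPv6 socket:

```python
import ipaddress

# IPv4 and IPv6 are distinct address families with different widths,
# which is why the two protocols can't talk to each other directly.
v4 = ipaddress.ip_address("192.0.2.1")     # documentation-range IPv4 address
v6 = ipaddress.ip_address("2001:db8::1")   # documentation-range IPv6 address
print(v4.version, v6.version)              # 4 6

# On a dual-stack host, an IPv6 socket can still represent an IPv4 peer
# using the IPv4-mapped form ::ffff:a.b.c.d (RFC 4291).
mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped == v4)            # True
```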
There are other problematic implications of enacting the new IP version, particularly in tracking how IP addresses are allocated and to whom, and in logging network address translators (NATs), according to Flaim. Another element was raised by an audience member: the possible advantage held by countries with less infrastructure than the United States. Is the United States being left behind because its Internet is too big? Crocker contended it is a plausible scenario that some developing regions could leapfrog over IPv4 and go directly to IPv6.
DNS Blocking and Filtering
The second panel of the afternoon consisted of Crocker, George Ou, Paul Brigner, Galvin, David Sohn and Don Blumenthal and centered on a lively discussion about the merits of DNS blocking and filtering as an answer to copyright infringement.
The panel was divided on its views – some felt that DNS filtering and blocking represented the perfect answer to copyright infringement and media piracy, while others felt that a solution based on technical adjustment was the wrong way to approach what they deemed a human problem, not a technical problem. While a consensus was not achieved, the discussion addressed serious concerns about the damaging effects of illegal downloading on the content production industry, as well as the greater constitutional, technical and political implications of the use of DNS filtering and blocking.
Panelists referenced the Protect IP legislation currently before the U.S. Senate. The legislation is aimed at offshore websites that infringe copyright laws by hosting pirated media. One way the bill works is to undercut those sites by going after their advertising and funding sources.
The trouble, Crocker explained, is that the blockages or filters are “trivial to get around,” and users have every motivation to do so. Sohn agreed that DNS filtering is not beneficial, especially in this case, because it is easy to evade. Using DNS filtering to prevent users from accessing entire domains that may contain illegal content, rather than addressing the specific infringing content itself, is too broad, Sohn suggested.
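Sohn’s breadth objection and Crocker’s evasion point can both be sketched in a few lines. The hypothetical resolver below (the domain names are invented for illustration) blocks at the hostname level, so every page on a blocked host disappears, lawful or not, while the same content served under a new name sails through:

```python
# Hypothetical sketch of a DNS-level blocklist (domain names are invented).
BLOCKED_DOMAINS = {"pirate-example.net"}

def resolves(hostname: str) -> bool:
    """A resolver with a DNS-level blocklist either answers for a name or not;
    it cannot distinguish infringing pages from lawful ones on the same host."""
    return hostname not in BLOCKED_DOMAINS

# Every URL on the blocked host vanishes, infringing or not:
print(resolves("pirate-example.net"))   # False
print(resolves("example.org"))          # True

# And the block is trivially evaded by re-serving content under a new name:
print(resolves("pirate-example2.net"))  # True
```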
Galvin agreed: “While content providers have a legitimate need to protect their assets, there seems to be the automatic assumption that DNS filtering is the right way to do that. You’re concerned about content, so you want to do content filtering.”
Galvin cautioned the panel and the audience to be aware of consequential damages.
Sohn also raised concerns about prior restraint. “Under prior restraint law, you cannot restrict speech unless it’s actually proven illegal,” he said. “DNS filtering orders would often occur on a preliminary basis, as part of a restraining order, rather than after a full legal determination by a court.”
There was further discussion that, on a technical level, these tactics would be problematic because they would break DNSSEC and destabilize the Internet, especially since DNSSEC was designed to detect precisely the kind of DNS manipulation that filtering entails, Crocker maintained.
On the other side of the issue, Brigner explained that in using DNS filtering, criminal sites would be removed from the “global phonebook,” preventing individuals from accessing them and propagating the consumption of illegal media.
“We’re not asking for a new row of cannons,” he said in reference to Crocker’s earlier metaphor about the Vasa, a Swedish warship that sank because its poor design was based on the king’s desire for more firepower in spite of architectural faults. “Let’s use sound engineering principles.”
An audience member’s suggestion of an industry seal program was met with varying levels of support and dissent. Ou noted that an industry seal could be easily counterfeited, while others felt that a “human” solution rather than a technical one was the more appropriate answer to the problem.
In the end, the dialogue raised many new questions about the real world implications of IPv6 and DNS blocking technologies.
“It’s important that these various communities find a way to address the issues in a way that’s respectful of the technology, respectful of the global Internet and people’s need to carry out their business, and the freedom of expression,” Wentworth said in closing.
“There’s a lot of competing interests, but they don’t have to be mutually exclusive.”
– Bethany Swanson
This panel, moderated by Robert Guerra of Freedom House, focused on critical Internet resources and how to ensure that the underlying principles that have led to the Internet’s success persist in the face of security challenges. These principles include openness (open standards, open technologies), accessibility, transparency, bottom-up decision-making, cooperation and multi-stakeholder engagement. Key to implementing these principles is also a broadened understanding of the role of the infrastructure providers, such as global and national Internet services/connectivity providers who build and operate the backbones and edge networks. The panel was also expected to address some of the implications of the implementation of DNSSEC and IPv6 on a national basis that contribute to the security and resiliency of CIR on a global basis.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to increasing security challenges, according to panelists in a critical Internet resources workshop at the IGF-USA conference July 21 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. The major challenge of our generation is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
Panelists at this workshop included:
- Moderator Robert Guerra, Freedom House
- Trent Adams, outreach specialist for the Internet Society
- Matt Larson, vice president of DNS research for VeriSign
- Steve Ryan, counsel to the American Registry for Internet Numbers
- Patrick Jones, senior manager of continuity and risk management for ICANN
- Jeff Brueggeman, vice president for public policy for AT&T
Panelists all expressed a desire to continue to engage in multifaceted talks because a single governmental entity is not the solution; it takes many people working together. As Brueggeman put it, there’s no “silver bullet” for the issue of Internet security.
“What we do on a day-to-day basis is ensure that those conversations take place,” Adams said. “The (critical Internet) resource is not a thing you can touch. You have this mesh of interconnected components that is the critical resource. You can’t pull one of those components out. Everyone must be around that table.”
So what’s the solution? The answer to that question is still a little unclear because Internet service providers and other organizations are often reactive to issues. Brueggeman said it’s time to embrace a forward-thinking approach.
“Things can get complicated when you’re reacting to an attack,” he said. “The best way to deal with these things is to try to think about them up front. How do we detect and prevent rather than react after the fact? How can we have more cooperative information sharing before attacks to try to prevent them and have the best information we can?”
Ryan stressed, though, that not all government is bad. He said citizens and organizations need to think “carefully about what the role of the government is.” But still, there should be a symbiotic relationship.
“There’s become a sense in government policy circles, including in the most sophisticated, that somehow (the Internet) runs on its own and you can’t break it,” he said. “I have news for you: You can break it. We look at government as something that has an increasingly important role because the Internet has an increasingly important role in economies.”
Ryan continued by saying non-governmental organizations have a responsibility to work with governments and to educate the people who work in them. He and the other panelists agreed that an international governmental organization wouldn’t work, though, unless core Internet values are embraced and upheld. They said a set-up that would allow countries around the world to vote on how the Internet is governed would not be a favorable solution.
“Until we get it right,” Ryan said, “I think we’re muddling along rather well.”
DNS issues and DNSSEC
Larson spoke specifically about the security of the Domain Name System because he views the DNS as an absolutely critical Internet resource. “If you don’t have the DNS, you don’t have the Internet,” he noted. He said users can’t truly trust the DNS, though, which is a bit disconcerting because of its necessity.
He supports DNSSEC (Domain Name System Security Extensions), which provides origin authentication through digital signatures, along with data integrity. “Once you have that, you can validate data and have a higher level of confidence that the data you’re getting back is valid,” Larson said.
(You can read more about DNSSEC here: http://en.wikipedia.org/wiki/Dnssec.)
He also said that DNSSEC makes DNS more trustworthy and critical to users as more applications—not just host names—depend on it. “We’re going to look back and realize it enabled a whole new class of applications to put information in the DNS,” Larson said. “Now you can trust the information coming out of the DNS.”
Going from IPv4 to a combination with IPv6
Ryan emphasized the importance of Internet Protocol version 6, IPv6, the new Internet-layer packet protocol that will allow a “gazillion numbers,” vastly expanding the address space online. The pool of numbers remaining under IPv4 is rapidly shrinking. Ryan said the increased flexibility of IPv6 will allow for the continued growth of the Internet, but it won’t be a free-for-all.
“The numbers we have issued are not property,” he said. “We have a legal theory that’s embodied in every contract we’ve issued. They belong to the community. If you’re not using them, you have to give them back. They are in essence an intangible, non-property interest, so over the next couple of years there will be some very interesting legal issues.”
ICANN in action
Jones said ICANN, which recently passed its 10-year milestone, has continued to work collaboratively with the community to take on major initiatives, such as the introduction of internationalized domain names in the root.
“We have taken requests from countries for internationalized country codes and approved 15,” Jones said.
“There’s a huge development in those regions of the world where you can now have domain names and an Internet that reflects their own languages and scripts. That will have an impact as discussion around critical Internet resources continues, especially in the IGF space.”
Physical critical resources
Brueggeman said AT&T has a broader perspective on critical Internet resources because the company is responsible for carrying Web traffic and for the underlying infrastructure, not just for issues tied to the DNS. He said the transition to IPv6 is daunting because it’s not backward-compatible. His main challenge has been in outreach efforts to customers.
“We have to deal with a lot of traffic that’s generated as we’re making changes to DNSSEC and IPv6,” he said. “In some cases, you might create some new security concerns, but overall both are essential transitions.”
Brueggeman emphasized that multistakeholder discussions will be important in the coming years.
“We really need all of the parties who have the direct stake at the table to be part of the solution,” he said. “We need to have the resources handled in a way that promotes openness and promotes interoperability. There’s a huge policy risk of not managing these resources in a multistakeholder way.”
-by Colin Donohue, http://imaginingtheinternet.org
The 2009 IGF-USA session description of this panel is: “Critical Internet Resources (CIR) and the evolution of the Internet’s technical foundations are a central theme of Internet governance debates. Three foundational technological changes – IPv6 (the ‘new’ version of the protocol for the Internet); secure DNS (domain name system security) and secure routing – will underpin the dialogue between key experts from the Internet community, business and government. The successful implementation of these technologies can expand and improve the security of the Internet’s core infrastructures, but deployment raises significant challenges for Internet infrastructure providers and policy makers, and has implications for governance arrangements.”
Brenden Kuerbis, operations director for the Internet Governance Project, based at Syracuse University, served as moderator for a panel that included Alain Durand, director and IPv6 architect, office of the CTO of Comcast; David Conrad, VP for research and IANA Strategy for the Internet Corporation for Assigned Names and Numbers (ICANN); Fiona Alexander, associate administrator, National Telecommunications and Information Administration, U.S. Department of Commerce; and Stephen Ryan, general counsel for the American Registry for Internet Numbers (ARIN).
Kuerbis noted that documents drawn up during the World Summits on the Information Society suggest that critical Internet resources should be managed through global agreements.
“In the third year of IGF, control of CIR was raised forcefully by a member of the Chinese delegation,” Kuerbis said.
Going forward, the management of critical Internet resources is likely to become more contentious. – Brenden Kuerbis
He noted the implementation of IPv6 and attempts to introduce more security will complicate the management of CIR.
David Conrad said there are critical Internet resources at all layers of the Internet infrastructure. Not all are being discussed at IGF. “You need electricity, you need IP addresses, routing infrastructure, ports,” he said. “In my experience in the IGF context the focus has only been on a select set of resources – those that are involved in what ICANN does. Electricity is more important than whether or not you can get a domain name. There is a focus on the developed world.”
He added that DNS security and routing are important topics that once again tend to have the policy dialogue centered around ICANN. “It is a place where most of the decisions are made around critical Internet resources – it is a community, just like the RIRs are communities that develop policies in a community-driven, bottom-up process. I encourage you to participate in these meetings.”
Stephen Ryan of ARIN discussed the Regional Internet Registries and their role in CIR. There are five recognized registries located in regions around the world. They were established in the 1990s. He said each “develops policies in its own regions regarding Internet numbering and associated issues.” The leaders of the five registries also meet to set common global policies. The boards are voluntary, and anyone is invited to participate in the process of governing the RIRs. These organizations provide Whois service and assign and give out numbers – IP addresses.
There was some discussion of the fact that IPv4 addresses are being depleted. This was anticipated years ago, and IPv6 is being adopted. “What’s our biggest challenge in regard to critical Internet resources?” he asked. “The numbers resources and the switch to IPv6. Because the number of IPv4 addresses is fixed, the free pool of remaining IPv4 resources is small.”
Clearly we’re going to have to run IPv4 and IPv6 systems in tandem and that’s going to cause problems. Not many people in America understand IP numbers and that their modems won’t work. – Stephen Ryan
He closed by smiling and saying, “Buy Cisco stock, that’s a tip.”
Alain Durand of Comcast spoke as a panel member who could speak to the CIR concerns of large technology companies.
We are trying to actively participate. The bottom-up policy process has been successful. It has been flexible enough to meet all of our demands and we would like it to go on. – Alain Durand
The depletion of IPv4 addresses is of concern, he said. “If you are a large service provider with many customers and you are growing you are going to be impacted more than individual users,” he said. “We have been concerned about imbalances between the RIRs in the world and that is why we have been participating in RIPE discussions, LACNIC discussions and participated in this process as a member of the community.”
Fiona Alexander of NTIA agreed that too much of the discussion of the World Summit on the Information Society text is absorbed by “people’s preoccupation with the domain name system.”
“The network is so decentralized,” she said in reference to the global Internet and the people engaged in working toward its evolution, “but the one organizing group everyone recognizes tends to be ICANN. When you read the WSIS text it explicitly says there are things beyond domain names. We should look at other things as a national priority and as we go into the global discussion of critical Internet resources.”
She said people in government are recognizing they need to understand the layers of architecture to understand its evolution and address needs.
“As the discussion is progressing in our own government about issues related to Internet or telecommunications, you really have to understand the network architecture to make smart policy.”
You have to understand, more and more, the different layers of this network. Governments are listening; they are interested in these issues. – Fiona Alexander
She added that governments know the uptake of IPv6 is important. “This is on the agenda of governments,” she said. “Our own government is struggling with this. We are working closely with NIST as we look at these issues – it helps that we are both in the Department of Commerce. It’s one of the things we are looking at as we assess the transitions that are fundamental to the network.”
-Janna Anderson, http://www.imaginingtheinternet.org