Posts Tagged ‘Internet Society’
IGF-USA 2012: Critical Internet Resources (CIRs) – Evolution of the Internet’s Technical Foundations
Brief session description:
Thursday, July 26, 2012 – Since the initiation of the Internet Governance Forum (IGF), Critical Internet Resources (CIR) and the evolution of the Internet’s technical foundations have been a central focus of ongoing Internet governance debates. Varied views can engender misunderstandings that influence the opinions of global stakeholders, and different views exist about how to advance CIRs. International governmental approaches are proposed by some, while others strongly support the present bottom-up, consensus-driven models. Three foundational technological changes – IPv6, secure Domain Name System (DNSsec) and secure routing – framed the discussion in this workshop. Deployment of these new technical and organizational approaches raises significant challenges to stakeholders, operations and governance arrangements.
Details of the session:
The moderator for the session was Walda Roseman, chief operating officer of the Internet Society. Panelists included:
- Steve Crocker, chair of the board of the Internet Corporation for Assigned Names and Numbers
- John Curran, president and CEO of the American Registry for Internet Numbers
- Richard Jimmerson, director for deployment and operationalization, Internet Society
- Vernita Harris, deputy associate administrator in the Office of International Affairs of NTIA, US Department of Commerce
Thursday’s IGF-USA conference at Georgetown Law Center featured an assembled panel of government and corporate experts who addressed the controversial issues concerning the control of critical Internet resources.
Walda Roseman, chief operating officer of the Internet Society (ISOC), chaired the discussion on the implementation and security of CIRs.
CIRs include IP addresses, domain names, routing tables and telecommunications, or what Steve Crocker, CEO and co-founder of Shinkuro Inc., Internet Hall of Fame member and chair of the board of ICANN, called the base of Internet architecture upon which everything else is built.
Moving from Internet Protocol Version 4 to IPv6
One of the most pressing concerns regarding CIRs is the transition of the Internet Protocol (commonly referred to as IP) from version 4, still the dominant protocol for Internet traffic, to version 6.
IPv4 uses 32-bit addresses, allowing for approximately 4.3 billion unique IP addresses, but the growth of the Internet has exceeded those limits. IPv6 uses 128-bit addresses, allowing for about 3.4×10^38 unique addresses. This number is equal to approximately 4.8×10^28 addresses for each of the seven billion people alive in 2012.
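The arithmetic behind those figures is easy to check (a quick sketch in Python; the seven-billion population figure is the one cited above):

```python
# Address-space arithmetic for IPv4 vs. IPv6 (illustrative only).
ipv4_total = 2 ** 32    # 32-bit addresses
ipv6_total = 2 ** 128   # 128-bit addresses

# Roughly how many IPv6 addresses per person, for 7 billion people (2012).
per_person = ipv6_total // 7_000_000_000

print(f"IPv4 total:  {ipv4_total:.2e}")   # ~4.3e9
print(f"IPv6 total:  {ipv6_total:.2e}")   # ~3.4e38
print(f"Per person:  {per_person:.2e}")   # ~4.9e28
```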
Because the headers of IPv4 packets and IPv6 packets are quite different, the two protocols are not interoperable, and thus they are both being run side by side in what is called a “dual stack.”
However, IPv6 is, in general, seen as a conservative extension of IPv4. Most transport- and application-layer protocols need little or no change to operate over IPv6. The exceptions are application protocols that embed Internet-layer addresses, such as FTP and NTPv3, where the new address format may conflict with existing protocol syntax.
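The difference in address formats is easy to see with Python's standard ipaddress module (a minimal sketch; the addresses are documentation-range examples, not real hosts):

```python
import ipaddress

v4 = ipaddress.ip_address("192.0.2.1")    # 32-bit IPv4 address
v6 = ipaddress.ip_address("2001:db8::1")  # 128-bit IPv6 address

print(v4.version, v4.max_prefixlen)  # 4 32
print(v6.version, v6.max_prefixlen)  # 6 128

# Protocols and formats that embed addresses as text must cope with the
# new syntax: an IPv6 literal in a URL needs brackets so its colons are
# not mistaken for a port separator.
url_v4 = "http://192.0.2.1:8080/"
url_v6 = "http://[2001:db8::1]:8080/"
```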
Internet service providers, the Internet Society and many large Internet-based enterprises worked to support a World IPv6 Launch on June 6 this year to help accelerate the adoption of IPv6.
John Curran, president and CEO of the American Registry for Internet Numbers, said upgrading to IPv6 is a necessary step for “any enterprise that wants to still be in business in five years,” because it enables them to continue to reach new customers and grow.
When asked about the costs or burdens of upgrading to IPv6 for small businesses, Curran explained that in most cases the burden would fall on the hosting company through which they run their website.
Chris Griffiths, director of high-speed Internet and new business engineering for Comcast, confirmed this, stating his company would have to upgrade to continue to attract new clients.
Security issues always loom large in Internet evolution
The development of the Internet has led to a need for the Domain Name System Security Extensions, or DNSSEC. Curran explained that DNSSEC maintains the integrity of the Internet by ensuring the information users obtain comes from the source they believe they are corresponding with, essentially preventing redirection to fraudulent websites.
Redirection could come from hackers, hijackers and phishers, but also the US government, should initiatives such as SOPA or PIPA pass.
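The validation idea can be sketched in a few lines. The HMAC stand-in below is a deliberate simplification: real DNSSEC uses public-key signatures (RRSIG records validated against DNSKEY records and a chain of trust), not a shared secret, and all names and keys here are invented for illustration:

```python
import hashlib
import hmac

# Invented zone key; real DNSSEC uses asymmetric DNSKEY/RRSIG records.
ZONE_KEY = b"example-zone-signing-key"

def sign_record(name: str, addr: str) -> str:
    """Sign a name->address record (toy stand-in for an RRSIG)."""
    msg = f"{name}={addr}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def validate(name: str, addr: str, sig: str) -> bool:
    """A validating resolver recomputes the signature and compares."""
    return hmac.compare_digest(sign_record(name, addr), sig)

# Authoritative, signed answer from the zone.
name, addr = "www.example.com", "192.0.2.10"
sig = sign_record(name, addr)

assert validate(name, addr, sig)                # genuine answer accepted
assert not validate(name, "203.0.113.66", sig)  # redirected answer rejected
```

The point of the sketch is only that a tampered answer no longer matches its signature, which is how a validating resolver detects redirection.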
“My primary interest is keeping the open Internet alive,” said Richard Jimmerson, director of deployment and operationalization for ISOC. “Somebody in this room will want to invent the next Facebook or Yahoo! Today, that is possible, but if we do not pay attention to certain things, that may not be possible anymore.”
Griffiths said Comcast and other Internet technology companies work together through governance processes now in place to address, for example, the types of security vulnerabilities that can drive action to work to avoid future risk, and in making adjustments in infrastructure and dealing with other emerging challenges.
Conflicts arise over the management of CIRs
The US government currently maintains the most control globally over CIRs. This is not well received by some critics around the world, who fear that the United States may abuse its power. Some have also called for a roadmap of the Internet for the next 20 years.
Curran addressed these concerns by stating that the US government has a positive track record of respectful, neutral administration of its responsibility for CIRs, leaving most operational details to multistakeholder global governance bodies such as the Internet Engineering Task Force and ICANN. He added that a roadmap would not likely be effective, as there are too many unknowns moving forward.
Vernita Harris, deputy associate administrator of the National Telecommunications and Information Administration, explained that the newest Internet Assigned Numbers Authority (IANA) contract indicates it expects that ICANN and aspects of control over the Internet architecture “will be multi-stakeholder driven, addressing the concerns of all users both domestic and international.”
— Brennan McGovern
Brief session description:
Thursday, July 26, 2012 – This major session of the opening plenary of IGF-USA discussed the current state of play of various proposals, ranging from the WCIT to the UN Commission on Science and Technology for Development to Enhanced Cooperation, and the areas where, from some stakeholders’ perspectives, more government involvement or strong improvements in “governance” may be called for. Panelists offered a range of perspectives about government and governance.
Details of the session:
The session was moderated by Marilyn Cade, the chief catalyst of IGF-USA. Panelists included:
- Rebecca MacKinnon, the Bernard L. Schwartz Senior Fellow at the New America Foundation
- Marc Rotenberg, president of the Electronic Privacy Information Center
- Jacquelynn L. Ruff, vice president of International Public Policy and Regulatory Affairs for Verizon Communications
- Paul Brigner, the regional bureau director of the North American Bureau at the Internet Society
- John Curran, president and CEO of the American Registry for Internet Numbers
- Kristin Peterson, co-founder and CEO of Inveneo
- Fiona Alexander, associate administrator of the Office of International Affairs at NTIA
If there’s a keyword lying at the heart of the Internet Governance Forum, it is “multistakeholder.” Key to the forum is the belief that individuals from various backgrounds—from private industry to civil society to government to academia—benefit from gathering and discussing their visions for the future, and the viability thereof. Whether they’re able to reach any consensus after gathering and discussing the issues is another matter entirely.
The 2012 IGF-USA conference, held at Georgetown Law Center in Washington, D.C., Thursday, opened with a panel showing just how diverse these individuals can be, and how varied their focus is in regard to the pressing issues facing the parties looking to influence the continued growth of the Internet.
Rebecca MacKinnon of the New America Foundation opened the seven-member discussion by highlighting the importance of the “digital commons,” the non-commercial backbone providing structure to a number of vital digital institutions. Because of the shared nature of this backbone, which stretches across traditional nation-state boundaries, MacKinnon said she believes the world is on the verge of a reformation of the current governing concepts, as individual states try to gain control over institutions that involve those beyond their jurisdiction.
In the modern era, MacKinnon asserted, individuals are “not just citizens of nation-states and communities, we’re citizens of the Internet.”
“We have to be informed about how power is exercised,” she continued, highlighting a need for everyone involved to play their part in shaping the direction of the Internet’s evolution.
This, in turn, circles back to not just the perceived necessity for multi-stakeholder solutions, but the lingering questions as to how those solutions are reached.
“How do we ensure that the policy-making mechanisms actually allow input from all affected stakeholders?” MacKinnon asked.
She theorized that societies are on the precipice of a “Magna Carta” moment, in which the traditional concepts that dictate the ways in which governments work will be disrupted by this multistakeholder model.
This drew rebuttals, to varying degrees, from other members of the panel.
Fiona Alexander, associate administrator at the Department of Commerce’s National Telecommunications and Information Administration, agreed with MacKinnon that some nations may be standing at that edge, but said the Magna Carta moment isn’t to be expected of every country, or even every stakeholder taking part in current dialogue.
“They [unnamed stakeholders] have in many cases failed to live up to what’s expected of them,” she said, which leaves those advocating for multistakeholder solutions in a situation where they’re defending a model for governance under siege, fostering doubts for its efficacy.
And a large number of those stakeholders are far behind those in developed, Western countries in regard to Internet penetration.
Kristin Peterson, co-founder and CEO of Inveneo, a non-profit organization dedicated to the proliferation of communications technology in the developing world, shared just how much work needs to be done in bridging the gap between dominant Internet stakeholders and those just attaining reasonable access to the Web.
“Internet access is important not just on individual level, but on a functional level, an organizational level,” she said.
Part of this is due to the remoteness of developing, rural areas, which drives up the cost of infrastructure to a counterproductive degree.
A single 1Mbps connection, Peterson highlighted, which would be suitable for a school or a medical clinic, costs upwards of $800 a month in Haiti. Another unnamed country that Inveneo has worked with has less than 100Mbps of bandwidth in total. And that 1Mbps of Internet access? It costs roughly $2,000 per month.
On the opposite end of the spectrum, far removed from countries just beginning to break down the barriers preventing them from gaining full access to the Internet, are stakeholders who, in the minds of some, will have an inordinate amount of influence over multi-stakeholder debates.
Marc Rotenberg, president of the Electronic Privacy Information Center, highlighted the influence of corporate entities as one such problem.
Comparing growing corporate influence over the Internet to “the clouds gathering at the beginning of a Batman movie,” Rotenberg warned those in attendance, “You have to pay attention when the skies darken, things are about to happen.”
One such entity, which Rotenberg accused of having an ever-growing, outsized influence over the Internet, is Google, whose expanding presence on the Web he called the “number-one threat to Internet freedom.”
Regardless of whether that’s the case, such problems do require a means to draw in those affected by the evolving dialogue on Internet governance.
“How do we get people engaged, how do we raise a flag and pull in society, business, governments?” asked John Curran, president and CEO of the American Registry for Internet Numbers.
Curran offered perspective on the scope of the problems facing Internet stakeholders, which appear in multiple layers. At the bottom layer are technological standards and protocols, which require little political involvement. Above them sit domain names and IP addresses, which aren’t necessarily the most hot-button social issues under debate within the halls of Congress, but nonetheless bring about privacy and tracking concerns. At the top are the broad, end-user experiences that draw in such general topics as intellectual property use, censorship and national security.
And, of course, given the nature of IGF, the multistakeholder model is seen as the best means to approach such problems.
Paul Brigner, the regional director of the North American Bureau at the Internet Society and Jacquelynn Ruff, vice president of international public policy and regulatory affairs for Verizon, offered insight into how new players are accepting and integrating into the multistakeholder approach.
Telecommunications firms, well aware of the dwindling demand for their traditional services in the wake of the Internet revolution, are “moving away from focusing on traditional telecommunications to Internet protocol and Internet issues,” Brigner said.
An issue such as the possible transition to a sending-party-pays structure, for example, demands the inclusion and participation of a multitude of affected parties. Under such a regime, “You’re not free, necessarily, to innovate at low cost like you experience today,” Brigner said. “It’s the end-to-end nature of the Internet that allows these sorts of things to evolve.”
To alleviate some of the difficulty inherent in such discussions, Ruff cited the importance of enhanced cooperation, the notion of mapping past developments, current deficiencies and projecting future ambitions in a way that involves all interested parties. Emphasizing examples within UNESCO, ICANN and the Council of Europe, Ruff celebrated enhanced cooperation’s increasing rate of adoption.
The world is at “a fork in the road on the global discussion on where the future lies,” she said. And applying enhanced cooperation to the traditional multi-stakeholder methodology could be an effective means to remedy the arguments over which path to take.
That said, a plethora of stakeholders have their own interpretations, and they will seize the opportunities granted by this IGF event and future conferences to throw their hats into the ring drawn by the opening plenary session’s panelists.
— Morgan Little
Internet Governance Forum-USA 2011: New Challenges to Critical Internet Resources – Blocking and Tackling: New Risks and Solutions
Brief session description:
The security, stability and resiliency of the Internet are recognized as vital to the continued successful growth of the Internet as a platform for worldwide communication, commerce and innovation. This panel focused on domain name service blocking and filtering and the implementation of Internet Protocol version 6 (IPv6) and critical Internet resources in general. The panel addressed some of the implications of blocking and filtering; the implementation of DNSSEC and IPv6 on a national basis; and future challenges to the Internet in a mobile age.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to new challenges, according to panelists in a session on critical Internet resources at the IGF-USA conference July 18 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. A major challenge now and in years to come is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
This workshop was conducted in two parts, both moderated by Sally Wentworth, senior manager of public policy for the Internet Society. Panelists included:
- John Curran, president and CEO of the American Registry for Internet Numbers (ARIN)
- Steve Crocker, CEO and co-founder of Shinkuro, and vice chair of the board for ICANN, the Internet Corporation for Assigned Names and Numbers
- George Ou, expert analyst and blogger for High Tech Forum (http://www.hightechforum.org/)
- Rex Bullinger, senior director for broadband technology for the National Cable and Telecommunications Association (NCTA)
- Paul Brigner, chief technology policy officer, Motion Picture Association of America (MPAA)
- Bobby Flaim, special agent in the Technical Liaison Unit of the Operational Technology Division of the FBI
- David Sohn, senior policy counsel for the Center for Democracy and Technology (CDT)
- Don Blumenthal, senior policy adviser for the Public Interest Registry (PIR)
- Jim Galvin, director of strategic relationships and technical standards for Afilias
Wentworth began the session by referring to it by an alternate title, “Blocking and Tackling.” The alternate title proved appropriate, as the two sets of panelists became more and more passionate in their assertions during the discussions on two topics that affect the future health of the Internet: the implementations of IP version 6, or IPv6, and especially Domain Name System (DNS) blocking and filtering.
The first panel consisted of John Curran, Steve Crocker, Rex Bullinger, Jim Galvin and Bobby Flaim. It centered on a discussion of IPv6 and the difficulty of implementing a system that is viewed as the “broccoli of the Internet” – something that is technologically necessary, but less of a priority for consumers because the switch is not incentivized.
The technological necessity is an inevitability. IPv4 has 4.3 billion independent IP addresses. The central pool of addresses ran dry Feb. 3, 2011, when the last five blocks were passed out to the five regional address distributors. Depending on the rate of growth, Curran explained, those addresses may not last very long. In fact, the Asia-Pacific region has already handed out all its addresses. The 7 billion people in the world can’t fit into 4.3 billion addresses, especially when most have more than one address to their names.
“We have to move, the hard part is getting them to move,” Curran said, referring to consumers.
The biggest problem is that IPv6 and IPv4 are two languages that don’t talk to each other.
“You have to enable IPv6 in order to speak to IPv6 Internet users. It’s literally possible for us to break the global Internet if we don’t transition to IPv6,” Curran said.
The dual-stack model was identified by the Internet Engineering Task Force (IETF) as the most effective and efficient way to begin integrating IPv6 into the network.
“What we have is a coexistence process rather than a transition,” Crocker explained. “We’re going to have these two existing in parallel. You’ve got pure IPv4 exchanges and conversations, pure IPv6 exchanges, and then you’ll have somewhat more complex interchanges between IPv4 users and IPv6 systems, and those will require some sort of translation.”
“The ISPs,” Curran said, “are stuck in the trap as to whether there’s enough demand to make the switch.”
Currently, most devices, such as laptops, are IPv6 enabled and have been for some time, so they’re coexisting, rather than transitioning directly from IPv4, Curran said.
“Things are not going to be replaced overnight,” Galvin said. “The providers are ready, but the customers aren’t. The laptops are ready, but the interaction is not ready.”
There are other problematic implications of enacting the new IP version, particularly in ensuring visibility into how IP addresses are allocated and to whom, and in logging NATs (network address translators), according to Flaim. Another element was raised by an audience member: the possible advantages held by countries with less infrastructure than the United States. Is the United States being left behind because its Internet is too big? Crocker contended that it is a plausible scenario that some developing regions could leapfrog IPv4 and go directly to IPv6.
DNS Blocking and Filtering
The second panel of the afternoon consisted of Crocker, George Ou, Paul Brigner, Galvin, David Sohn and Don Blumenthal and centered on a lively discussion about the merits of DNS blocking and filtering as an answer to copyright infringement.
The panel was divided on its views – some felt that DNS filtering and blocking represented the perfect answer to copyright infringement and media piracy, while others felt that a solution based on technical adjustment was the wrong way to approach what they deemed a human problem, not a technical problem. While a consensus was not achieved, the discussion addressed serious concerns about the damaging effects of illegal downloading on the content production industry, as well as the greater constitutional, technical and political implications of the use of DNS filtering and blocking.
Panelists referenced the Protect IP legislation that is currently in the U.S. Senate. The legislation is aimed at off-shore websites that infringe copyright laws by hosting pirated media. One of the ways the bill works is to undercut the sites by going after their advertising and funding sources.
The trouble, Crocker explained, is that the blockages or filters are not only “trivial to get around,” but the motivation to get around them is there. Sohn agreed that DNS filtering is not beneficial, especially in this case, because it is easy to evade. Using DNS filtering to prevent users from accessing entire domains that may contain illegal content, rather than addressing the specific content itself, is too broad, Sohn suggested.
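Why such filtering is “trivial to get around” can be sketched in a few lines: DNS blocking removes only the name lookup, not the site itself, so switching to an unfiltered resolver (or connecting by IP address directly) defeats it. The resolver tables and addresses below are invented for illustration:

```python
# Toy resolvers: the filtered one refuses the name, the alternate does not.
FILTERED_RESOLVER = {"blocked-site.example": None}           # simulated NXDOMAIN
ALTERNATE_RESOLVER = {"blocked-site.example": "203.0.113.7"}  # unfiltered answer

def resolve(resolver: dict, name: str):
    """Look a name up in a (toy) resolver's table."""
    return resolver.get(name)

# The filter stops a lookup made through the filtered resolver...
assert resolve(FILTERED_RESOLVER, "blocked-site.example") is None

# ...but not a lookup through any other resolver, nor a direct
# connection to an IP address the user already knows.
assert resolve(ALTERNATE_RESOLVER, "blocked-site.example") == "203.0.113.7"
```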
Galvin agreed: “While content providers have a legitimate need to protect their assets, there seems to be the automatic assumption that DNS filtering is the right way to do that. You’re concerned about content, so you want to do content filtering.”
Galvin cautioned the panel and the audience to be aware of consequential damages.
Sohn also raised concerns about prior restraint. “Under prior restraint law, you cannot restrict speech unless it’s actually proven illegal,” he said. “DNS filtering orders would often occur on a preliminary basis, as part of a restraining order, rather than after a full legal determination by a court.”
There was further discussion that, on a technical level, the use of these tactics would be problematic because it would break DNSSEC and destabilize the Internet, especially since DNSSEC was designed to detect just that type of redirection, Crocker maintained.
On the other side of the issue, Brigner explained that in using DNS filtering, criminal sites would be removed from the “global phonebook,” preventing individuals from accessing them and propagating the consumption of illegal media.
“We’re not asking for a new row of cannons,” he said in reference to Crocker’s earlier metaphor about the Vasa, a Swedish warship that sank because its poor design was based on the king’s desire for more firepower in spite of architectural faults. “Let’s use sound engineering principles.”
An audience member’s suggestion of an industry seal program was met with varying levels of support and dissent. Ou noted that an industry seal could be easily counterfeited, while others felt that a “human” solution rather than a technical one was a more appropriate answer to the problem.
In the end, the dialogue raised many new questions about the real world implications of IPv6 and DNS blocking technologies.
“It’s important that these various communities find a way to address the issues in a way that’s respectful of the technology, respectful of the global Internet and people’s need to carry out their business, and the freedom of expression,” Wentworth said in closing.
“There’s a lot of competing interests, but they don’t have to be mutually exclusive.”
– Bethany Swanson
Brief session description:
IGF participants broke into three rooms to discuss three possible future scenarios for the Internet in 2025. In this session, the brief description given to the discussants noted that the “Government Prevails” scenario imagines a future affected by man-made and natural challenges and disasters – wars, civil strife, an aging world and interventionist governments. This scenario assumes that while “the ICT industry, media companies and NGOs” are the leading players on the Internet stage today [some people might disagree with this assumption], by 2025 governments and inter-governmental organizations will have come to rule the Internet as a result of actions taken to protect particular interests from negative exposure.
Details of the session:
A small group of Internet stakeholders from various sectors met to discuss the Government Prevails potential-future scenario at the Internet Governance Forum-USA 2011 at Georgetown University Law Center.
This scenario sets up a closed-off future for the Internet; a one-page PDF was used to launch the discussion.
The potential future drivers of change people were asked to consider included:
- Manmade and natural disasters push governments to exert more control over Internet resources.
- Changes in the Domain Name System force intergovernmental organizations to impose new global regulatory regimes.
- Networked image sensing through devices such as Kinect and GPS is used to identify and track people, with positive and negative effects, but the net result is a global surveillance culture.
- Governments limit bandwidth for video conferencing when they find revenues for hotels, airlines and other travel-related economic entities in sharp decline.
- Lawsuits and other developments cause governments to create blacklists of websites prohibited from Internet access.
- Anonymity on the Internet is brought to an end as a response to viruses, worms and credit card fraud, and user authentication is required.
- Governments take every opportunity to coordinate and consolidate power under various mandates for global solutions, and by 2025 governments and law enforcement are deeply embedded in all aspects of the Internet.
NetChoice Executive Director Steve DelBianco began the session by sharing the drivers of this future and what the Internet might look like in 2025.
“The scenario at its key is an attempt to be provocative about a potential future,” said DelBianco, who emphasized this session was supposed to search for what could be plausible and to develop opinions on the possible benefits and disadvantages of a future and what could be done to mitigate its impact.
“Is this the George Orwell scenario where it is a question of not whether but when?” Roseman said.
Although there was a list of questions the leaders intended to discuss, the session quickly turned into a running debate, bouncing from topic to topic as the participants introduced them. Two main themes quickly emerged.
The first was the conflict between security versus privacy.
Carl Szabo cited the situation in London, where hundreds of security cameras were added to city streets with the intention of reducing crime. The result was criminals adapting to the increased surveillance by wearing hooded sweatshirts.
“As we give away these rights and privileges for alleged increased security, it’s not necessarily going to return with security,” he said.
Slava Cherkasov, with the United Nations, brought up the recent case of Brooklyn boy Leiby Kletzky, who was allegedly abducted, murdered and dismembered by a stranger, Levi Aron. In that case, it was a security camera outside a dentist’s office that led to Aron’s arrest, confession and the recovery of the boy’s body within an hour of investigators viewing the footage.
Judith Hellerstein, with the D.C. Internet Society, said that government use of data is acceptable when there is an understanding about privacy and intent.
“You also have to sort of figure out how governments are going to use that technology in hand,” she said.
In the scenario, an issue based on real events was introduced: pictures of protesting crowds were tagged, allowing for the identification of people at the scene of a potential crime.
Elon University student Ronda Ataalla expressed concern over limiting tagging in photographs, because it was a limit on expression.
But David McGuire of 463 Communications reminded the room that civil liberties traditionally don’t poll well.
“Free speech isn’t there to protect the speech we all like,” he said.
DelBianco expanded the tagging issue to raise the issue of “vigilante justice,” people using debatably privacy-violating practices to identify people they consider wrong-doers, and brought up Senate Bill 242 in California, which would alter the way social networks create default privacy settings for users. The bill was narrowly defeated, 19-17, on June 2.
Chris Martin of the USCIB noted that not all companies are interested in using their technology for ill or for personal gain, citing Google’s decision to withhold facial recognition technology to protect people’s privacy.
This subject is also related to the second main discussion topic: the government versus industry and the private sector.
Covington questioned Martin about whether he saw governments developing that same facial recognition technology, as described in the scenario, and using it to monitor citizens.
“Some,” was his reply, before adding that all Internet governance was about maximizing good and minimizing evil.
There was then a brief discussion about the Patriot Act and relinquishing civil liberties online in the circumstances of a national emergency. Who decides when the emergency has passed?
Szabo and others questioned if the government was even the right organization to take over in the event of a disaster.
“It’s much easier to say, ‘Let them deal with it so I don’t have to,’ but the question is, ‘Will they do it better?’” he said.
Cherkasov said not necessarily, mentioning that when Haiti was struck by the severe earthquake in January 2010, it took two weeks for government organizations to develop a database to search for missing people, but in Japan in March 2011, it took Google only 90 minutes to come up with the same technology. He then returned to the security camera situation, concluding that citizens were the first line of response and information in a disaster scenario.
“There will always be maybe an ebb and a flow but it’s the power of the people that will ultimately be able to create that balance,” Roseman said. “But it’s going to have to be a proactive effort to get and keep that balance.”
Roseman also said one of the benefits of industry and the private sector is the ability to use funds more freely than government, which presumably operates on a limited budget.
“When you have governments and the private sector and industry working together, you generate a lot more money and opportunity to drive change,” she said.
McGuire, though, expressed concern that industry and the private sector have some misconceptions about the power of the Internet, believing that it is too powerful for any law or government to cut it down. He said many, including those in the area of Silicon Valley, Calif., think the Internet will always be able to circumvent policy.
Most session participants seemed to agree that the potential scenario was troubling.
“It makes me want to move to somewhere where there are more sheep than humans,” joked Covington.
But Brett Berlin, of George Mason University, said that the choices made about governing the Internet are ultimately people-driven decisions, reminding the rest of the room that technology works for people and not the other way around.
“If we are foolish enough to think that open Internet will fundamentally allow us to be better, we are making a mistake.”
– Rachel Southmayd
Brief session description:
This panel, moderated by Robert Guerra of Freedom House, focused on critical Internet resources and how to ensure that the underlying principles that have led to the Internet’s success persist in the face of security challenges. These principles include openness (open standards, open technologies), accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. Key to implementing these principles is a broadened understanding of the role of infrastructure providers, such as the global and national Internet service and connectivity providers who build and operate the backbones and edge networks. The panel was also expected to address some of the implications of implementing DNSSEC and IPv6 on a national basis, deployments that contribute to the security and resiliency of CIRs on a global basis.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to increasing security challenges, according to panelists in a critical Internet resources workshop at the IGF-USA conference July 21 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. The major challenge of our generation is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
Panelists at this workshop included:
- Moderator Robert Guerra, Freedom House
- Trent Adams, outreach specialist for the Internet Society
- Matt Larson, vice president of DNS research for VeriSign
- Steve Ryan, counsel to the American Registry for Internet Numbers
- Patrick Jones, senior manager of continuity and risk management for ICANN
- Jeff Brueggeman, vice president for public policy for AT&T
Panelists all expressed a desire to continue to engage in multifaceted talks because a single governmental entity is not the solution; it takes many people working together. As Brueggeman put it, there’s no “silver bullet” for the issue of Internet security.
“What we do on a day-to-day basis is ensure that those conversations take place,” Adams said. “The (critical Internet) resource is not a thing you can touch. You have this mesh of interconnected components that is the critical resource. You can’t pull one of those components out. Everyone must be around that table.”
So what’s the solution? The answer to that question is still a little unclear because Internet service providers and other organizations are often reactive to issues. Brueggeman said it’s time to embrace a forward-thinking approach.
“Things can get complicated when you’re reacting to an attack,” he said. “The best way to deal with these things is to try to think about them up front. How do we detect and prevent rather than react after the fact? How can we have more cooperative information sharing before attacks to try to prevent them and have the best information we can?”
Ryan stressed, though, that not all government is bad. He said citizens and organizations need to think “carefully about what the role of the government is.” But still, there should be a symbiotic relationship.
“There’s become a sense in government policy circles, including in the most sophisticated, that somehow (the Internet) runs on its own and you can’t break it,” he said. “I have news for you: You can break it. We look at government as something that has an increasingly important role because the Internet has an increasingly important role in economies.”
Ryan continued by saying non-governmental organizations have a responsibility to work with governments and to educate the people who work in them. He and the other panelists agreed that an international governmental organization wouldn’t work, though, unless core Internet values are embraced and upheld. They said a set-up that would allow countries around the world to vote on how the Internet is governed would not be a favorable solution.
“Until we get it right,” Ryan said, “I think we’re muddling along rather well.”
DNS issues and DNSSEC
Larson spoke specifically about the security of the Domain Name System because he views the DNS as an absolutely critical Internet resource. “If you don’t have the DNS, you don’t have the Internet,” he noted. He said users can’t truly trust the DNS, though, which is a bit disconcerting because of its necessity.
He supports DNSSEC (Domain Name System Security Extensions), which adds digital signatures to DNS data, providing origin authentication and data integrity. “Once you have that, you can validate data and have a higher level of confidence that the data you’re getting back is valid,” Larson said.
(You can read more about DNSSEC here: http://en.wikipedia.org/wiki/Dnssec.)
He also said that DNSSEC makes DNS more trustworthy and critical to users as more applications—not just host names—depend on it. “We’re going to look back and realize it enabled a whole new class of applications to put information in the DNS,” Larson said. “Now you can trust the information coming out of the DNS.”
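The trust Larson describes rests on a chain of digest-and-signature checks running from the DNS root down to each signed zone. As a rough illustration of one link in that chain: a parent zone publishes a DS record containing a digest of the child zone’s public key (DNSKEY), and a validating resolver recomputes that digest to confirm the key it received is the one the parent vouched for. The sketch below is a deliberately simplified model of only the digest comparison, in Python; real DNSSEC hashes the owner name together with the DNSKEY RDATA in wire format and also verifies RRSIG signatures, none of which is shown here, and the key bytes are placeholders.

```python
import hashlib

def make_ds_digest(dnskey_bytes: bytes) -> str:
    """Parent-side: publish a SHA-256 digest of the child zone's key.
    (Simplified: real DS digests cover owner name + DNSKEY RDATA.)"""
    return hashlib.sha256(dnskey_bytes).hexdigest()

def validate_dnskey(dnskey_bytes: bytes, ds_digest: str) -> bool:
    """Resolver-side: check a received DNSKEY against the parent's DS record."""
    return hashlib.sha256(dnskey_bytes).hexdigest() == ds_digest

# Placeholder key material for illustration only.
child_key = b"example.com. DNSKEY 257 3 8 AwEAAa..."
ds = make_ds_digest(child_key)

assert validate_dnskey(child_key, ds)         # the genuine key validates
assert not validate_dnskey(b"forged key", ds) # a substituted key is rejected
```

The point of the exercise is the one Larson makes: without the parent-published digest, a resolver has no way to tell a forged answer from a genuine one.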
Going from IPv4 to a combination with IPv6
Ryan emphasized the importance of Internet Protocol version 6 (IPv6), the successor to IPv4 at the Internet layer, which will allow a “gazillion numbers,” vastly expanding the address space online. The pool of unallocated IPv4 numbers is rapidly shrinking. Ryan said the increased flexibility of IPv6 will allow for the continued growth of the Internet, but it won’t be a free-for-all.
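The scale of the expansion Ryan alludes to is easy to make concrete: IPv4 addresses are 32 bits and IPv6 addresses are 128 bits, so even a single standard IPv6 LAN allocation dwarfs the entire IPv4 Internet. A small check using Python’s standard-library `ipaddress` module (the example network is the documentation prefix 2001:db8::/64, not a real allocation):

```python
import ipaddress

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
ipv4_space = 2 ** 32    # about 4.3 billion addresses
ipv6_space = 2 ** 128   # about 3.4 x 10**38 addresses

print(f"IPv4 space: {ipv4_space:,}")
print(f"IPv6 is {ipv6_space // ipv4_space:,} times larger")

# A single /64 -- the standard IPv6 subnet handed to one LAN -- holds
# more addresses than the whole of IPv4.
lan = ipaddress.ip_network("2001:db8::/64")
assert lan.num_addresses == 2 ** 64
assert lan.num_addresses > ipaddress.ip_network("0.0.0.0/0").num_addresses
```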
“The numbers we have issued are not property,” he said. “We have a legal theory that’s embodied in every contract we’ve issued. They belong to the community. If you’re not using them, you have to give them back. They are in essence an intangible, non-property interest, so over the next couple of years there will be some very interesting legal issues.”
ICANN in action
Jones said ICANN, which recently passed its 10-year milestone, has continued to work collaboratively with the community to take on major initiatives, such as the introduction of internationalized domain names in the root.
“We have taken requests from countries for internationalized country codes and approved 15,” Jones said.
“There’s a huge development in those regions of the world where you can now have domain names and an Internet that reflects their own languages and scripts. That will have an impact as discussion around critical Internet resources continues, especially in the IGF space.”
Physical critical resources
Brueggeman said AT&T has a broader perspective on critical Internet resources because the company is responsible for carrying Web traffic and for the underlying infrastructure, not just for issues tied to the DNS. He said the transition to IPv6 is daunting because it’s not backward-compatible, and his main challenge has been in outreach efforts to customers.
“We have to deal with a lot of traffic that’s generated as we’re making changes to DNSSEC and IPv6,” he said. “In some cases, you might create some new security concerns, but overall both are essential transitions.”
Brueggeman emphasized that multistakeholder discussions will be important in the coming years.
“We really need all of the parties who have the direct stake at the table to be part of the solution,” he said. “We need to have the resources handled in a way that promotes openness and promotes interoperability. There’s a huge policy risk of not managing these resources in a multistakeholder way.”
-by Colin Donohue, http://imaginingtheinternet.org