Posts Tagged ‘multistakeholder’
IGF-USA 2012: Critical Internet Resources (CIRs) – Evolution of the Internet’s Technical Foundations
Brief session description:
Thursday, July 26, 2012 – Since the initiation of the Internet Governance Forum (IGF), Critical Internet Resources (CIRs) and the evolution of the Internet’s technical foundations have been a central focus of ongoing Internet governance debates. Divergent views about how to advance CIRs can engender misunderstandings that influence the opinions of global stakeholders. Some propose international governmental approaches, while others strongly support the present bottom-up, consensus-driven models. Three foundational technological changes – IPv6, DNS security (DNSSEC) and secure routing – framed the discussion in this workshop. Deployment of these new technical and organizational approaches raises significant challenges for stakeholders, operations and governance arrangements.
Details of the session:
The moderator for the session was Walda Roseman, chief operating officer of the Internet Society. Panelists included:
- Steve Crocker, chair of the board of the Internet Corporation for Assigned Names and Numbers
- John Curran, president and CEO of the American Registry for Internet Numbers
- Richard Jimmerson, director for deployment and operationalization, Internet Society
- Vernita Harris, deputy associate administrator in the Office of International Affairs of NTIA, US Department of Commerce
Thursday’s IGF-USA conference at Georgetown Law Center featured an assembled panel of government and corporate experts who addressed the controversial issues concerning the control of critical Internet resources.
Walda Roseman, chief operating officer of the Internet Society (ISOC), chaired the discussion on the implementation and security of CIRs.
CIRs include IP addresses, domain names, routing tables and telecommunications, or what Steve Crocker, CEO and co-founder of Shinkuro Inc., Internet Hall of Fame member and chair of the board of ICANN, called the base of Internet architecture upon which everything else is built.
Moving from Internet Protocol Version 4 to IPv6
One of the most pressing concerns regarding CIRs is the transition of the Internet Protocol (commonly referred to as IP) from version 4, still the dominant protocol for Internet traffic, to version 6.
IPv4 uses 32-bit addresses, allowing for approximately 4.3 billion unique IP addresses, but the growth of the Internet has exceeded those limits. IPv6 uses 128-bit addresses, allowing for about 3.4×10³⁸ unique addresses. This number is equal to approximately 4.8×10²⁸ addresses for each of the seven billion people alive in 2012.
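The arithmetic behind those figures is easy to verify. A quick check in Python (purely illustrative, not part of the session):

```python
# Address-space sizes for 32-bit IPv4 and 128-bit IPv6
ipv4_total = 2 ** 32   # about 4.3 billion addresses
ipv6_total = 2 ** 128  # about 3.4 x 10^38 addresses

print(f"IPv4: {ipv4_total:,}")    # 4,294,967,296
print(f"IPv6: {ipv6_total:.3e}")  # 3.403e+38

# Addresses available per person, for ~7 billion people (2012)
per_person = ipv6_total / 7e9
print(f"Per person: {per_person:.2e}")  # 4.86e+28
```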
Because headers on IPv4 packets and IPv6 packets are quite different, the two protocols are not interoperable, and so many networks run both in parallel in what is called a “dual stack.”
However, IPv6 is, in general, seen to be a conservative extension of IPv4. Most transport and application-layer protocols need little or no change to operate over IPv6. The exceptions to this are the application protocols that embed internet-layer addresses, such as FTP and NTPv3. In these, the new address format may cause conflicts with existing protocol syntax.
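The difference between the two address formats can be seen directly with Python’s standard `ipaddress` module; a brief illustrative sketch:

```python
import ipaddress

v4 = ipaddress.ip_address("192.0.2.1")    # IPv4 documentation-range address
v6 = ipaddress.ip_address("2001:db8::1")  # IPv6 documentation-range address

print(v4.version, v4.max_prefixlen)  # 4 32
print(v6.version, v6.max_prefixlen)  # 6 128

# A dual-stack host holds both kinds of address at once; the two are
# distinct types that cannot be mixed or compared directly.
print(type(v4).__name__)  # IPv4Address
print(type(v6).__name__)  # IPv6Address
```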
Internet service providers, the Internet Society and many large Internet-based enterprises worked to support a World IPv6 Launch on June 6 this year to help accelerate the adoption of IPv6.
John Curran, president and CEO of the American Registry for Internet Numbers, said upgrading to IPv6 is a necessary step for “any enterprise that wants to still be in business in five years,” because it enables them to continue to reach new customers and grow.
When asked about the costs or burdens of upgrading to IPv6 for small businesses, Curran explained that in most cases the burden would fall on the hosting company through which they run their website.
Chris Griffiths, director of high-speed Internet and new business engineering for Comcast, confirmed this, stating his company would have to upgrade to continue to attract new clients.
Security issues always loom large in Internet evolution
The growth of the Internet has led to a need for the Domain Name System Security Extensions, or DNSSEC. Curran explained that DNSSEC maintains the integrity of the Internet by ensuring the information users obtain actually comes from the source they believe they are corresponding with, essentially preventing redirection to fraudulent websites.
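Conceptually, a DNSSEC-validating resolver checks a cryptographic signature over each DNS record before trusting it. As a loose illustration only (real DNSSEC uses public-key pairs published as DNSKEY and RRSIG records, not the shared secret assumed here), that integrity check can be sketched in Python:

```python
import hashlib
import hmac

# Hypothetical zone key for illustration; real DNSSEC signs with a
# private key and publishes the public half in a DNSKEY record.
ZONE_KEY = b"example-zone-key"

def sign_record(name: str, rdata: str) -> str:
    """Produce a signature over a DNS record, as a zone signer would."""
    msg = f"{name}|{rdata}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def verify_record(name: str, rdata: str, sig: str) -> bool:
    """A validating resolver recomputes the signature and compares."""
    return hmac.compare_digest(sign_record(name, rdata), sig)

sig = sign_record("www.example.com", "192.0.2.10")
print(verify_record("www.example.com", "192.0.2.10", sig))   # True
# An attacker redirecting the name to a different address fails validation:
print(verify_record("www.example.com", "203.0.113.66", sig)) # False
```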
Redirection could come from hackers, hijackers and phishers, but also the US government, should initiatives such as SOPA or PIPA pass.
“My primary interest is keeping the open Internet alive,” said Richard Jimmerson, director of deployment and operationalization for ISOC. “Somebody in this room will want to invent the next Facebook or Yahoo! Today, that is possible, but if we do not pay attention to certain things, that may not be possible anymore.”
Griffiths said Comcast and other Internet technology companies now work together through governance processes to address security vulnerabilities, adjust infrastructure and deal with other emerging challenges, acting early to avoid future risk.
Conflicts arise over the management of CIRs
The US government currently maintains more control over CIRs than any other single actor. This is not well received by some critics around the world, who fear that the United States may abuse its power. Some have also called for a roadmap of the Internet’s development over the next 20 years.
Curran addressed these concerns by stating that the US government has a positive track record of respectful and neutral administration of its responsibility for CIRs, leaving most operational details to multistakeholder global governance bodies such as the Internet Engineering Task Force and ICANN. He added that such a roadmap would likely not be effective, as there are too many unknowns moving forward.
Vernita Harris, deputy associate administrator of the National Telecommunications and Information Administration, explained that the newest Internet Assigned Numbers Authority (IANA) contract indicates it expects that ICANN and aspects of control over the Internet architecture “will be multi-stakeholder driven, addressing the concerns of all users both domestic and international.”
— Brennan McGovern
IGF-USA 2012 Opening Plenary Remarks: Ambassadors Phil Verveer and Terry Kramer advocate Internet freedom, multi-stakeholder model
Brief session description:
Thursday, July 26, 2012 – Ambassador Phil Verveer, coordinator for international communications and information policy at the US State Department, offered opening remarks and introduced Terry Kramer, the former president of Vodafone North America, who was appointed in the spring of 2012 to be US Ambassador to the World Conference on International Telecommunications, which will take place Dec. 3-13 in Dubai, United Arab Emirates. The International Telecommunication Union description of WCIT: “The conference is a review of the current International Telecommunications Regulations, which serve as the binding global treaty outlining principles that govern the way international voice, data and video traffic is handled, and which lay the foundation for ongoing innovation and market growth.”
Details of the session:
Ambassador Phil Verveer, US coordinator for International Communications and Information Policy, emphasized the importance of Internet freedom at the Internet Governance Forum-USA Thursday morning at Georgetown Law Center.
The Universal Declaration of Human Rights, adopted by the United Nations General Assembly in 1948, directly supports Internet freedom, Verveer said.
“Article 19:2 (states), ‘Everyone has the right to freedom of opinion and expression. This right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers,’” he said. “Every human is entitled to these rights simply by being human.”
Although it has been endorsed by many nations, the declaration is not binding under international law, and Verveer acknowledged that differing government philosophies result in different Internet policies.
“There is a compelling case for Internet freedom grounded in human rights, but the problem, of course, is that it is not nearly enough to persuade some countries that have strong reasons to interfere with Internet freedom,” he said.
Verveer pointed out that the declaration is not the only support for Internet freedom; economics also provides a strong incentive to liberalize Internet policy.
From an economic standpoint, the argument for Internet freedom is straightforward: The Internet is an enormous commercial channel, and there is a positive correlation between its accessibility and its economic potential.
“There is the fundamental intuition that serious reductions in innovation will handicap economic growth,” Verveer said.
Verveer said he expects that delegates to the 2012 World Conference on International Telecommunications (WCIT) in Dubai will agree to uphold the International Telecommunication Regulations (ITRs) adopted in 1988.
“The United States will … prevent changes of ITRs that would … constitute a reversal of the liberalized telecommunications environment that has prevailed virtually everywhere in the world since 1988,” he said. “Our principal goal for WCIT involves maintaining this enabling environment, with complete confidence that if we are successful the benefits of information and communications technology will continue to increase and to expand to billions of additional people.”
Verveer then yielded the stage to Terry Kramer, US ambassador to the WCIT. Kramer charged the audience to think critically about the Internet’s future and about the messages relayed to other stakeholders in the global network.
“When it comes time for us to advocate directly (for Internet freedom), it will be very important that we come from a position of knowledge and fact, not just ideology,” he said. “(We must) be able to speak from knowledge about what worked in the past and how we see the future evolving.”
Kramer attested to the value of the multistakeholder model, given the distributed nature of the Internet and the diversity of its users. He emphasized the need to meet with international players at the forefront of the Internet’s evolution. “The multistakeholder model is the only effective one that will work,” he said. “The Internet is too global to have one organization in control. …We need to get examples of what success looks like (across the world).”
Kramer warned that some stakeholders’ ambitions are likely to oppose Internet freedom, openness and accessibility.
“There have been several proposals that … are worrisome,” he said. “One category of these is the control of traffic and the control of content. From every angle, that results in a bad outcome. It creates cynicism … and workaround solutions. … But there will be wisdom and good ideas here that we can effectively advocate.”
— Katie Blunt
Brief session description:
Marilyn Cade, chair of the IGF-USA Steering Committee, led a closing discussion that also included remarks from Markus Kummer, executive administrator for the global IGF Secretariat; Larry Strickling, assistant secretary of the National Telecommunications and Information Administration (NTIA), which is part of the U.S. Department of Commerce; and Deimante Bartkiene, a representative of the Lithuanian Embassy, who invited IGF-USA attendees to the global IGF, taking place in Vilnius, Lithuania, Sept. 14-17.
Details of the session:
Marilyn Cade, president of ICT Strategies, asked the gathered audience during the closing session of IGF-USA 2010 to suggest at least five ways the IGF process can be improved in the future. She received more input than that. Here are a few of the ideas:
- The “users reign” scenario isn’t based in reality right now. The only way the scenario can come to fruition is if the people involved in global IGF efforts help design it and make it work.
- People should not demonize innovative companies that make mistakes. When companies take risks, let them fail, call them out but don’t overreact or issue calls for new laws to stop an experiment from ever happening again.
- The people involved with IGF should embrace transparency, inclusion and collaboration. Inclusion, in particular, means reaching out to parties that don’t show up to participate in opportunities like IGF-USA. The IGF effort should increase awareness, extend more outreach and have broader information available to people.
- The organizers of IGF should extend participation, particularly remote participation (ability to “attend” virtually, online), to the conferences.
- The Internet is inherently not like real life, and the more we try to make it like real life, the less appealing it will be to users. The people participating in the discussions at IGF should keep this sentiment in mind going forward.
- The IGF organizers should more clearly articulate the roles of the different Internet stakeholders and organizations, define and implement a funding model for IG and enact some form of output for the IGF itself.
- The IGF should have more voices from emerging markets and the private sector at the table.
- A final piece of advice: Make sure that what the people involved in IGF ask for will produce the best result. Don’t change the mandate; just renew it.
Strickling said in his closing remarks that the U.S. government is committed to the continuation of the IGF in its current form. He said allowing a multistakeholder discussion will only enhance the accessibility of the Internet.
“Internet stakeholders across the globe are committed to this type of forum,” he said. “We want to make sure IGF is not just about dialogue. We need to make sure lessons learned from these discussions are put into action. I don’t imagine I am alone in thinking that open dialogue in IGF is an ideal way to enhance trust in these stakeholders.
“Changes that place one group above another in IGF would ultimately undermine this model.”
Kummer closed by saying that the IGF mandate will be up for a vote in the United Nations General Assembly later this year, and he added that the General Assembly will almost certainly vote to extend it. But he is concerned about what kind of changes might be suggested.
“Now we will have to find synthesis between two tendencies: the Internet will stay with us and nation-states will stay with us,” Kummer said. “We see the IGF as a synthesis between these two tendencies.
“I hope they will not do much tweaking moving on. All of you can have a role to play in this by reaching out, talking to governments.”
The main site used by the organizers of IGF-USA: http://www.igf-usa.us/
— Colin Donohue, http://imaginingtheinternet.org
Brief session description:
Andrew McLaughlin, deputy chief technology officer for Internet policy at the White House, worked as a top policy expert for Google before joining the administration of President Barack Obama. McLaughlin talked about transparency and democracy in his keynote.
Details of the session:
U.S. government leaders believe that a wide-open Internet promotes growth, innovation and democracy, according to Andrew McLaughlin, the deputy chief technology officer of Internet policy for the White House. He talked about openness, transparency, innovation and democracy during his closing remarks at the IGF-USA conference July 21 at the Georgetown Law Center in Washington, D.C.
He said President Barack Obama and the leaders of the federal government want to keep the Internet transparent and decentralized because they believe openness spurs creativity and discussion online.
“We’ve been trying to advance those policies,” McLaughlin said. “Openness is a normative value, which is to say a good in and of itself, but also an important network value. It helps everyone connected to the network understand what’s going on in the network.”
McLaughlin drew a strong distinction between the regulatory model developed for telephone services and the policies being established for the Internet, warning that the Internet is not simply a successor to the telephone network. He said the public-switched telephone network was a closed system that was centralized, tightly controlled, based on proprietary technologies and vertically integrated.
In contrast, he said, the Internet is an open, decentralized network built in layers, where power rests at the edge of the network rather than at its core. McLaughlin said the government needs to find a way to take advantage of this “ever more cheaper, ever more powerful technology” to help promote transparency.
“Transparency can be (a) loosey-goosey term,” he said. “It can be related to openness in one sense. (It also) means the thing you put in is (the) same thing that comes out at the other end. I think transparency in the network needs to come with transparency in policy making.”
McLaughlin said the first memorandum President Obama signed on his first day of office centered on the transparency of government, and one clear example of governmental openness is the digitizing of the Federal Register.
“We took the Federal Register and started publishing it in XML format, and when we did this, within about 24 hours a group of people at Princeton threw up a simple online application that allows you to type in search terms, and you can get e-mail or an RSS feed that pops up in your inbox any time something is published in the Federal Register that you’re interested in,” McLaughlin said. “That’s great because it’s 70,000 pages a year. It’s inscrutable. Now it’s all freely available.”
So yes, the Internet inherently spurs innovation, creation, growth and global dialogues. But it can’t be a staid resource. McLaughlin said its continued positive evolution is integral to its future success.
“We all have an interest in keeping the Internet global,” McLaughlin said. “The Internet should be open, and the Internet should be decentralized. It is and should be treated as a layered stack.
“The Internet governance work we are doing needs to recognize that and treat each of those layers differently. The Internet needs to evolve. We need to be open to that kind (of) evolution and not let the Internet be hardened into its current structure. It’s breathtaking that in my lifetime this communications network has opened possibilities, enabled change and presented encouraging new horizons for the culture and for the practice and performance of democracy.”
— Colin Donohue, http://imaginingtheinternet.org
IGF-USA 2010 Workshop – Best Practices Forum: Considerations on Youth Online Safety in an Always-Switched-On World
Brief session description:
Danny Weitzner, associate administrator for the Office of Policy Analysis and Development in the U.S. Department of Commerce’s National Telecommunications and Information Administration, led this session. In June 2010 a new report, “Youth Safety on a Living Internet,” was provided to the U.S. Congress. Topics addressed in that report were covered in this session, including the risks young people face, the status of voluntary industry efforts, practices related to record retention and the development of approaches and technologies to shield children from inappropriate content or experiences via the Internet.
Details of the session:
Braden Cox, policy counsel for the NetChoice Coalition, shared an anecdote: While driving to the IGF-USA 2010 conference he was listening to a traffic report. The reporter complained about his long commute from Fredericksburg, Va., to Washington, D.C., each morning. A traffic engineer then came on the air and explained that long commutes are often caused by people who choose the wrong routes.
“You know what?” Cox said. “D.C. traffic is a lot like the Internet. There are a lot of different options, people can become overwhelmed and it can be slow. But it all comes down to education.”
Education is something the panel and respondents in the “Best Practices Forum: Considerations on Youth Online Safety in an Always-Switched-On World” IGF-USA 2010 session discussed extensively.
Panelists included Cox; Jennifer Hanley of the Family Online Safety Institute; Michael W. McKeehan, the executive director of Internet and technology policy at Verizon Wireless; and Stacie Rumenap from Stop Child Predators.
There to respond were Jane Coffin of the National Telecommunications and Information Administration; Morgan Little, a research associate with the Imagining the Internet Center; Bessie Pang, the executive director of the Society for the Policing of Cyberspace; and Adam Prom, an intern for the law firm of Akerman Senterfitt.
The session leader was Danny Weitzner, the associate administrator of the Office of Policy Analysis and Development at NTIA.
The 148-page report, “Youth Safety on a Living Internet” (available as a PDF download here: http://www.ntia.doc.gov/reports/2010/OSTWG_Final_Report_060410.pdf) was provided to the U.S. Congress in early June. Cox and McKeehan were part of the Online Safety and Technology Working Group that prepared the report. Cox said the findings indicate that while issues such as child predators and “sexting” are minor problems, the main worry online today in the United States is cyberbullying by peers.
Cox said the report includes four recommendations to fight this trend:
- Avoid scare tactics. Instead, promote social norms and good etiquette on the Internet.
- Promote digital citizenship, e-literacy and computer security in pre-kindergarten through 12th grade education.
- Focus online safety programs on risk prevention, including interventions with high-risk youth.
- Create a digital literacy core on Internet safety.
“The Internet is living,” Cox said. “And much like in everyday life, we operate without truly understanding the risks.”
Many lawmakers have tried to crack down on cyberbullying, Rumenap said. Forty-four states now have some sort of law about it. But she said most of the laws are ineffective.
“What’s the best way to prevent cyberbullying?” Rumenap asked.
“Don’t be a teenage girl,” McKeehan responded.
Rumenap said the majority of cyberbullying consists of teenage girls being teenage girls – gossiping and saying mean things to one another. “Criminalizing a 14-year-old for saying something mean probably isn’t the best result,” she said.
The panelists discussed attitudes online. The best method for cutting down on cyberbullying, the group affirmed, is education. “We have to inform minors about the permanence, about the implications and about what they are posting online,” Little said.
Hanley said that starts with teaching kids accountability. “We’re building a culture of responsibility,” she said. “We’re trying to move rights and responsibilities that we take for granted in the offline world to the online world. We have to make sure we’re teaching them new social norms.”
Participants in the discussion agreed that there is only so much public policy can do. “You can’t legislate cyberbullying away,” Prom said.
The opportunity for Internet education exists on every level, Little agreed. “It’s nothing that can come from the top down, it’s nothing that can come instantaneously, and I don’t think it can come from schools,” he said. “They’re stretched too thin as it is. It has to come from all around.”
The message is simple, McKeehan said. “The No. 1 recommendation: Teach your kid not to be a jerk online,” he said. “Don’t be a jerk in the real world, and don’t be one online either.”
While a lot of this education can start at home, it should be present everywhere in kids’ lives, especially when they are relating with their friends, Rumenap said.
“It’s a conversation,” she said. “It’s a conversation at home. It’s a conversation at school. It’s a conversation in after-school programs. And it needs to be a conversation with their peers. It needs to become a social norm.”
— Sam Calvert, www.imaginingtheinternet.org
Brief session description:
Cloud computing holds great promise for customers and entrepreneurs in the United States and around the world. It offers users – including governments and enterprises – the opportunity to pay only for the computing they use rather than maintaining all their computing needs and resources themselves. For innovators, the cloud offers a greatly reduced cost of entry into a market heretofore dominated by big players. However, there are policy challenges to be addressed. Fully realizing this potential requires unprecedented cooperation among industry, consumers and governments to ensure individual privacy and data security and to build confidence in the remote storage of critical information. Not all are optimistic about the future of cloud computing because of the centralization of personal information, concentrated threats to security and the questions it raises about national sovereignty. This panel, moderated by Jonathan Zuck, president of the Association for Competitive Technology, explored opportunities and challenges of “the cloud.”
Details of the session:
There is a need for discussion about the opportunities and challenges of cloud computing in the public policy arena because of the popularity of platforms like Flickr, Facebook, fantasy sports leagues, Google and Amazon, according to panelists in a cloud computing workshop at the IGF-USA conference July 21 in Washington, D.C. Panelists included:
- John Morris, general counsel, Center for Democracy and Technology
- Dan Castro, senior analyst, Information Technology and Innovation Foundation
- Jack Suess, vice president of information technology, University of Maryland
- Evan Burfield, chief executive officer, Synteractive
- Marc Berejka, policy advisor, office of the secretary, U.S. Department of Commerce
Cloud computing offers users — including governments and enterprises — the opportunity to pay only for the computing they use rather than maintaining all their computing needs and resources themselves. This allows innovators to have increased access to the market at a reduced price, increasing competition in the market.
“There is a definite trend to using cloud computing,” Castro said. “There are much lower start-up costs because they don’t have to pay for the infrastructure. It gives you the ability to scale up or scale down.”
A common question is whether data posted on cloud computing platforms is secure. The risks to data security and integrity rise with transnational cloud computing. Cloud users often do not know which law enforcement rules apply to their data.
“They don’t even know for sure where the data is; they don’t know exactly what country the data is in,” Morris said. “The challenges of being a cloud computing service provider and how to respond to law enforcement requests are very significant.”
If a cloud customer lives in the United States, but her data is stored on a server in another country, does that make her data more or less secure? Will the laws of the other country alter her rights regarding the actions she takes online?
“I think we will move to seeing risks to free speech on the Internet due to cloud computing,” Morris said. “I think that cloud computing could lead to repression.”
The panel also discussed issues tied to cloud computing and intellectual property.
“As we do move into the cloud there is a question about if we want to protect copyright protection,” Castro said. “Can service providers do a better job? We have a lot of intellectual property in the United States that we care about.”
Panelists noted that cloud computing presents completely new challenges in regard to cybersecurity, copyright protection and free flow of information on the Internet. “In the 1990s, during the version 1.0 era, the government philosophy on the Internet was ‘hands off,’” Berejka said. “We are now truly globally interconnected; it’s not a theory any more. Our philosophy is still to do no harm, but that might not necessarily translate into doing nothing.”
Higher education is one industry that has found a way to take advantage of cloud computing platforms. The outsourcing of student e-mail is becoming common, broadband and advanced networking are available to many more participants and vendor and government support for federations like InCommon is increasing.
“A lot of universities are coming together and thinking about community cloud services,” Suess said. “Higher education likes to collaborate with one another but one of the things that is holding the cloud back is stability. And trust is another question.”
Small businesses are also able to take advantage of cloud computing platforms. Burfield’s company, Synteractive, has 45 employees. He said they have no servers, conduct all their e-mail business in the cloud, use Skype for meetings and use Facebook for marketing. His company created recovery.gov for President Obama’s administration, building the entire platform in the cloud on Amazon-hosted servers in about a week.
“For a small business like ours it’s a great competitive advantage that never existed before,” Burfield said.
Cloud computing is allowing small businesses to be competitive, but there are still limits to what cloud computing services can do.
“For the cloud computing revolution to really work for consumers we need for the industry to move to a world of robust data portability,” Morris said. “The real promise of cloud computing for consumers is innovation and competition. I hope that there is data portability so when a new service comes online I can take my data and try the new service.”
— Rebecca Smith, www.imaginingtheinternet.org
Brief session description:
The Internet continues to expand at an exponential rate. As more users and applications move online with the growth of broadband and mobile, concerns about online crimes and malicious threats to the Internet and its users also grow. This workshop was established to examine the range and scope of online crimes and malicious use of the Domain Name System; for instance, a scam artist may host websites with false information, or a phisher may register a domain intended to resemble a famous brand. Consumers and businesses can be victims of abuse, and legitimate service providers are seeing crime and fraud in their networks. The use of DNS security (DNSSEC) is part of a mitigation strategy.
Details of the session:
Every time an individual pulls up a webpage or website, the Domain Name System is used.
Moderators and industry leaders met at an IGF-USA 2010 workshop titled “E-Crimes and Malicious Use in the DNS: Implications and Observations.”
Panelists participating in the discussion noted that malicious use of and criminal behavior in the DNS are not acceptable, but they did not come to any clear conclusions regarding new ways to better control these problems.
The moderator of the event was Jim Galvin, director of strategic relationships and technical standards for Afilias. Panelists included Garth Bruen, founder of KnujOn; Doug Isenberg, attorney at law with GigaLaw Firm; Shaundra Watson, counsel for international consumer protection at the Federal Trade Commission; John Berryhill, intellectual property lawyer; Bobbie Flaim, special agent with the FBI; Margie Milam, senior policy advisor for ICANN; and Matt Serlin, senior director of domain management at MarkMonitor.
The panelists agreed the abuse of the DNS is not a regional issue nor is it confined to a particular sector of the Internet. The crimes occur across multiple jurisdictions and affect a variety of individuals.
Some shared anecdotes about incidents in which collaboration with other entities helped resolve major DNS violations.
— Anna Johnson, http://www.imaginingtheinternet.org