Details of the Session
Marilyn Cade, catalyst of IGF-USA, provided closing remarks to participants at this year’s conference in Washington, D.C., that mirrored, in some ways, her remarks from the previous year, as she described her view of the state of the Internet.
“We were at the very beginnings of the earthquake way out in the middle of the ocean,” Cade said, referring to one of her discussions from last year. “We were just beginning to detect some seismic activity that eventually, if not dealt with, could lead to a tsunami.”
“It’s possible we’re on the threshold of some bad outcomes and we need to deal with those now,” Cade said.
In light of potential threats to the use and access of the Internet as we know it, Cade encouraged people who were not previously a part of IGF to stay in touch and remain involved in discussions surrounding Internet governance.
Cade, along with Chengetai Masango, representative of the United Nations Secretariat for the IGF, urged conference participants to attend the 2011 global IGF conference this fall.
The conference, which will be held in Nairobi, Kenya from Sept. 27-30, will be the sixth global IGF meeting. Masango said the main theme of the meeting will be “Internet as a catalyst for change: access, development, freedoms and innovation,” and will include more than 90 workshops, best practices, open forums, dynamic coalitions and an IGF Village—a space where organizations can display their Internet governance activities.
Masango stressed that not only are the event meetings themselves important to attendees, but that there is “value at the edges”—a benefit from meeting and dialoging with others who have concerns about Internet governance.
Various remote participation options will be available for those interested in being a part of the 2011 global conference but unable to travel to Nairobi. Among the options are WebEx, real-time transcription, a webcast, email, Twitter and HUBS—gatherings of interested people or national IGFs that can connect through WebEx to take part in the meeting.
More information on the 2011 global conference can be found at http://www.intgovforum.org.
– Natalie Allison
Earlier in the day at IGF-USA, participants divided into three groups to discuss potential-future scenarios for the Internet in 2020. At this session, moderators briefed the plenary crowd on the discussions and they and other IGF-USA participants in the audience weighed in with their observations.
Details of the session:
Building upon an experiment that succeeded at the previous year’s meeting, the Internet Governance Forum-USA presented a set of hypothetical situations, ranging from idyllic to dystopic depending on the preferences of those in attendance. Splitting into three groups, panelists and members of the audience discussed the pros and cons of the end results of an imagined timeline, then moved on to figure out how best either to ensure or prevent that timeline.
As part of the concluding ceremony of the IGF-USA, the lead moderator of each group presented that group’s scenario to the assembled audience and pointed out what the Internet community, along with governmental and business leaders, can do in response to the potential future.
The first, Regionalization of the Internet, revolved around a prospective world in which individuals are either in or out on the Web, blocked off from those deemed to be outside of their own government’s sphere of influence. (You can find details from the earlier session that fed into this session here: http://www.elon.edu/e-web/predictions/igf_usa/igf_usa_2011_scenario_Internet_islands.xhtml.)
Andrew Mack, of AMGlobal Consulting and the session’s lead moderator, described it as “interesting, but a bit depressing. We took the Angel of Death view on this.”
The idea of the Internet as an open highway, in this world, is replaced by one replete with tolls, as cross-regional access is limited, or in the worst cases, cut off entirely. Because of national security concerns, economic weakness, pressure from climate change and the massive new adoption rates of the “next billion” Internet users found in emerging markets, the Internet becomes a series of castles.
Some in the session actually thought the scenario fit the present more than an illusory future, and that the more dire descriptions could become the status quo within five years. To prevent it, governments were urged to be flexible and to practice what they preach, industry was urged to increase its focus on the next billion users, who as yet have no champion to advance their causes, and IGF was urged to resist the advance of the ITU, the United Nations’ telecommunications agency.
The second session, led by Pablo Molina of the Georgetown Law Center, presented a more positive picture. “Youth Rising and Reigning” (you can find details from the earlier session that fed into this session here: http://www.elon.edu/e-web/predictions/igf_usa/igf_usa_2011_scenario_youth_rise.xhtml) projected a world in which the youth-led revolutions in the Middle East spread to Europe and other disaffected masses take up the call to use new Internet-based technologies to assert their independence amid continued economic and civil strife. And though many agreed there is a strong plausibility to “Youth Rising …,” a key distinction that strikes at its heart was made.
“The defining factor is digital literacy and mastery, not age,” Molina told the audience, bringing to earth the notion that everyone younger than 30 is an Internet messiah, and bringing to light the fact that with the right competencies and skill, even the most elderly can have an influence on the Web. And despite the positive outlook of the scenario, an important distinction was made: Bad actors will always find ways to capitalize on new advances, and inadvertently, some innocents will be inconvenienced or, at worst, suffer as a result of those ill intentions.
To encourage, if not the revolutionary subtext of the hypothetical situation, then at least the political and societal awareness of the youth, all means of promoting participation in political discourse were advocated, whether through industry continuing its innovative advances, governments empowering instead of reining in their citizens, or IGF supporting the participation of more and more stakeholders to ensure all voices are accounted for. And, of course, education in the form of digital literacy is a must if the youth are to have the tools to, at most, incite a just revolution and, at the least, fight effectively for their own causes as the Internet further integrates itself into society.
The talkback that was perhaps the most pessimistic and grimly reminiscent of the bleakest science fiction was “Government Prevails,” led by Steven DelBianco of NetChoice. (You can find details from the earlier session that fed into this session here: http://www.elon.edu/e-web/predictions/igf_usa/igf_usa_2011_scenario_government_prev.xhtml.) It depicts not victorious and noble governments deservedly beloved by their populaces, but ones that, through a series of calamities, find themselves with both the responsibility and the power to maintain surveillance over their entire citizenry.
Natural disasters of unimaginable magnitude and hacking sprees running rampant across the globe, in this scenario, coupled with rapid advancements in mobile and surveillance technologies, give the world’s governments both the mandate (since it’s presumed that they win the public trust as the only entities capable of responding to such horrendous occurrences) and the means to fulfill a vision reminiscent of, albeit not quite as menacing as, that of George Orwell’s “1984.”
“I woke up this morning feeling fine, and now I’m afraid,” one member of the session said after hearing about the timeline.
Each element of the prevailing government could, taken separately, be seen as a positive. Many responded warmly to the possibility of a more advanced version of London’s CCTV scanning entire cities in the hope of preventing crime, or smartphones that not only are mandated to keep tabs on your location at all times but could also be used to turn in violators of strict anti-pollution legislation. But at the end of the day, it is still a world in which the government holds sole guardianship of its people, with a seemingly omniscient awareness of their every move.
To keep it from happening, the workshop decided, industries should obey the laws to avoid losing public trust and should work with government to move past the current philosophy of “government vs. private business interests.” Governments, obviously, should not grab the chance at such power and should instead opt for a more open and decentralized Internet.
As for IGF? It should stick to its current duties and engage with all stakeholders, though such a future, while seemingly horrendous to Western minds, DelBianco mused, could be equally appealing to those in countries such as Iran and China. This, in the end, illustrated one of the most evocative elements of the hypothetical exercise: Just as one man’s trash can be another man’s treasure, one man’s dystopia can be another man’s utopia.
– Morgan Little
The Internet and the Web are continuing to expand at exponential rates. When the board of the Internet Corporation for Assigned Names and Numbers opened up a whole new world of names for Internet addresses with its historic vote in June 2011, new gTLDs and their implications for users became extremely important. This session explored the experiences Internet users might expect as the Domain Name System (DNS) prepares to undergo a massive expansion, adding hundreds or even a thousand new gTLDs to “allow for a greater degree of innovation and choice.”
Details of the session:
Every time an individual pulls up a webpage, the Domain Name System is used. Moderators and industry leaders who met at an IGF-USA 2011 workshop say changes announced by ICANN this summer will bring new challenges and opportunities. Generic top-level domains, also known as gTLDs, were previously quite limited; they included .com, .info, .net and .org. On June 20, 2011, the board of the Internet Corporation for Assigned Names and Numbers (ICANN) voted to allow companies and organizations to choose any fitting suffix for their domain names. The new gTLDs are expected to become operational in 2013. Among the likely names are .sport, .bank and .app.
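The change is easiest to see in how a domain name is read. A name is parsed right to left, and its rightmost label is the top-level domain; ICANN’s decision means that label is no longer confined to a short fixed list. A minimal sketch (illustrative only, with made-up hostnames):

```python
# Illustrative sketch: extracting the top-level domain (TLD) from a
# hostname. Under ICANN's 2011 decision, the rightmost label can be
# almost any approved string (.sport, .bank, .app), not just a small
# fixed set like .com or .org. Example hostnames are hypothetical.

def top_level_domain(hostname: str) -> str:
    """Return the rightmost DNS label of a hostname."""
    labels = hostname.rstrip(".").lower().split(".")
    return labels[-1]

for name in ["www.example.com", "portal.example.bank", "shop.pizza"]:
    print(name, "->", top_level_domain(name))  # e.g. shop.pizza -> pizza
```

The parsing itself does not change; what changes is the universe of strings that can legitimately appear in that final position, which is the source of both the innovation and the user-confusion concerns the panelists discussed.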
The moderator of the event was Frederick Felman, chief marketing officer for MarkMonitor, a major domain management company based in the United States. Panelists included:
- Suzanne Radell, senior policy adviser in the office of international affairs at the U.S. National Telecommunications and Information Administration
- Amber Sterling, senior intellectual property specialist for the Association of American Medical Colleges
- Pat Kane, senior vice president for naming services for Verisign
- Jon Nevett, co-founder and executive vice president of Donuts Inc. and president of Domain Dimensions, LLC, a consultancy on domain name issues
- Brian Winterfeldt, partner at the Washington, D.C., law firm Steptoe & Johnson, where he is a member of the intellectual property group
- Ron Andruff, president and CEO of DotSport, managing the new top-level domain .sport – http://www.dotsportllc.com/about
The panelists speculated that as few as 500 and as many as 2,000 domain names could be added in the near future as ICANN opens its application pool in January 2012. These new names can range from generic names like .pizza and brand names like .apple to geographic names like .London.
“Sports is one of those unique things,” said Andruff, whose company is managing the new .sport domain. “Like music, [it] transcends borders, transcends languages, transcends cultures. It is relevant.”
It is important that we allow multiscript applications so we can reach all people of all languages, he said.
ICANN’s decision to open up the applicant pool is still relatively new to the general public, which could lead to confusion, said Felman.
The general population is beginning to join the conversation, Kane explained. But Radell cautioned that the government is very concerned about the potential for fraud and general user confusion. When something goes wrong, people are going to turn to their government and ask why it was allowed to happen, she said.
Members of the Governmental Advisory Committee (GAC) worked very closely with ICANN to make sure safeguards were put into place to protect the users, Radell added.
One audience member asked how something like .bank would affect his ability to access his bank’s website. He questioned how the URL would be structured, and how Google Chrome users, who don’t use a URL at all, only a search bar, would access the sites. The panelists agreed that expectations for end users are still being developed.
Non-profits are another group that could have trouble with the new domain names, said Sterling. In the past 15 years, non-profits have seen more donations arrive through the Internet, but they have also seen the Internet abused in the process.
Brand owners are concerned about the fraud that could occur as domain names multiply, and about what happens if multiple groups apply for the same domain name, said Winterfeldt. ICANN offers mediation, and brand owners will be notified if their domain is being sought by another company.
Another concern is whether the increase in domain names could lead to another .com bubble that fizzles out. “In essence, whether they survived was not the point,” said Hedlund. “It’s about adding competition and how the market responds.”
– Anna Johnson
Larry Strickling, administrator of the National Telecommunications and Information Administration and assistant secretary for communications and information at the U.S. Department of Commerce, gave a mini-keynote talk at IGF-USA 2011. NTIA is the executive branch agency principally responsible for advising the U.S. president on communications and information policies. Prior to his work for the Obama administration, Strickling worked as a policy coordinator for Obama for America, a regulatory officer at Broadwing Communications, a department head at the FCC, a vice president for Ameritech and a litigation partner at the Chicago law firm Kirkland & Ellis.
Details of the Session:
To begin the final plenary session of the day, Larry Strickling, administrator of the National Telecommunications and Information Administration (NTIA), took to the podium to discuss recent activity in the world of Internet governance, particularly the recent Internet Corporation for Assigned Names and Numbers conference in Singapore and the Organisation for Economic Co-operation and Development meeting in Paris.
“We are at a very critical time in the history of the Internet,” he said, mentioning disputes among international organizations, including some governments that have recently called for increased regulation of Internet activity.
Strickling said he attributes the success of the current Internet, and the way it is, or isn’t, governed, to the multi-stakeholder approach, which can only be sustained and advanced when there is participation.
Last December, Strickling said, he helped complete a review of ICANN and submitted 27 recommendations to its board, all of which have been adopted.
“Now the focus turns to ICANN’s management and staff,” he said.
He also applauded ICANN’s acceptance of proposals made by the Governmental Advisory Committee regarding generic top-level domain names.
“The fact that not all the proposals were adopted does not represent a failure of the process or a setback in progress but reflects the reality of the multi-stakeholder model,” he said.
At the OECD’s meeting in June, representatives from government, the private sector, civil society and the technical community met to discuss and develop the “Internet economy.”
“Participants at the meeting agreed to a communiqué on policy making principles and will create the conditions for an open, interoperable, secure and continually innovating Internet,” he said.
Strickling added that the intent was not to harmonize global law, but was to provide a global framework.
He then moved on to where the world could go next after the advancements of the past few months.
“More importantly, what’s the call to action for all of you?” he said, later concluding that the audience’s job was to advocate for a multi-stakeholder approach, not a treaty-based approach, to developing policy.
Strickling reminded participants about the approaching July 29 deadline for comments on NTIA’s IANA Functions Contract, the first time that NTIA has sought public input.
He concluded that the U.S. government is committed to multi-stakeholder solutions, reiterating the need for international cooperation, a focus on the process rather than the outcome, and adherence to developments already made, while taking questions from Cade and Michael Nelson of Georgetown University.
“If all that happens with the OECD principles and people file them away in a filing cabinet, then we’ve failed,” Strickling said. “These are only useful if they become a tool that we can now use as an advocacy basis for the rest of the world.”
In 2009, Strickling was confirmed by the Senate as assistant secretary for communications and information at the U.S. Department of Commerce.
During her introduction, Marilyn Cade said that Strickling’s reach went far and wide.
“The scope of his responsibility extends to impact on global decisions and global actions,” she said.
As an administrator with NTIA, Strickling is responsible for advising President Barack Obama on matters related to communications and information. He has extensive experience in technology policy and telecommunications both for the government and in the private sector.
– Rachel Southmayd
This session delved into recently announced policy statements with future implications, including those made by the Organisation for Economic Co-operation and Development, the U.S. International Strategy on Cyberspace and the G8. Are principles a feasible approach to underpinning Internet governance? If so, which ones? Should principles be applied by codification in law, MOU or treaty? The workshop consisted of a mini-analysis of currently proposed sets of principles. Because the Internet and online services are global, the workshop took a global perspective.
Details of the session:
Whether the many recently announced sets of Internet policy principles can be converted into actionable concepts was the central question for panelists in a workshop on Internet principles at the IGF-USA conference July 18 in Washington, D.C.
The co-moderators for the session were Fiona Alexander of the National Telecommunications and Information Administration (NTIA) and Shane Tews of Verisign. They hosted a session in which the following people first presented briefings on recently announced sets of principles.
Heather Shaw, vice president for ICT policy for the United States Council for International Business (USCIB), shared details of the OECD Communique on Principles for Internet Policy-Making: http://www.oecd.org/dataoecd/40/21/48289796.pdf.
Chris Hemmerlein, a telecommunications policy analyst for NTIA, spoke about the sections of the May 2011 G8 Declaration that focus on the Internet: http://www.g20-g8.com/g8-g20/g8/english/live/news/renewed-commitment-for-freedom-and-democracy.1314.html.
Sheila Flynn, of the cyber policy office of the U.S. State Department, briefed participants on the U.S. International Strategy on Cyberspace: http://www.whitehouse.gov/sites/default/files/rss_viewer/internationalstrategy_cyberspace.pdf.
Leslie Martinkovics, director of international public policy and regulatory affairs for Verizon, introduced the concepts of the Brazilian Principles for the Internet: http://einclusion.hu/2010-04-17/internet-principles-in-brazil/.
Sarah Labowitz, U.S. State Department, shared details of the Council of Europe’s Internet Governance Principles: http://www.coe.int/t/dghl/standardsetting/media-dataprotection/conf-internet-freedom/Internet%20Governance%20Principles.pdf.
The introduction of the principles was followed by a roundtable discussion moderated by Iren Borissova of Verisign. Participants were:
- Jackie Ruff, vice president for international public policy and regulatory affairs for Verizon Communications
- Milton Mueller, Syracuse University (participating over the Internet from a remote location)
- Jeff Brueggeman, vice president for public policy at AT&T
- Cynthia Wong, director of the Project on Global Internet Freedom at the Center for Democracy & Technology
- Liesyl Franz, vice president for security and global public policy for TechAmerica
- Mike Nelson, research associate for CSC Leading Edge Forum and visiting professor at Georgetown University
- Robert Guerra, director of the Internet Freedom program at Freedom House
- Susan Morgan, executive director of the Global Network Initiative
For all of the Internet-focused principles laid out by the OECD, the G8, the U.S. State Department and the Brazilian government, and for all their lists of tenets and guidelines, the debate at the IGF-USA 2011 workshop “A Plethora of Policy Principles” boiled down to one question: Can the principles be successfully converted into actionable concepts?
Governmental parties, whether they are sanctioned by presidential administrations or are the result of a multistakeholder process, are seeking to define the boundaries within which they wish to act when the next contentious issue hits the web. The problem is that these lists, which by themselves could perhaps work effectively within a single cultural, regional or governmental context, must stretch across all boundaries, much as the Internet itself does.
The policy principles included in the discussion, which in no way represent the entirety of idealized lists, were as follows:
-The OECD Communique on Principles for Internet Policy-Making, the most recent set, agreed upon by 34 member states, seeks to promote the free flow of information, the open nature of the Internet, and investment and the cross-border delivery of services, and to encourage multistakeholder cooperation, along with a litany of other aims ranging from security concerns to liability issues.
-The G8 Renewed Commitment to Freedom and Democracy, which isn’t solely focused on Internet rights issues, but nonetheless deals heavily with digital issues. The list segments Internet users into three groups: citizens, who seek to use the Internet as a resource and as a means to exercise human rights; businesses, which use it to increase efficiency and reach consumers; and governments seeking to improve their services and better reach their citizens. The G8 list also considers the Internet as the “public forum” of our time, with all of the associated assembly rights applied.
-President Barack Obama’s U.S. International Strategy for Cyberspace focused on the concepts of prosperity, transparency and openness. It represents an effort on the part of the U.S. government to approach Internet issues with a singular vision and seeks to establish an international framework to deal with these issues in the future. Interestingly, it was also the only list of principles discussed during the session that asserts a sort of “digital right to self-defense” in the instance of an attack on the United States’ own digital resources.
-The Brazilian Internet Steering Committee’s Principles for the Governance and Use of the Internet in Brazil differed from the other lists in that it was created after a series of discussions among interested governmental, NGO, private and scientific parties. The committee’s principles also stood for greater universality of the Internet, particularly a breakdown of linguistic barriers and a strict adherence to maintaining diversity in the digital domain. For those questioning why Brazil stands out, given the sheer number of countries with vested interests in Internet issues, Leslie Martinkovics, the director of international public policy and regulatory affairs for Verizon, said, “Brazil is seen as an opinion leader in the Americas. … They would like to take the high ground and lead the discussions going forward.”
-The Council of Europe’s Internet Governance Principles is the product of 47 member states with an expressed focus of “affirming the applicability of existing human rights on the Internet,” according to Sarah Labowitz of the U.S. State Department. In addition to those concerns, the principles call for a clear series of planning, notification and coping mechanisms to be in place in the event of a cyber disaster.
Once the particulars and intricacies of the various plans had been laid out, the critiques began to fly in. Mike Nelson, research associate for CSC Leading Edge Forum and visiting professor at Georgetown University, played the self-admitted role of the skeptic.
“The first thing you do is hold a meeting, and we’ve been doing that for five years,” Nelson said, describing how meetings lead to research, research leads to a lengthy span of time, during which the public becomes discontented, after which a list of principles emerges to placate the masses.
Nelson did not want the topic of discussion to be “do you or do you not stand for freedom,” but instead sought a fundamental debate on so-called “flashpoints”: actual, specific points of policy whose debate could result in legitimate action, as opposed to simply more principles.
Rebecca MacKinnon soon followed Nelson in critiquing the concept to which the entire panel was devoted, noting a tendency for the principles and conclusions reached by disenfranchised groups, including those outside the post-industrial West and the increasingly powerful emerging economies, to be at best given lip service and at worst outright ignored, both by interested parties and by IGF itself.
“What’s changed between 2004 and now?” MacKinnon asked. “How do people interpret these principles that have been, less or more, set in some degree of stone for quite some time?”
For the Chinese or Iranian dissident, she posited, rogue groups such as Anonymous and WikiLeaks do more for the cause than institutional bodies like IGF, simply because they rely entirely upon action instead of dialogue, action that is particularly focused on powerful entities.
For all of the critiques piled on the notion of principles and the efficacy of IGF, there was an equal measure of support.
“The role of the IGF is exactly what it was set out to do. There has been discussion, and it has encouraged awareness,” said Heather Shaw, vice president for ICT policy for the United States Council for International Business.
She added that the State Department report published by the Obama administration contains many of the same concepts that were actively discussed at the previous year’s IGF meetings.
“The fact this discussion is happening everywhere points to the success of the Internet Governance Forum,” said Fiona Alexander of the National Telecommunications and Information Administration. “IGF is spurring these kinds of conversations.”
But the unanswered question lingering at the end of the session was whether those conversations, those discussions and that awareness are enough in this day and age, with the Internet’s rapid advancement now being met by an equally rapid growth in governmental interest in its inner workings.
– Morgan Little
Internet Governance Forum-USA 2011: New Challenges to Critical Internet Resources. Blocking and Tackling: New Risks and Solutions
The security, stability and resiliency of the Internet are recognized as vital to the continued successful growth of the Internet as a platform for worldwide communication, commerce and innovation. This panel focused on Domain Name System (DNS) blocking and filtering, the implementation of Internet Protocol version 6 (IPv6), and critical Internet resources in general. The panel addressed some of the implications of blocking and filtering; the implementation of DNSSEC and IPv6 on a national basis; and future challenges to the Internet in a mobile age.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to new challenges, according to panelists in a session on critical Internet resources at the IGF-USA conference July 18 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. A major challenge now and in years to come is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
This workshop was conducted in two parts, both moderated by Sally Wentworth, senior manager of public policy for the Internet Society. Panelists included:
- John Curran, president and CEO of the American Registry for Internet Numbers (ARIN)
- Steve Crocker, CEO and co-founder of Shinkuro, and vice chair of the board for ICANN, the Internet Corporation for Assigned Names and Numbers
- George Ou, expert analyst and blogger for High Tech Forum http://www.hightechforum.org/
- Rex Bullinger, senior director for broadband technology for the National Cable and Telecommunications Association (NCTA)
- Paul Brigner, chief technology policy officer, Motion Picture Association of America (MPAA)
- Bobby Flaim, special agent in the Technical Liaison Unit of the Operational Technology Division of the FBI
- David Sohn, senior policy counsel for the Center for Democracy and Technology (CDT)
- Don Blumenthal, senior policy adviser for the Public Interest Registry (PIR)
- Jim Galvin, director of strategic relationships and technical standards for Afilias
Wentworth began the session by referring to it by an alternate title, “Blocking and Tackling.” The alternate title proved appropriate, as the two sets of panelists became more and more passionate in their assertions during the discussions on two topics that affect the future health of the Internet: the implementations of IP version 6, or IPv6, and especially Domain Name System (DNS) blocking and filtering.
The first panel consisted of John Curran, Steve Crocker, Rex Bullinger, Jim Galvin and Bobby Flaim. It centered on a discussion of IPv6 and the difficulty in implementing a system that is viewed as the “broccoli of the Internet” – something that is technologically necessary, but for consumers is less of a priority because a switch is not incentivized.
The technological necessity is an inevitability. IPv4 has 4.3 billion independent IP addresses. The central pool of addresses ran dry Feb. 3, 2011, when the last five blocks were handed out to the five regional address registries. Depending on the rate of growth, Curran explained, those addresses may not last very long; in fact, the Asia-Pacific registry has already handed out all its addresses. The 7 billion people in the world can’t fit into 4.3 billion addresses, especially when many have more than one address to their names.
“We have to move, the hard part is getting them to move,” Curran said, referring to consumers.
The biggest problem is that IPv6 and IPv4 are two languages that don’t talk to each other.
“You have to enable IPv6 in order to speak to IPv6 Internet users. It’s literally possible for us to break the global Internet if we don’t transition to IPv6,” Curran said.
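The incompatibility Curran describes comes down to the raw numbers and wire formats. A short sketch of both, using only Python’s standard socket module (the addresses are documentation examples, not real hosts):

```python
import socket

# The exhaustion arithmetic: IPv4 addresses are 32 bits, IPv6 addresses
# are 128 bits. The two wire formats are different sizes, which is why
# the protocols cannot interoperate without dual stacks or translation.
ipv4_total = 2 ** 32   # about 4.3 billion addresses
ipv6_total = 2 ** 128  # about 3.4 x 10^38 addresses
print(f"IPv4 address space: {ipv4_total:,}")
print(f"IPv6 address space: {ipv6_total:.2e}")

# Packed (on-the-wire) representations: 4 bytes vs. 16 bytes.
packed4 = socket.inet_pton(socket.AF_INET, "192.0.2.1")
packed6 = socket.inet_pton(socket.AF_INET6, "2001:db8::1")
print(f"IPv4 packed length: {len(packed4)} bytes")
print(f"IPv6 packed length: {len(packed6)} bytes")
```

An IPv4-only host simply has no way to address, or be addressed by, the 16-byte format, which is the structural reason a gradual coexistence period is unavoidable.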
The dual-stack model was identified by the Internet Engineering Task Force (IETF) as the most effective and efficient way to begin deploying IPv6 alongside IPv4.
“What we have is a coexistence process rather than a transition,” Crocker explained. “We’re going to have these two existing in parallel. You’ve got pure IPv4 exchanges and conversations, pure IPv6 exchanges, then you’ll have somewhat more complex interchanges between IPv4 users and IPv6 systems, and those will require some sort of translation.”
“The ISPs,” Curran said, “are stuck in the trap as to whether there’s enough demand to make the switch.”
Currently, most devices, such as laptops, are IPv6 enabled and have been for some time, so they’re coexisting, rather than transitioning directly from IPv4, Curran said.
“Things are not going to be replaced overnight,” Galvin said. “The providers are ready, but the customers aren’t. The laptops are ready, but the interaction is not ready.”
There are other problematic implications of deploying the new IP version, particularly in tracking how IP addresses are allocated and to whom, and in logging network address translators (NATs), according to Flaim. An audience member raised another element – the possible advantage held by countries with less infrastructure than the United States. Is the United States being left behind because its Internet is too big? Crocker contended that it is a possible scenario that some developing regions could leapfrog IPv4 and go directly to IPv6.
DNS Blocking and Filtering
The second panel of the afternoon consisted of Crocker, George Ou, Paul Brigner, Galvin, David Sohn and Don Blumenthal and centered on a lively discussion about the merits of DNS blocking and filtering as an answer to copyright infringement.
The panel was divided in its views – some saw DNS filtering and blocking as an effective answer to copyright infringement and media piracy, while others argued that a solution based on technical adjustment was the wrong way to approach what they deemed a human problem, not a technical one. While no consensus was reached, the discussion addressed serious concerns about the damaging effects of illegal downloading on the content production industry, as well as the broader constitutional, technical and political implications of DNS filtering and blocking.
Panelists referenced the PROTECT IP Act, legislation currently before the U.S. Senate. The bill is aimed at offshore websites that infringe copyright by hosting pirated media; one of the ways it works is to undercut those sites by going after their advertising and funding sources.
The trouble, Crocker explained, is that the blocks or filters are not only “trivial to get around,” but that users have ample motivation to do so. Sohn agreed that DNS filtering is not beneficial, especially in this case, because it is easy to evade. Using DNS filtering to cut off entire domains that may contain illegal content, rather than addressing the specific infringing content on individual websites, is too broad, Sohn suggested.
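The evasion point can be made concrete with a toy model. Everything in this sketch is hypothetical (the blocklist, the domain name and the IP address are invented, and real filtering happens inside resolvers, not application code); it shows only that filtering touches the lookup step, so a user who learns the address another way, or who switches to an unfiltered resolver, is unaffected:

```python
# Hypothetical blocklist; ".invalid" is a reserved, never-resolvable TLD.
blocked_names = {"pirate-example.invalid"}

def filtered_resolve(hostname, real_records):
    """Simulate a filtering resolver: refuse to answer for blocked names."""
    if hostname in blocked_names:
        return None  # NXDOMAIN-style refusal
    return real_records.get(hostname)

# The authoritative data still exists; 203.0.113.7 is a documentation-range IP.
records = {"pirate-example.invalid": "203.0.113.7"}

# The filter blocks the lookup...
assert filtered_resolve("pirate-example.invalid", records) is None

# ...but the site itself is untouched: anyone who learns the IP
# out-of-band, or who queries an unfiltered resolver, still reaches it.
assert records["pirate-example.invalid"] == "203.0.113.7"
```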
Galvin agreed: “While content providers have a legitimate need to protect their assets, there seems to be the automatic assumption that DNS filtering is the right way to do that. You’re concerned about content, so you want to do content filtering.”
Galvin cautioned the panel and the audience to be aware of consequential damages.
Sohn also raised concerns about prior restraint. “Under prior restraint law, you cannot restrict speech unless it’s actually proven illegal,” he said. “DNS filtering orders would often occur on a preliminary basis, as part of a restraining order, rather than after a full legal determination by a court.”
There was further discussion that, on a technical level, the use of these tactics would be problematic because it would break DNSSEC and destabilize the Internet – especially since DNSSEC was designed to detect exactly the kind of response tampering that filtering requires, Crocker maintained.
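The conflict Crocker describes can be modeled in miniature. DNSSEC attaches cryptographic signatures to answers so that any modification in transit is detectable, which means a filtering resolver that rewrites an answer is indistinguishable from an attacker. The sketch below is only an analogy: real DNSSEC uses public-key signatures over record sets, while this stand-in uses a shared-secret HMAC with an invented key.

```python
import hashlib
import hmac

ZONE_KEY = b"zone-signing-key"  # hypothetical key material for the analogy

def sign(name, address):
    """Sign a name/address answer (stand-in for a DNSSEC RRSIG)."""
    msg = f"{name}={address}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def validate(name, address, signature):
    """Check that an answer matches its signature, as a validator would."""
    return hmac.compare_digest(sign(name, address), signature)

name, addr = "example.org", "192.0.2.1"   # documentation-range address
sig = sign(name, addr)

assert validate(name, addr, sig)                # genuine answer validates
assert not validate(name, "198.51.100.9", sig)  # rewritten answer fails
```

A filtering resolver that substitutes its own answer produces exactly the second case, so validating clients treat the filtered response as an attack.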
On the other side of the issue, Brigner explained that in using DNS filtering, criminal sites would be removed from the “global phonebook,” preventing individuals from accessing them and propagating the consumption of illegal media.
“We’re not asking for a new row of cannons,” he said in reference to Crocker’s earlier metaphor about the Vasa, a Swedish warship that sank because its poor design was based on the king’s desire for more firepower in spite of architectural faults. “Let’s use sound engineering principles.”
An audience member’s suggestion of an industry seal program was met with varying levels of support and dissent. Ou noted that such a seal could easily be counterfeited, while others felt that a “human” solution rather than a technical one was the more appropriate answer to the problem.
In the end, the dialogue raised many new questions about the real world implications of IPv6 and DNS blocking technologies.
“It’s important that these various communities find a way to address the issues in a way that’s respectful of the technology, respectful of the global Internet and people’s need to carry out their business, and the freedom of expression,” Wentworth said in closing.
“There’s a lot of competing interests, but they don’t have to be mutually exclusive.”
– Bethany Swanson
Internet Governance Forum-USA, 2011 Best Practices Forum ICTs for Disaster Response: Transforming Emergency Management
Responders are driving innovative uses of ICTs to transform emergency planning, intermediation and management. The Internet and social networking are being harnessed by search and rescue teams to locate and bring vital support to victims. ICTs are reassuring loved ones, bringing help to the stranded, raising financial aid, managing communications for responders and supporting rebuilding. This workshop explored the role communications, Internet and Internet-based applications play in disaster response and recovery operations and steps that can be taken to ensure continuity of operations following a disaster. It also considered the connection between disaster preparedness and Internet governance.
Details of the session:
Information and communication technologies are connecting public safety officials, allowing the efficient coordination of response operations and keeping citizens informed in new ways every day.
The new Internet-based tools, mobile applications and social media that are transforming disaster relief efforts and empowering citizens were the focus of this workshop at the IGF-USA conference July 18 in Washington, D.C.
This session was moderated by Kelly O’Keefe, director of the Washington office of Access Partnership, a consultancy in international telecommunications trade, regulation and licensing. O’Keefe brings global expertise to the topic, serving as a rapporteur for an International Telecommunication Union study group on emergency communications.
The session’s panelists included:
- Joe Burton, counselor for technology and security policy, Communications and Information Policy, U.S. State Department
- Jim Bugel, assistant vice president for public safety and homeland security for AT&T
- Corbin Fields, Sparkrelief, a non-profit Internet-based organization empowering communities to provide disaster relief, http://sparkrelief.org/#
- Roland A. LaPlante, senior vice president and chief marketing officer, Afilias
- Keith Robertory, manager, disaster services technology, American Red Cross
- Tim Woods, technical leader, Cisco Systems
Kelly O’Keefe started the discussion by referring to recent global disasters, from the earthquake in Haiti to the earthquake and resulting tsunami in Japan. These events have demonstrated the importance not only of disaster response, but of relief communication, especially for developing countries, she said.
The biggest trend in disaster communication has been the migration toward Internet-based communications, said Tim Woods of Cisco. The influence and increased use of technology has become more widespread, and increasingly people turn to the Internet, particularly social media, to receive updates on events. Social media, in particular, allow users to send updates to followers immediately in real time.
But despite the widespread prevalence of technology and response services across the globe, the United States does not have the authority to simply step in and start setting up an information system in any country experiencing a disaster. Responding to a disaster abroad raises challenges that do not arise in the United States.
When the Red Cross responded to the earthquake in Haiti, Keith Robertory said, “We didn’t just say, ‘Hey, let’s get our suntan lotion and see what’s happening.’”
In addition to disseminating information to the public, the Red Cross had a responsibility to talk to the Haitian government and coordinate their needs among Red Cross organizations from other nations.
During such coordination efforts, U.S. organizations cannot make the same kinds of assumptions that they would usually make at home. There are differences in technology cultures that must be taken into account when setting up a communications network during a disaster, said Robertory, of the American Red Cross.
Communicating during an emergency
There should be an interest in swiftly restoring communications infrastructure to save lives in a country experiencing a natural disaster, Joe Burton said. There is a global trend toward more catastrophic disasters, and in recent history, with the rise of the Internet and social networks, the Internet and text messaging have proven to be efficient uses of communications networks.
In terms of the big picture, more people own a basic phone, even a low-end model, than own PCs and TVs combined, said Vance Hedderel, of Afilias. That bigger picture helps disaster-response communicators understand how to reach people: at this point in time, the phone is more effective than the Internet, and SMS reaches larger numbers of people.
Additionally, the goal of disaster communications should be to inform people experiencing the disaster first-hand.
“A major gap currently exists where those people aren’t getting the necessary information and the outside world seems to know much more,” Hedderel said. “Those issues become so paramount when there is little infrastructure in place.”
When sending out information over the Internet, Robertory said it is critical to hit all social media sites. Since the emphasis is on getting information to the largest number of people possible, the disaster response teams have to reach their audiences across many platforms.
Establishing a network
From the service provider’s perspective, there is an emphasis on critical infrastructure during and after a catastrophic event, Woods said. The networks used for information sharing should be reliable and resilient to disruption, and a capacity plan needs to be in place to handle an emergency. What often happens is that networks become oversaturated immediately after a disaster, as users attempt to assure others of their safety or provide updates on the state of those affected.
Robertory likened establishing network capacity to a gym membership: “You hope that not everyone comes in to use the treadmills on the same day at the same time,” he said.
Although handling the surge in demand that follows a disaster is important, a network is not sustainable if preparing for overcapacity makes it slow and expensive. The goal is to balance capitalism and altruism, life-saving and economy, making the most efficient use of resources possible.
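Robertory’s gym analogy amounts to an oversubscription calculation: networks are provisioned for typical peak concurrency, not for every subscriber at once, so a disaster-driven surge saturates them. The figures in this sketch are purely illustrative assumptions, not data from the session:

```python
# Illustrative oversubscription model (all numbers are assumptions).
subscribers = 100_000
typical_concurrency = 0.05   # 5% of users active at a normal peak
surge_concurrency = 0.60     # assumed share calling/texting after a disaster

# Provision for the normal peak plus 50% headroom -- the "gym" sizing.
provisioned = int(subscribers * typical_concurrency * 1.5)
surge_demand = int(subscribers * surge_concurrency)

# The post-disaster surge far exceeds what was economical to provision:
print(f"provisioned for {provisioned}, surge demand {surge_demand}")
print(surge_demand > provisioned)   # True
```

Provisioning for the surge itself would mean paying for capacity that sits idle almost all of the time, which is the economic tension the panel described.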
Despite the importance of developing effectively working technology systems, these will be largely useless if various agencies involved cannot work together. Part of preparation is building relationships between agencies and determining who will communicate with whom.
“If you can build those relationships ahead of time, you have a better chance of getting through when disaster strikes,” Burton said.
Another side of preparedness involves having technology that works even in smaller situations, Robertory said. Attempting to build a system for a big event from the start leaves too much room for error when such an event actually occurs; if the system works for everyday emergencies, there is time to test and refine it on smaller events first.
“It’s about being proactive, not reactive,” said Corbin Fields, of Sparkrelief.
– Carolyn VanBrocklin