Documentary coverage of IGF-USA by the Imagining the Internet Center

IGF-USA 2012 Workshop: Next Challenge – How to Handle Big Data in the Cloud

Brief session description:

Thursday, July 26, 2012 – The dramatic reduction in the cost of computing and storage made possible by cloud computing services, the spread of easy-to-use, open-source analytic tools, and the growing availability of massive data services from governments and the private sector (e.g. Google Maps) have enabled thousands of start-ups, hackers and others to create exciting new tools for business, entertainment, government and other sectors. Government policies can help or hinder development of new databases and Big Data apps. Issues covered in this session included: 1) open government data policy; 2) Intellectual Property Rights protection; 3) IT research; 4) technology test beds; 5) education; 6) law enforcement access; and 7) privacy regulations.

Details of the session:

The moderator for the session was Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum. Panelists included:

  • Jeff Brueggeman, vice president of public policy for AT&T
  • Paul Mitchell, senior director and general manager, Microsoft TV Division
  • Lillie Coney, associate director, Electronic Privacy Information Center
  • Jules Polonetsky, director and co-chair, Future of Privacy Forum
  • John Morris, director of Internet policy, NTIA/US Department of Commerce
  • Katherine Race Brin, attorney, Bureau of Consumer Protection, Federal Trade Commission

Mike Nelson, an Internet policy expert from Georgetown University, shared some reassuring remarks as he introduced a panel session that concentrated upon the complexities of managing what has become known as “Big Data.”

“These issues are not new,” he said. “We’ve been dealing with them for 20 or 30 years, but they are a lot more important now.”

Nelson explained in a workshop at IGF-USA Thursday at Georgetown Law Center that it’s not just about data that are big; it’s about data that are changing so quickly that they require innovative management tools.

He introduced the following questions:

Polonetsky and Brueggeman exchange stories at a workshop about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

  • How will privacy concerns impact the development of large databases (or will they have any significant impact)?
  • What are the liability issues of Big Data in the cloud?
  • How do we deal with the shortage of data experts?
  • How do we handle issues concerning control and access to data?

Jeff Brueggeman, a global public policy executive with AT&T and a longtime participant in Internet governance discussion in many fora, began the conversation by addressing a few of these issues.

First, he noted the importance of working with businesses as well as policymakers to come up with tools to manage the data. He also addressed the significance of maintaining security in cloud data.

“The more data that’s being collected and retained, the more that data could be used as a target,” Brueggeman said.

Brueggeman posed several issues for the other panelists to take up: best practices for dealing with large data sets, the kinds of controls users should expect, which uses of data are legitimate without such controls and the international dimensions of these questions.

Jules Polonetsky of the Future of Privacy Forum followed with a look at the long-term perspective, offering some insight about the impacts of cloud technology.

“I’ve always had a hard time getting my head around clouds,” Polonetsky said. “But the best we can do is make sure we’re a bit of a gatekeeper.”

He argued that a formalized procedure should be established for the release of private information to law enforcement officials and others seeking information. He also elaborated on the risks of such technology, telling a story about a friend of his, a rabbi, who watched a racy video unaware that Facebook would automatically share the link on his Facebook page, illustrating how easy it is to inadvertently share online activity with the greater digital community.

Polonetsky champions the benefits of data use, but he also urges people to consider the implications of such sharing and storing of data. He said he believes there should be a debate in which people weigh the risks and benefits.

Katherine Race Brin speaks about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

Katherine Race Brin continued the conversation, citing some of her experiences dealing with these issues in her job with the Federal Trade Commission.

She said the FTC has studied the implications of cloud computing for a number of years and has also considered how, or whether, the cloud differs from other forms of data transfer in regard to privacy.

She said her work at the FTC has led her to believe that the companies that are storing data in the cloud are often in the best position to assess the risks of that data sharing.

“We’ve always said, in relation to personal data, the businesses remain accountable for the personal data of their customers,” Brin said.

She said that while the FTC holds businesses responsible, it also provides a framework to ensure consumer privacy.

Brin explained the three key aspects of this framework:

  • Privacy by design – Companies should build in privacy protection at every stage, from product development through product implementation. This includes reasonable security for consumer data, limited collection and retention of such data and reasonable procedures to promote data accuracy.
  • Simplified consumer choice – Companies should give consumers the option to decide what information is shared about them and with whom. This should include a “do-not-track” mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities.
  • Transparency – Companies should disclose details about their collection and use of consumers’ information and provide consumers with access to the data collected about them.

Lillie Coney, associate director of the Electronic Privacy Information Center, offered her insights as an expert on big data. (The Economist has posted a video about Big Data based on EPIC-supplied information.)

“Governance is not easy,” Coney said. “But we do learn mechanisms for creating accountability, transparency and oversight.”

She noted that the difficulty lies in creating guidelines that have currency and legitimacy. In regard to cloud computing, Coney suggested that people are not only consumers; they themselves – or at least the sets of private information they share – are actually products.

“Our online activity alone generates revenue, and many consumers don’t understand that,” Coney said.

She said she strongly believes in the importance of the public’s engagement in the conversation. With all these privacy concerns, Coney said the consumer cannot afford to leave it up to businesses or government.

Lillie Coney speaks about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

Microsoft executive Paul Mitchell added some perspective on how to go about tackling the management of Big Data. “I think the big challenge here is figuring out what’s first when we’re talking about big data,” Mitchell said, noting the overwhelming amount of data being created and stored in databases. “What we have here is not a new problem. What we have here is a problem of scale.”

Mitchell said we can look at the separate desires of people, businesses and society, and consider a philosophy based on each group’s needs. He explained that a people-first philosophy would ensure that data that could be harmful isn’t allowed to cause harm. A business-first philosophy would be about maximizing the potential economic return from the use of data. A society-first philosophy would optimize the value for society as a whole based on what can be done with the data.

“From an operating perspective, the challenges we face involve how to govern against these three axes,” said Mitchell. “The policymakers’ choice is how to balance the three appropriately.”

Nelson then asked the panelists about the future of the cloud: whether there would be one cloud in an interconnected world or a world of separate clouds run by different companies.

Mitchell argued that there are circumstances that will require a private set of services.

Coney expressed concern over that model. “Consumers’ control over their data in a cloud-driven environment will require the ability to move their data from Cloud A to Cloud B. Making that a reality in this environment is going to be the challenge,” she said.

Polonetsky had a slightly different viewpoint. He considered the business-platforms perspective, questioning how easy it should be to move consumers’ data.

“Yes, it is your data, but did the platform add some value to it by organizing it in a certain way?” he asked, adding that platforms may make a legitimate contribution by organizing consumers’ data. For example, your Facebook friends belong to you, but Facebook created the platform through which you interact with them and share information, photographs and other things.

To conclude, Nelson took a few questions from the audience and asked each panelist for a recommendation regarding the future management of Big Data. Brueggeman suggested there should be a set of commonly accepted practices for managing Big Data. Polonetsky added there should be more navigable digital data. Brin supported the strong need for transparency. Coney proposed that cloud providers and Big Data companies must show their respect for a diverse group of stakeholders. Mitchell recommended that we should all work toward greater harmony between business, personal and societal values.

— Audrey Horwitz

IGF-USA 2012 Scenario Story: Two Possible Futures for Copyright – Anarchy or Totalitarianism

Brief session description:

Thursday, July 26, 2012 – The laws of copyright were introduced before the Internet, before file-sharing and before the advances in digital tools now used to create sampling, mash-ups and remixes. One example of the complex copyright conflicts faced today is “The Grey Album,” produced by DJ Danger Mouse. It gained notoriety as it challenged the intellectual property structure in place, mashing two legally protected albums in a violation of copyright law. Danger Mouse created the album strictly as a limited-edition promotional item (only 3,000 copies), but it immediately went viral and caught the ear of many people in the music industry and all over the US, making any legal cease-and-desist request technically meaningless. This example illuminates the incredibly complex and nuanced existence of copyright law in America today. This scenario exercise was aimed at exploring two divergent sides of America’s copyright future, one where regulations surrounding copyright law are lax to the point of anarchy, and the other where the regulations increase at an exponential rate, creating a totalitarian atmosphere.

Details of the session:

Moderators for the session were Ariel Leath and Kalyah Ford, graduate students at Georgetown University. Panelists included:

  • Thomas Sydnor II, senior fellow for intellectual property at the Association for Competitive Technology
  • Matthew Schruers, vice president for law and policy at the Computer & Communications Industry Association
  • Brandon Butler, director of public policy initiatives for the Association of Research Libraries

Thomas Sydnor II speaks in a workshop about the future of copyright at IGF-USA in Washington, D.C. on July 26, 2012.

This scenario exercise at IGF-USA 2012 featured a consideration of what might happen if one or the other of two extreme situations – totalitarianism or anarchy – evolved in the future. Students from Georgetown University proposed two possible futures for panelists to discuss.

Scenarios: In an anarchist 2020 scenario, panelists discussed what might happen if a high school student turned in work incorporating aspects of Ernest Hemingway’s “The Sun Also Rises.” Would a teacher be expected to treat it as an original work? In a totalitarian 2020 scenario, panelists discussed a situation in which the phrase “good morning” is owned by McDonald’s, and any online use of it would instantly set off an alarm automatically requiring that the violator of the copyright pay a $500 fine.

According to panelists at IGF-USA Thursday at Georgetown Law Center, these two copyright scenarios are highly unlikely, but they are still interesting to ponder. The panelists discussed the potential ramifications of both extremes.

“As far as totalitarianism, if (the United States) were to fall into totalitarianism, we’d have done it a long time ago,” said Thomas Sydnor II, a research fellow at the Association for Competitive Technology. “When I take my walk with my dogs, my dogs trespass on my neighbors’ lawns, and I go and I trespass on my neighbors’ lawns to clean up what they left on my neighbors’ lawns. And yet, I do this every day and there is not the slightest chance that I will ever be sued for it, much less arrested because we all realize that, to a certain extent, part of rights is exercising a little restraint in enforcement.”

Sydnor also stressed the importance of thinking about where the Internet and its users are going in the long run in terms of copyright law enforcement. “We don’t need to have perfect enforcement, but we do need better than we have now,” he said.

Thomas and Matthew share laughs during a workshop about the future of copyright at IGF-USA in Washington, D.C. on July 26, 2012.

“Whether we like it or not, it’s a much more complex copyright environment today,” said Pablo Molina, information systems technology professor at Georgetown University Law Center.

“I consider it as similar to considering the tax laws. Tax laws are so complicated, there are so many tax laws passed every term, that you need really expert tax lawyers and CPAs and other people just to figure out how to [handle] corporate or individual taxes when things get complicated, and I would argue that it is the same thing with copyright law.” He said we are likely to be moving toward more and more legislation and more and more enforcement.

Panelist Matthew Schruers of the Computer & Communications Industry Association argued that while law regulates the Internet, the impact of other vital factors must figure into decisions about the Internet as well, including markets, architecture and social norms.

Schruers predicted that even if copyright law goes in the direction of anarchy, human norms will most likely still prevent people from entirely disregarding the idea of copyright law.

He said that if he were asked to predict which direction Internet regulation of intellectual property is most likely to go in the future, he would expect it to become more anarchic than it is today.

“In a low-protectionist anarchy environment, you’re likely to see more noncommercial and derivative work that is based largely on noncommercial creation,” Schruers said. “Control needs to be effective in order to produce [a totalitarian environment].”

Given the same choice regarding which direction on the totalitarian-anarchist spectrum society is most likely to go in the future, Molina said he believes society is moving in the direction of totalitarianism. Even so, he said he believes a full tilt to this extreme is unlikely. “There are always ways for people to circumvent the system,” Molina explained. “Both scenarios are possible. Whether they are likely is a different story.”

In terms of other factors important to the copyright law discussion, Molina and Schruers both said economic growth is an extremely good measure to assess when seeking balance between the extremes.

“In terms of progress, economic progress is the best metric we have,” Schruers said.

— Mary Kate Brogan

IGF-USA 2012 Best Practice Forum: ICTs for Disaster Response – How the Internet is Transforming Emergency Management

Brief session description:

Thursday, July 26, 2012 – Recent man-made and natural disasters around the globe have highlighted the importance of ICTs for connecting public safety officials, coordinating response operations and keeping citizens informed. Additionally, new and emerging Internet-based tools, mobile applications and social media have transformed disaster-relief efforts, providing real-time data for first responders and empowering citizens to access and share life-saving information and locate loved ones. Enhanced situational awareness via multiple platforms offers almost instantaneous and ubiquitous information regarding implications for life and property and individuals impacted by natural or man-made risks and threats. Internet-based communication is increasingly relied upon to support disaster preparation, response and recovery. Workshop participants looked at what must be done to ensure resilient infrastructures and continuity of operations, including keeping citizens informed. Panelists were invited to share their perspectives and the lessons learned from recent disasters and to work to identify recommendations for collaboration among stakeholders in preparing for future disasters.

Details of the session:

The moderator was Joe Burton, counselor for technology and security policy in the Communications and Information Policy Office of the US State Department. Panelists were:

  • Garland T. McCoy, founder and president of the Technology Education Institute
  • Kristin Peterson, CEO and co-founder of Inveneo, an organization that provides ICTs to remote areas
  • Keith Robertory, disaster response emergency communications manager for the American Red Cross
  • Veronique Pluviose-Fenton, a congressional staffer who focuses on homeland security and disaster response
  • Tom Sullivan, chief of staff of the Federal Communications Commission

Véronique Pluviose-Fenton speaks at a workshop on ICTs for Disaster Response at IGF-USA in Washington, D.C. on July 26, 2012.

Last month, severe storms in the Northern Virginia/Washington, D.C., metro area not only knocked out Internet service, but also caused an outage of 911 Emergency Response telephone services that lasted four days.

The Best Practice Forum at IGF-USA Thursday at Georgetown Law Center featured a discussion between government and NGO representatives on how to address this type of scenario and best coordinate disaster response in the current technological era.

According to Garland McCoy, founder of the Technology Education Institute, the 911 outage highlights the flaws of the current IP-based telephone system, which evolved from the analog, hard-wired telephone system.

“Back in the twisted copper-wire days, the power could go out but your phone would stay on,” McCoy said. But the IP phone system now has a “hub and spoke” architecture with a single point of failure, known as a Big Data facility.

Véronique Pluviose-Fenton, a congressional staffer who focuses on homeland security and disaster response, spoke on the failures of the communication system following major catastrophes such as Hurricane Katrina and the terrorist attacks of Sept. 11, 2001.

Pluviose-Fenton emphasized the importance of interoperability—the ability of networked communications systems to communicate with each other.

“We all watched live what happens when they (first responders) couldn’t communicate,” she said, referencing the chaos of the 2001 attacks on the United States, when police officers and firefighters could not talk to one another or relay warnings.

Keith Robertory, disaster services technology manager for the American Red Cross, said it’s possible to build an entirely interoperable network, but there are quite a few political roadblocks standing in the way. “Can you imagine if the New York police chief and fire chief are trying to agree who owns a shared network and who controls it?” Robertory asked, illustrating the difficulty of interconnectivity.

Pluviose-Fenton agreed, saying, “I still fundamentally feel that even with the advances in technology, there still is a problem with will.”

This is not just a domestic issue, as disasters in foreign countries have also put communication technology to the test. US agencies and NGOs often join the global-assistance efforts when disaster strikes elsewhere.

Kristin Peterson, CEO of Inveneo (a non-profit focused on ICTs in the developing world), discussed her role in establishing a wireless network in Haiti following the 2010 earthquake that destroyed nearly all existing communication systems in the island nation. Every aid group providing relief had its own network, from the American Red Cross to the US military.

“Within 24 hours we knew we had to put up a WiFi network,” Peterson said.

The task took several days but was a necessary step in orchestrating the global response in aiding Haitian refugees, from providing food and water to distributing shoes sent by singer Jessica Simpson.

“If you can’t communicate, you can’t coordinate your response,” Peterson said.

Tom Sullivan speaks at a workshop on ICTs for Disaster Response at IGF-USA in Washington, D.C. on July 26, 2012.

Tom Sullivan, chief of staff of the US Federal Communications Commission, said that even Japan, a country with an extremely sophisticated communications system and other cutting-edge technology, had to depend on a backup power grid following the 2011 earthquake.

He said it is necessary for the United States to develop a strong contingency communications plan in order to be prepared for the inevitable arrival of yet another Katrina-esque catastrophe or any devastating emergency situation. Robertory elaborated on this need. He supervises American Red Cross efforts to establish emergency communications infrastructures when providing relief to victims of disasters.

He and Sullivan also emphasized the importance of citizen engagement in a field where first response is not and never will be 100 percent reliable.

“If 911 services were bad, wouldn’t you be more likely to learn first aid and CPR?” Robertory asked. He explained that citizens should form their own personal contingency plans should communication fail in the aftermath of a disaster.

All of the panelists agreed that advances in technology provide both new opportunities and new challenges for those responsible for disaster relief.

— Brennan McGovern

IGF-USA 2012 Workshop: Can an Open Internet Survive – Challenges and Issues

Brief session description:

Thursday, July 26, 2012 – This workshop focused on the challenges of keeping the Internet open while simultaneously maintaining a safe and secure environment for individuals, businesses and governments. Governments encounter a wide-ranging set of issues and concerns that can limit an open Internet, including the cost of connectivity, spam/malware, intellectual property rights, human rights and objectionable content. Businesses often make decisions for business purposes that may contribute to closing off the Internet. Leaders in governments’ legislative branches, including the US Congress and its counterparts around the world, and business leaders do not always recognize the implications of the actions they take that might negatively influence the Internet. In addition, citizens may voluntarily, but without full understanding, accept moves that contribute to closing off the Internet, quietly accepting actions and decisions that affect its openness in a negative way. The session worked to identify the key characteristics of an open Internet; the global and national challenges that threaten this openness; the initiatives pursued to advance the open Internet; and multistakeholder engagement to develop and promote an open Internet.

Details of the session:

The session was moderated by Robert Guerra, principal at Privaterra and senior advisor to Citizen Lab in the school of global affairs at the University of Toronto. Panelists were:

  • Ellen Blackler, vice president for global public policy, The Walt Disney Company
  • Thomas Gideon, technical director of the Open Technology Institute at the New America Foundation
  • Andrew McDiarmid, policy analyst at the Center for Democracy and Technology
  • Julian Sanchez, research fellow at the Cato Institute
  • Paul Diaz, director of policy for the Public Interest Registry
  • John Morris, director of Internet policy, office of policy analysis and development of the US National Telecommunications and Information Administration

Ellen Blackler participates as a panelist about challenges and issues facing an open Internet at IGF-USA in Washington, D.C. on July 26, 2012.

Between copyright infringement, intellectual property, piracy and protection of online privacy, the openness of the Internet is being threatened on all sides, according to six IGF-USA panelists, who gathered to define and assess the challenges to an open Internet Thursday at Georgetown Law Center.

“The free and open Internet oughtn’t be a free-for-all,” said Ellen Blackler, vice president for global public policy for The Walt Disney Company.

Balancing an open Internet against the need to ensure security and privacy and to minimize piracy has always loomed as one of the largest challenges to the future of the Internet. While members of this panel represented diverse Internet backgrounds, they all agreed that Internet policy must and will continue to evolve with the challenges posed by the struggle between these often-competing values.

What is Internet openness?

The definition of an open Internet differs even among seasoned IGF attendees.

John Morris of the National Telecommunications and Information Administration (NTIA) cited the principles of Internet openness recommended by the Organisation for Economic Co-operation and Development (OECD) last year, which highlight several key characteristics, including the opportunities for both collaboration and independent work.

An open Internet allows users to operate “independently of one another, so as not to have a centralized single body to control or impose regulations,” Morris said.

The Internet policymaking process additionally needs to be open for collaboration, Morris said.

“What is it that keeps barriers low, what steps can we take to address challenges?” asked Andrew McDiarmid, policy analyst for the Center for Democracy and Technology (CDT). “It’s about learning … to keep the process more open and open to more voices.”

Though the openness of the Internet is one of the Web’s key characteristics, challenges ensue when openness trumps privacy.

“The openness principle has failed the public in privacy interest,” Blackler said.

U.S. policies directly affect those abroad

In the United States, Internet access is virtually everywhere, but the major challenge for Internet openness in many other parts of the world is online accessibility, especially in remote areas and in developing nations.

Robert Guerra acts as moderator during a workshop about challenges and issues facing an open Internet at IGF-USA in Washington, D.C. on July 26, 2012.

“Access at an affordable cost is key because then we can innovate,” said panel moderator Robert Guerra, the founder of Privaterra.

Panelists agreed that though global policies across the board on the issues tied to Internet openness are unlikely to be established due to differing cultural values and standards from country to country, cooperation on the international scale is still quite important.

“Not that I think we need to achieve one global norm about a particular issue, but we need to achieve a global level of interoperability,” Morris said.

In some countries, global Internet operability is a major issue due to government blocking and filtering–the management of what content citizens may or may not access or share. Thomas Gideon of the Open Technology Institute noted the difficulties that global policymakers face with nations that exercise a great deal of control over available content.

“A large part of what I do in my work is to defend human rights online,” Gideon said. “That’s equally fraught with the risks that those trying to speak freely in contentious and crisis regimes face.”

Paul Diaz, director of policy for the Public Interest Registry, noted the challenge of making governance measures work both locally and globally. “What works in one environment, what may work here in the US, is not necessarily applicable in another country,” he said. “Ultimately, the Internet is global, and therein lies the challenge.”

Piracy and copyright: What is the solution?

When discussing the widespread nature of piracy online and the difficulty in regulating it, panelists differed in their preferred approach to dealing with the challenges of intellectual and copyrighted property.

“Companies like Netflix are slowly finding ways to shift from a product to a service model,” Julian Sanchez, a research fellow at the Cato Institute, said, suggesting this as one successful choice for property owners.

Sanchez argued that the best way to discourage piracy is to create services that offer consumers a wide variety of choices and control over consumption of goods at a fair price. He said this is a better method than exclusively offering products that can be copied and shared and pirated just as easily.

Private niches online: Social networking and the cloud

With the advent of social networking and the desire to share and access personal information, the Internet includes private and targeted content, as well.

Sanchez emphasized that the structure of the Internet should be seen more as a network of people and relationships than as a technological architecture.

Facebook’s Sarah Wynn-Williams said social networking represents the “desire for people to connect and share and be open,” adding that the future of Internet policy must meet these demands and “preserve the ability of people to [share personal content online], which is genuinely under threat.”

Panelists also noted that files shared through cloud data storage remain as difficult to regulate as physically shared materials. Just as the government has largely chosen not to pursue copied CDs or cassettes distributed among friends, content in the cloud is similarly difficult to trace and regulate.

— Madison Margeson

IGF-USA 2012 Workshop: The Changing Landscape of the Domain Name System – New Generic Top Level Domains (gTLDs) and Their Implications for Users

Brief session description:

Thursday, July 26, 2012 – Early in 2012, ICANN launched the process to introduce vast numbers of new generic top-level domains (gTLDs) — allowing, for the first time, the customization of Internet addresses to the right of the dot. Few people understand that there are already 22 existing gTLDs and 242 country code TLDs, with a total of 233 million registered second-level names across all TLDs. In the coming years, these existing TLDs will be joined by numerous new gTLDs, likely resulting in the registration of millions of new second-level domains. Some will use scripts that are unfamiliar to English speakers or readers. How exactly these new gTLDs will impact the world of users and registrants is yet to be determined. Will they add significant new registration space, cause confusion, provide some unique innovations, or, most likely, all of the above to some degree? ICANN received a wide range of applications – including brand names, generic terms, and geographic and regional terms. The workshop was organized to discuss issues and questions including: changes to how domain name registrants and users may organize and search for information online; how defensive registrations may impact existing registrants; whether ICANN gave a sufficient focus to Internationalized Domain Names; how applications from potential registries from developing countries are supported; whether fraud and abuse that exists in the existing gTLD space will migrate easily into the new ‘spaces’ or even be compounded; and how conflicts between applicants from the noncommercial sector will impact the users of the Internet.

Details of the session:

The session was moderated by Ron Andruff, president and CEO of DotSport, LLC. Panelists included:

  • Laura Covington, associate general counsel for global brand and trademarks, Yahoo!
  • Bobby Flaim, supervisory special agent with the Federal Bureau of Investigation
  • Suzanne Radell, senior policy adviser, NTIA, and US Government Advisory Council representative at ICANN
  • Elisa Cooper, director of product marketing, MarkMonitor (remote participant)
  • Alan Drewsen, executive director of the International Trademark Association
  • Andrew Mack, principal and founder of AMGlobal Consulting
  • Krista Papac, chief strategy officer for ARI Registry Services

Respondents were Dan Jaffe, executive vice president for government relations of the Association of National Advertisers, and Jeff Neuman, vice president for business affairs of Neustar and Generic Names Supporting Organization councilor at ICANN.

Suzanne Radell participates as a panelist about the changing landscape of the Domain Name System at IGF-USA in Washington, D.C. on July 26, 2012.

There is a mix of concern and optimism for how the new generic top-level domains (gTLDs) will change the landscape of the Internet, but it’s certain that a new era of the Internet is coming.

A diverse panel at IGF-USA Thursday at Georgetown Law Center, offering perspectives ranging from the brand side to trademark security, agreed on one thing: The introduction of new gTLDs will open the Internet up to more users, but also to more bad actors and cyber squatters. The panel agreed that the gTLD program will result in a tremendous amount of change, but how it will affect the landscape, and whether that change is good, sparked the most discussion.

This year, there are 2.3 billion users of the Internet and 555 million websites. The numbers are staggering, considering the Internet is only about 14 years old, said moderator Ron Andruff, president and CEO of RNA Partners Inc.

There are 22 existing gTLDs – including .com, .net, .org and .edu – and 242 country code TLDs.

Elisa Cooper, director of product marketing at MarkMonitor, joined the panel remotely to give an analysis and breakdown of new gTLD application statistics.

Of the 1,930 applications for new gTLDs, 652 were .Brand applications. Cooper divided the applications into three categories: brand names, community-based and generic. The two flavors of generic are closed and open – the latter makes registration available to the general public with few eligibility requirements. Cooper also revealed:

  • There is a relatively low number of Internationalized Domain Names – only 116.
  • Geographically, the majority of the applications have come from North America and Europe.
  • Of the .Brand applications – which go through the standard application process – technology, media and financial sectors led the way.
  • The most highly contested strings were .APP, .INC, .HOME and .ART.
  • The top three applicants were Donuts, Google and Amazon.

Laura Covington, who serves as chief trademark and brand counsel for Yahoo!, joined the panel from a .Brand applicant company and offered a brand owner’s perspective. Yahoo! applied for .yahoo and .flickr.

“I think there are a lot of exciting opportunities from a marketing perspective, even from a security perspective with the new gTLDs and the new .brands in particular,” Covington said. “And I also think that it’s going to have to change the face of how trademark owners, brand owners deal with their enforcement issues, how they approach protecting their marks going forward.”

Yahoo! is viewing the new gTLDs as an amazing new world and a new way to reach customers, though Covington admits uncertainty about what search engines will do once gTLDs are added to the mix of search algorithms. As a brand owner, she has concerns about how to deal with second-level names because there will be an exponential increase in opportunity for cyber squatters.

Flaim (FBI) and Papac (ARI) participate as panelists about the changing landscape of the Domain Name System at IGF-USA in Washington, D.C. on July 26, 2012.

Bobby Flaim, FBI special agent, is primarily concerned with pre-existing problems in domestic and international law enforcement on the Internet and how those problems may worsen as bad actors become more prevalent.

The existing system has some major problems with cyber squatting, said Jaffe, group executive vice president of ANA. He said he didn’t want to be the panel’s doomsayer, but he added that no one should assume the new gTLD program will roll out in a smooth or timely manner.

One hugely positive impact of the new gTLDs Covington sees is an influx of new voices and new participants in the multistakeholder process.

Krista Papac, general manager of ARI Registry Services, agreed.

“I do have faith in the multistakeholder model and hope that we continue to find our way through it and deal with the different issues,” Papac said.

Papac is running some of the registries for the new gTLDs and sees a lot of opportunity to create more secure environments and more opportunities for brands.

Suzanne Radell, senior policy adviser in the Office of International Affairs at NTIA and US GAC Representative, said that more people and more interest in the program will be crucial to ICANN’s evolution.

“We’ve got our fingers crossed that the benefits to consumers, to users are not outweighed by risks and costs,” Radell said. “So we’re looking very much forward to a review of the new gTLD program.”

Alan Drewsen, executive director of INTA, said he expects that the introduction of the new gTLDs will go more slowly and be less successful than hoped.

“ICANN will continue to exist, though I think it’s done everything possible to put its life in jeopardy,” Drewsen said, making the audience and panel laugh.

Andrew Mack, AMGlobal, speaks at a workshop about the changing landscape of the Domain Name System at IGF-USA in Washington, D.C. on July 26, 2012.

INTA has been critical of the process that ICANN has led over the last several years in introducing the new gTLDs.

“Given the amount of time and money that the members have invested in this process and the potential consequences that can flow from its failure, INTA will continue to work collaboratively with a lot of these constituencies to get the best possible results,” Drewsen said.

Andrew Mack, principal of AMGlobal Consulting, sees a heavy concentration of applications in the global North and the English-speaking world. People in the global South won’t be able to participate in a program they don’t know exists. Seventeen gTLD applications are better than none, he said, but the number of applicants from other parts of the globe amounts to a paltry total compared with highly connected regions already experiencing huge economic shifts due to the Internet. Mack said his pessimism is rooted in the fact that Africa and Asia are missing out when they could really benefit.

“And we want them to be part of our Internet,” Mack said.

There is an influx of new participants from existing participants, Neuman of Neustar noted.

The new gTLDs open up a lot of opportunities for business and marketing folks, but each person on the panel defined success in different ways.

“It’s definitely going to be an exciting time,” said Brian Winterfeldt, a partner with Steptoe & Johnson LLP. “I think we really are moving into sort of a new era of the Internet with this expansion and I think it’s going to be very exciting to see how it evolves.”

— Ashley Barnas

IGF-USA 2012 Opening Plenary Roundtable: Emerging Internet Issues – Governments or Governance?

Brief session description:

Thursday, July 26, 2012 – This major session of the opening plenary of IGF-USA discussed the current state of play with various proposals, ranging from the WCIT to the UN Commission on Science and Technology and enhanced cooperation, areas where, from some perspectives, more government or strong improvements in “governance” may be called for. Panelists offered a range of perspectives about government and governance.

Details of the session:

The session was moderated by Marilyn Cade, the chief catalyst of IGF-USA. Panelists included:

  • Rebecca MacKinnon, the Bernard L. Schwartz Senior Fellow at the New America Foundation
  • Marc Rotenberg, president of the Electronic Privacy Information Center
  • Jacquelynn L. Ruff, vice president of International Public Policy and Regulatory Affairs for Verizon Communications
  • Paul Brigner, the regional bureau director of the North American Bureau at the Internet Society
  • John Curran, president and CEO of the American Registry for Internet Numbers
  • Kristin Peterson, co-founder and CEO of Inveneo
  • Fiona Alexander, associate administrator of the Office of International Affairs at NTIA

If there’s a keyword lying at the heart of the Internet Governance Forum, it is “multistakeholder.” Key is the belief that individuals from various backgrounds—from private industry to civil society to government to academia—benefit from gathering and discussing their visions for the future, and the viability thereof. Whether they’re able to reach any consensus after gathering and discussing the issues is another matter entirely.

The 2012 IGF-USA conference, held at Georgetown Law Center in Washington, D.C., Thursday, opened with a panel showing just how diverse these individuals can be, and how varied their focus is in regard to the pressing issues facing the parties looking to influence the continued growth of the Internet.

Rebecca MacKinnon from the New America Foundation speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

Rebecca MacKinnon of the New America Foundation opened the seven-member discussion by highlighting the importance of the “digital commons,” the non-commercial backbone providing structure to a number of vital digital institutions. Because of the shared nature of this backbone, which stretches across traditional nation-state boundaries, MacKinnon said she believes the world is on the verge of a reformation of the current governing concepts, as individual states try to gain control over institutions that involve those beyond their jurisdiction.

In the modern era, MacKinnon asserted, individuals are “not just citizens of nation-states and communities, we’re citizens of the Internet.”

“We have to be informed about how power is exercised,” she continued, highlighting a need for everyone involved to play their part in shaping the direction of the Internet’s evolution.

This, in turn, circles back to not just the perceived necessity for multi-stakeholder solutions, but the lingering questions as to how those solutions are reached.

“How do we ensure that the policy-making mechanisms actually allow input from all affected stakeholders?” MacKinnon asked.

She theorized that societies are on the precipice of a “Magna Carta” moment, in which the traditional concepts that dictate the ways in which governments work will be disrupted by this multistakeholder model.

This drew rebuttals, to varying degrees, from other members of the panel.

Fiona Alexander, associate administrator at the Department of Commerce’s National Telecommunications and Information Administration, agreed with MacKinnon that some nations may be standing at that edge, but said the Magna Carta moment isn’t to be expected of every country, or even every stakeholder taking part in current dialogue.

“They [unnamed stakeholders] have in many cases failed to live up to what’s expected of them,” she said, which leaves those advocating for multistakeholder solutions defending a governance model under siege, fostering doubts about its efficacy.

And a large number of those stakeholders are far behind those in developed, Western countries in regard to Internet penetration.

Fiona Alexander speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

Kristin Peterson, co-founder and CEO of Inveneo, a non-profit organization dedicated to the proliferation of communications technology in the developing world, shared just how much work needs to be done in bridging the gap between dominant Internet stakeholders and those just attaining reasonable access to the Web.

“Internet access is important not just on an individual level, but on a functional level, an organizational level,” she said.

Part of this is due to the remoteness of developing, rural areas, which drives up the cost of infrastructure to a counterproductive degree.

A single 1 Mbps connection, which would be suitable for a school or a medical clinic, costs upwards of $800 a month in Haiti, Peterson noted. Another unnamed country that Inveneo has worked with has less than 100 Mbps of bandwidth in total. And that 1 Mbps of Internet access? It costs roughly $2,000 per month.

On the opposite end of the spectrum, far removed from countries just beginning to break down the barriers preventing them from gaining full access to the Internet, are stakeholders who, in the minds of some, will have an inordinate amount of influence over multi-stakeholder debates.

Marc Rotenberg, president of the Electronic Privacy Information Center, highlighted the influence of corporate entities as one such problem.

Comparing growing corporate influence over the Internet to “the clouds gathering at the beginning of a Batman movie,” Rotenberg warned those in attendance, “You have to pay attention when the skies darken, things are about to happen.”

One such entity, which Rotenberg accused of having an ever-growing, outsized influence over the Internet, is Google, whose expanding presence on the Web he called the “number-one threat to Internet freedom.”

Regardless of whether that’s the case, such problems do require a means to draw in those affected by the evolving dialogue on Internet governance.

John Curran speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

“How do we get people engaged, how do we raise a flag and pull in society, business, governments?” asked John Curran, president and CEO of the American Registry for Internet Numbers.

Curran offered perspective on the scope of the problems facing Internet stakeholders, which take shape on multiple layers. Technological standards and protocols sit at the bottom layer and require little political involvement. Moving up are domain names and IP addresses, which aren’t necessarily the most hot-button social issues under debate within the halls of Congress, but which nonetheless raise privacy and tracking concerns. At the top are the broad end-user experiences that draw in such general topics as intellectual property use, censorship and national security.

And, of course, given the nature of IGF, the multistakeholder model is seen as the best means to approach such problems.

Paul Brigner, the regional director of the North American Bureau at the Internet Society, and Jacquelynn Ruff, vice president of international public policy and regulatory affairs for Verizon, offered insight into how new players are accepting and integrating into the multistakeholder approach.

Telecommunications firms, well aware of the dwindling demand for their traditional services in the wake of the Internet revolution, are “moving away from focusing on traditional telecommunications to Internet protocol and Internet issues,” Brigner said.

Jacquelynn Ruff speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

The possible transition to a sending-party-pays structure, for example, is an issue that demands the inclusion and participation of a multitude of affected parties. Under such a regime, “You’re not free, necessarily, to innovate at low cost like you experience today,” Brigner said. “[It is] the end-to-end nature of the Internet that allows these sort of things to evolve.”

To alleviate some of the difficulty inherent in such discussions, Ruff cited the importance of enhanced cooperation, the notion of mapping past developments and current deficiencies and projecting future ambitions in a way that involves all interested parties. Emphasizing examples within UNESCO, ICANN and the Council of Europe, Ruff celebrated enhanced cooperation’s increasing rate of adoption.

The world is at “a fork in the road on the global discussion on where the future lies,” she said. And applying enhanced cooperation to the traditional multi-stakeholder methodology could be an effective means to remedy the arguments over which path to take.

That said, a plethora of stakeholders have their own interpretations, and they will seize the opportunities granted by this IGF event and future conferences to throw their hats into the ring drawn by the opening plenary session’s panelists.

— Morgan Little

IGF-USA 2012 Opening Plenary Remarks: Ambassadors Phil Verveer and Terry Kramer advocate Internet freedom, multi-stakeholder model

Brief session description:

Thursday, July 26, 2012 – Ambassador Phil Verveer, coordinator for international communications and information policy at the US State Department, offered opening remarks and introduced Terry Kramer,  the former president of Vodafone North America, who was appointed in the spring of 2012 to be US Ambassador to the World Conference on International Telecommunications, which will take place Dec. 3-13 in Dubai, United Arab Emirates. The International Telecommunication Union description of WCIT: “The conference is a review of the current International Telecommunications Regulations, which serve as the binding global treaty outlining principles that govern the way international voice, data and video traffic is handled, and which lay the foundation for ongoing innovation and market growth.”

Details of the session:

Phil Verveer, U.S. State Department Ambassador, gives opening remarks at IGF-USA in Washington, D.C. on July 26, 2012.

Ambassador Phil Verveer, US coordinator for International Communications and Information Policy, emphasized the importance of Internet freedom at the Internet Governance Forum-USA Thursday morning at Georgetown Law Center.

The Universal Declaration of Human Rights, adopted by the United Nations General Assembly in 1948, directly supports Internet freedom, Verveer said.

“Article 19:2 (states), ‘Everyone has the right to freedom of opinion and expression. This right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers,’” he said. “Every human is entitled to these rights simply by being human.”

Although many nations have endorsed it, the declaration is not binding international law, and Verveer acknowledged that differing government philosophies result in different Internet policies.

“There is a compelling case for Internet freedom grounded in human rights, but the problem, of course, is that it is not nearly enough to persuade some countries that have strong reasons to interfere with Internet freedom,” he said.

Verveer pointed out that the declaration does not provide the only support for Internet freedom. Economic considerations also provide a strong incentive to liberalize Internet policy.

From an economic standpoint, the argument for Internet freedom is straightforward: The Internet is an enormous commercial channel, and there is a positive correlation between its accessibility and its economic potential.

“There is the fundamental intuition that serious reductions in innovation will handicap economic growth,” Verveer said.

Verveer said he expects that delegates to the 2012 World Conference on International Telecommunications (WCIT) in Dubai will agree that the amendments made to the International Telecommunication Regulations (ITRs) in 1988 should be upheld.

“The United States will … prevent changes of ITRs that would … constitute a reversal of the liberalized telecommunications environment that has prevailed virtually everywhere in the world since 1988,” he said. “Our principal goal for WCIT involves maintaining this enabling environment, with complete confidence that if we are successful the benefits of information and communications technology will continue to increase and to expand to billions of additional people.”

Terry Kramer, U.S. Ambassador for WCIT, gives opening remarks at IGF-USA in Washington, D.C. on July 26, 2012.

Verveer then yielded the stage to Terry Kramer, US ambassador to the WCIT. Kramer charged the audience to think critically about the Internet’s future and about the messages relayed to other stakeholders in the global network.

“When it comes time for us to advocate directly (for Internet freedom), it will be very important that we come from a position of knowledge and fact, not just ideology,” he said. “(We must) be able to speak from knowledge about what worked in the past and how we see the future evolving.”

Kramer attested to the value of the multistakeholder model, given the distributed nature of the Internet and the diversity of its users. He emphasized the need to meet with international players at the forefront of the Internet’s evolution. “The multistakeholder model is the only effective one that will work,” he said. “The Internet is too global to have one organization in control. …We need to get examples of what success looks like (across the world).”

Kramer warned that some stakeholders’ ambitions are likely to oppose Internet freedom, openness and accessibility.

“There have been several proposals that … are worrisome,” he said. “One category of these is the control of traffic and the control of content. From every angle, that results in a bad outcome. It creates cynicism … and workaround solutions. … But there will be wisdom and good ideas here that we can effectively advocate.”

— Katie Blunt