Report to Congress


Children's Internet Protection Act

Pub. L. 106-554

Study of Technology Protection Measures

in Section 1703

August 2003

Department of Commerce

National Telecommunications and Information Administration

Department of Commerce

National Telecommunications and Information Administration

  Nancy J. Victory, Assistant Secretary for Communications and Information

  Office of Policy Analysis and Development

Kelly Levy, Associate Administrator
Sallianne Schagrin, Telecommunications Policy Analyst
Sandra Ryan, Telecommunications Policy Analyst

Office of the Chief Counsel

Kathy Smith, Chief Counsel
Josephine Scarlett, Attorney Advisor
Stacy Cheney, Attorney Advisor

Office of the Assistant Secretary

Christina Pauze, Telecommunications Policy Analyst

Elizabeth McCleneghan, Student Intern
Igor Fuks, Student Intern
Gina Beck, Student Intern




Table of Contents

Executive Summary

I.    Introduction

  A.  Children and the Internet
  B.  Congressional Efforts to Protect Children from Inappropriate Online Content
  C.  NTIA's Requirement under CIPA to Evaluate Technology Protection Measures and Internet Safety Policies

II.   Evaluation of Existing Technology Protection Measures' Ability to Meet the Needs of Educational Institutions

  A.  Balancing the Need to Allow Children to Use the Internet with the Need to Protect Children from Inappropriate Material
  B.  Accessing Online Educational Materials with a Minimum Level of Relevant Content Being Blocked
  C.  Deciding on the Local Level How Best to Protect Children from Internet Dangers
  D.  Understanding How to Fully Utilize Internet Protection Technology Measures
  E.  Considering a Variety of Technical, Educational, and Economic Factors When Selecting Technology Protection Measures
  F.  Adopting an Internet Safety Strategy That Includes Technology, Human Monitoring, and Education

III.  Fostering the Development of Measures That Meet the Needs of Educational Institutions

  A.  NTIA Recommendations
      1.  Training
      2.  Legislative Language

IV.   The Development and Effectiveness of Internet Safety Policies

  A.  Best Practices
  B.  Lessons Learned from Internet Safety Policies

V.    Conclusion

Appendix I:    Federal Register Notice
Appendix II:   List of Commenters
Appendix III:  Filtering Effectiveness Tests Cited in N2H2 Comments to the NTIA
Appendix IV:   Sample Acceptable Use Policies


Executive Summary

In homes, schools, and libraries across the nation, the Internet has become a valuable and even critical tool for our children's success.  Access to the Internet furnishes children with new resources with which to learn, new avenues for expression, and new skills to obtain quality jobs.

Our children's access to the Internet, however, can put them in contact with inappropriate and potentially harmful material.  Some children inadvertently confront pornography, indecent material, hate sites, and sites promoting violence, while other children actively seek out inappropriate content.  Additionally, through participation in chat rooms and other interactive dialogues over the Internet, children can be vulnerable to online predators.

Parents and educators have access to a variety of tools that can help protect children from these dangers.  In October 2000, Congress passed the Children's Internet Protection Act (CIPA), which requires schools and libraries that receive federal funds for discounted telecommunications, Internet access, or internal connections services to adopt an Internet safety policy and employ technological protections that block or filter certain visual depictions deemed obscene, pornographic, or harmful to minors.[1]  Congress also requested the Department of Commerce's National Telecommunications and Information Administration (NTIA) to (1) evaluate whether the technology measures currently available adequately address the needs of educational institutions, and (2) evaluate the development and effectiveness of local Internet safety policies.  Congress also invited recommendations from NTIA on how to foster the development of measures that meet these needs.  This report sets forth NTIA's public outreach, including comments received through a request for comment, its evaluation, and its recommendations.

With respect to whether the technology measures currently available address the needs of educational institutions, the commenters identified the following needs:  (1) balancing the importance of allowing children to use the Internet with the importance of protecting children from inappropriate material; (2) accessing online educational materials with a minimum level of relevant content being blocked; (3) deciding on the local level how best to protect children from Internet dangers; (4) understanding how to fully utilize Internet protection technology measures; (5) considering a variety of technical, educational, and economic factors when selecting technology protection measures; and (6) adopting an Internet safety strategy that includes technology, human monitoring, and education.

Based on a review of the comments, currently available technology measures have the capacity to meet most, if not all, of these needs and concerns.

Accordingly, NTIA makes two recommendations to Congress, discussed in Section III of this report, on how to foster the use of technology protection measures to better meet the needs of educational institutions:  the first concerns training, and the second concerns CIPA's legislative language.

Finally, commenters expressed a great deal of satisfaction regarding the development and effectiveness of Internet safety policies.  Specifically, they praised the ability to customize these policies to address the concerns of individual communities.  Based on the comments, NTIA has identified best practices for use in developing Internet safety policies.


I.    Introduction

In October 2000, Congress passed the Children's Internet Protection Act (CIPA) as part of the Consolidated Appropriations Act of 2001.[2]  CIPA requires schools and libraries receiving discounted telecommunications, Internet access, or internal connections services through federal funding mechanisms to certify and adopt an Internet safety policy and employ technological protections that block or filter certain visual depictions deemed obscene, pornographic, or harmful to minors.  Section 1703 of CIPA requests that the National Telecommunications and Information Administration (NTIA) within the U.S. Department of Commerce evaluate the effectiveness of Internet technology protection measures and safety policies in meeting the needs of educational institutions, make recommendations on fostering the development of measures that meet those needs, and evaluate the development and effectiveness of local Internet safety policies.  In accordance with the statute, NTIA initiated a notice and comment proceeding to obtain public comment on these issues.

A.    Children and the Internet

The explosive growth of Internet use in the United States has been fueled in part by children's and teenagers' online activities.  Children and teenagers use computers and the Internet more than any other age group.[3]  By the fall of 2001, 99 percent of public schools in the United States had access to the Internet, and public schools had expanded Internet access into 87 percent of instructional rooms.[4]  Approximately 65 percent of American children ages 2-17 use the Internet from home, school, or other locations.[5]  Access to the resources of the Internet has given children new research tools, information sources, avenues of expression, collaborative learning opportunities, and connections to other communities, among other benefits.[6]  But it also has potentially exposed them to the unseemly side of the Internet: indecent material, pornography, hate sites, violent sites, and online predators.[7]

In August 2002, the National Academies released Youth, Pornography, and the Internet, a report that studied tools and strategies for protecting children from online pornography.  The report concluded that there are no "foreseeable technological 'silver bullets' or single permanent solutions" to keeping children safe from such material.[8]  Rather, the report supported solutions that balance the potential benefits of the Internet to children with the competing goals and values of the community.[9]

As the dangers to children in an online environment have emerged, so have a variety of technology tools.  Some common technologies used to protect children include:[10]

o  Filtering with "yes" or "no" lists:

   -  Server-side filtering:  Internet service providers and online server software offer filtering techniques to clients that deny access to particular content sources that have been pre-selected for blocking via automated processes, human review, and/or user options.  The list of blocked URLs may or may not be disclosed and is regularly updated at the server level.[11]

   -  Client-side filtering:  This technology prohibits the browser from downloading content from specified content sources.  Blocked sites may originate from the software supplier, from the user's decisions, or from both.  Users maintain control over these lists with a password and may periodically download updated lists from the software's website.  Some software also filters email or instant messaging.[12]

o  Filtering using text-based content analysis:  This technology combines PC-based software and server software to conduct real-time analysis of a website's content to filter out illicit content.  Some software also analyzes email and attachments.  The user may or may not have visibility into how such content is excluded.[13]

o  Monitoring and time-limiting technologies:  These technologies track a child's online activities and set limits on the amount of time a child may spend online.  Monitoring software often covers Internet, email, and instant messaging activities.[14]

o  Age verification systems:  This technology uses an independently issued ID and controls the flow of online content by conditioning access to a web page on use of a password issued (by a third party) to an adult.[15]
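The list-based filtering described above can be illustrated with a short sketch.  This is not any vendor's actual product; the block list, category names, and hostnames below are hypothetical, and real products ship far larger, frequently updated lists:

```python
# Minimal sketch of list-based filtering: a category-labeled URL block
# list combined with the administrator's choice of categories to block.
# All hostnames and categories here are hypothetical examples.
from urllib.parse import urlparse

# Hypothetical block list, as a vendor might supply and update it.
BLOCK_LIST = {
    "badsite.example": "adult/sexually explicit",
    "chatrooms.example": "chat",
    "casino.example": "gambling",
}

def is_blocked(url: str, blocked_categories: set[str]) -> bool:
    """Return True if the URL's host is on the block list in a
    category the administrator has chosen to block."""
    host = urlparse(url).hostname or ""
    category = BLOCK_LIST.get(host)
    return category is not None and category in blocked_categories

# A school might choose to block only the adult and chat categories:
chosen = {"adult/sexually explicit", "chat"}
print(is_blocked("http://badsite.example/page", chosen))      # True
print(is_blocked("http://casino.example/", chosen))           # False: category not selected
print(is_blocked("http://education.example/lesson", chosen))  # False: not on the list
```

The sketch also shows why such tools underblock by construction: any site absent from the list, however inappropriate, passes through until the list is updated.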

Even the most sophisticated and current technology tools are not one hundred percent effective.[16]  Public awareness campaigns and workshops have sought to supplement technology tools.[17]  In addition, Congress has introduced several bills to legislate a solution.

B.    Congressional Efforts to Protect Children from Inappropriate Online Content

In 1996, Congress first attempted to curb inappropriate online content by passing the Communications Decency Act (CDA).[18]  The CDA prohibited the sending or posting of obscene or indecent material via the Internet to persons under the age of 18.  The Supreme Court declared the law unconstitutional, however, stating that the law violated free speech under the First Amendment.[19]  Specifically, the Court ruled that the CDA's vague provisions chilled free speech, because a speaker could not know whether the content generated would violate the statute, and that the CDA's provisions criminalized legitimate, protected speech, including sexually explicit indecent speech, in addition to unprotected obscene speech.

Congress responded by passing the Child Online Protection Act (COPA) of 1998, a law written more narrowly to protect children from inappropriate online content.[20]  COPA prohibited commercial web sites from displaying "harmful to minors" material and imposed criminal penalties on violators.  A three-judge panel for the United States District Court for the Eastern District of Pennsylvania ruled that COPA's reference to "contemporary community standards" violated the First Amendment when applied to the World Wide Web, and imposed an injunction on the enforcement of COPA.[21]  The Third Circuit affirmed this decision, stating that the reference to community standards in the definition of "material that is harmful to minors" resulted in an overbroad statute.[22]  In May 2002, the Supreme Court vacated the Third Circuit decision and remanded the case for further review.[23]  The Court found that "contemporary community standards" by itself does not render the statute overbroad for purposes of the First Amendment.[24]  On remand, the Third Circuit found that COPA is not sufficiently narrowly tailored to satisfy First Amendment requirements.[25]

In October 2000, Congress passed the Children's Internet Protection Act (CIPA) of 2000.[26]  The law conditions the receipt of certain federal funding on educational institutions' adoption of technological protections and Internet safety policies.  Sections 1712 and 1721 of CIPA, involving the use of filtered Internet access on public computers in libraries, were challenged in court as unconstitutional.[27]  In May 2002, the United States District Court for the Eastern District of Pennsylvania struck down these provisions of CIPA as unconstitutional, stating that a technology's tendency to overblock material impedes the flow of protected speech to library patrons.[28]  Under a CIPA provision establishing a fast-track appeals process, under which any appeal is heard directly by the Supreme Court, the Justice Department appealed the court's decision to the Supreme Court.  The Court agreed to review CIPA and heard oral arguments in March 2003.[29]

In a plurality decision, the Supreme Court reversed the district court's decision in June 2003, finding that the filtering provisions did not violate the First Amendment.[30]  Four Justices held that (1) the Internet access provided by libraries is not a public forum, and therefore, decisions to block pornography are not subject to heightened scrutiny; (2) the disabling provision eases fears of "overblocking"; and (3) requiring filtering and blocking technology is an appropriate condition on the receipt of federal funding because libraries already exclude pornographic material from their other collections.  The Supreme Court underscored "the ease with which patrons may have the filtering software disabled."[31]  The Federal Communications Commission subsequently issued an order to ensure that its implementation of CIPA complies with the Supreme Court's decision.[32]

C.    NTIA's Evaluation of Technology Protection Measures and Internet Safety Policies

Section 1703(a) of CIPA requests NTIA to initiate a notice and comment proceeding to determine whether currently available blocking and filtering technologies adequately address the needs of educational institutions, to make recommendations on how to foster the development of technologies that meet the needs of schools and libraries, and to evaluate current Internet safety policies.  Section 1703(a) of CIPA specifically provides the following:

     SEC. 1703. STUDY OF TECHNOLOGY PROTECTION MEASURES

     (a) IN GENERAL.--Not later than 18 months after the date of the enactment of this Act, the National Telecommunications and Information Administration shall initiate a notice and comment proceeding for purposes of--

     (1) evaluating whether or not currently available technology protection measures, including commercial Internet blocking and filtering software, adequately address the needs of educational institutions;

     (2) making recommendations on how to foster the development of measures that meet such needs; and

     (3) evaluating the development and effectiveness of local Internet safety policies that are currently in operation after community input.

On May 24, 2002, NTIA published a "Request for Comment" in the Federal Register,[33] eliciting information about technology protection measures and Internet safety policies.  NTIA requested interested parties to submit written comments on any issue of fact, law, or policy germane to the evaluation.  NTIA also encouraged commenters to submit copies of relevant studies, surveys, research, or other empirical data.  NTIA did not seek comment on the constitutionality of the statute or its provisions.  In order to generate a wide range of responses, NTIA conducted extensive outreach to the education community, technology developers, consumer groups, and academia.  The "Request for Comment" elicited 42 comments from associations, technology vendors, governmental agencies, academics/university professors, schools, and libraries.[34]

II.   Evaluation of Existing Technology Protection Measures' Ability to Meet the Needs of Educational Institutions

Section 1703 of CIPA requests that NTIA evaluate whether currently available technology protection measures, including commercial Internet blocking and filtering software, adequately address the needs of educational institutions.  In answering this inquiry, the commenters identified six needs of educational institutions:

(1)  balancing the importance of allowing children to use the Internet with the importance of protecting children from inappropriate material;

(2)  accessing online educational materials with a minimum level of relevant content being blocked;

(3)  deciding on the local level how best to protect children from Internet dangers;

(4)  understanding how to fully utilize Internet protection technology measures;

(5)  considering a variety of technical, educational, and economic factors when selecting technology protection measures; and

(6)  adopting an Internet safety strategy that includes technology, human monitoring, and education.

Below we examine these needs and set forth the commenters' evaluation of whether existing technology protection measures are meeting each of them.

A.    Balancing the Importance of Allowing Children to Use the Internet with the Importance of Protecting Children from Inappropriate Material.

Congress passed CIPA to protect children from inappropriate and harmful content while accessing the Internet at educational institutions that use federal funds.[35]  Commenters expressed little doubt that technology plays a role in reducing a child's exposure to inappropriate content.[36]  Many commenters wrote of their use of technology protection measures.  Several comments from schools and libraries reported using Internet-content filters to help provide a safer Internet experience.  Some institutions install filters specifically on Internet stations for children under eighteen.[37]  Some schools reported the effective use of filtering software.  For example, St. Pius X School in Urbandale, Iowa reported using firewall filtering as well as customizable blocking to meet its protection needs.  The school's administrators select sites and domains to block, with the option to "unlock" those sites at a later time.[38]  One public library described filters as "easy to use," giving students "access to most sites they need in school."[39]  The library also reported few, if any, problems associated with filtered Internet use.[40]

NTIA also received comments that referenced the results of 26 independent laboratory tests on filters conducted between 1995 and 2001 by ten professional testing laboratories.[41]  (See Appendix III.)  The labs conducted 108 individual product tests examining filtering software.  The test results grouped products into three categories: "found filters effective," "found filters of mixed effectiveness," and "found filters ineffective."  Nineteen of the twenty-six laboratory tests found filters effective, four found filters of mixed effectiveness, and three found filters ineffective.  Based on these results, the commenters that drew NTIA's attention to this study concluded that filtering is an effective method of protecting children from inappropriate material.[42]

Where filtering fell short of being effective, the situation usually involved either overblocking or underblocking of material.  Numerous commenters discussed the effect of overblocking and underblocking of online content as it relates to the needs of educational institutions.[43]  The United States District Court for the Eastern District of Pennsylvania defined overblocking as "the blocking of content that does not meet the category definitions established by CIPA or by the filtering software companies," and underblocking as "leaving off of a control list a URL that contains content that would meet the category definitions defined by CIPA or the filtering software companies."[44]
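Using the court's definitions, a filter's overblocking and underblocking can be quantified against a set of hand-labeled pages.  The following sketch shows the arithmetic; the URLs and labels are hypothetical, and real evaluations use far larger samples:

```python
# Sketch of measuring overblocking and underblocking: compare a
# filter's decisions with hand-labeled ground truth. Hypothetical data.

# Each tuple: (url, filter_blocked, meets_category_definition)
sample = [
    ("http://health-ed.example/anatomy", True,  False),  # overblocked
    ("http://adult.example/gallery",     True,  True),   # correctly blocked
    ("http://new-adult.example/",        False, True),   # underblocked (missed)
    ("http://library.example/catalog",   False, False),  # correctly allowed
]

appropriate = [s for s in sample if not s[2]]    # pages outside the category definitions
inappropriate = [s for s in sample if s[2]]      # pages within the category definitions

# Overblocking rate: share of appropriate pages the filter blocked anyway.
overblock_rate = sum(1 for s in appropriate if s[1]) / len(appropriate)
# Underblocking rate: share of inappropriate pages the filter let through.
underblock_rate = sum(1 for s in inappropriate if not s[1]) / len(inappropriate)

print(f"overblocking: {overblock_rate:.0%}")    # 50%
print(f"underblocking: {underblock_rate:.0%}")  # 50%
```

The two rates trade off against each other: tuning a filter to block more aggressively lowers underblocking but tends to raise overblocking, which is the tension running through the comments discussed below.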

One concern resulting from overblocking is the restricted ability of users to view appropriate content and conduct legitimate online research.[45]  Comments from the education community acknowledged that, despite training and education, technology still fails to meet the needs of educators by missing inappropriate sites, or by depriving students and teachers of access to legitimate information.  Two commenters expressed particular concern with the latter situation.[46]  A study by the Pew Internet and American Life Project recently found that:  "[W]hile many students recognize the need to shelter teenagers from inappropriate material and adult-oriented commercial ads, they complain that blocking and filtering software often raises barriers to students' legitimate educational use of the Internet.  Most of our students feel that filtering software blocks important information, and many feel discouraged from using the Internet by the difficulties they face in accessing educational material."[47]

Other comments referred to the United States District Court for the Eastern District of Pennsylvania's May 24, 2002 decision declaring CIPA sections 1712 and 1721 facially invalid under the First Amendment.[48]  A three-judge panel convened an eight-day trial to decide the issues related to the effectiveness of currently available technology protection measures.[49]  The commenters directed NTIA's attention to the court's discussion of the difficulties with the Internet's structural composition that impinge upon filtering software's ability to block content effectively.[50]

The court described the Internet as a decentralized, interconnected network with millions of web pages linked to thousands of additional web pages to create the "publicly indexable web."[51]  These links enable search engines to sort and index material by following links from one web page to another.[52]  Accordingly, search engines often fail to categorize isolated web pages not connected by these links.[53]  Witness testimony estimated that fifty percent of the Internet currently remains incapable of being indexed, further undermining the effectiveness of filtering technologies.[54]
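The court's point about the "publicly indexable web" can be illustrated with a toy link-following crawler: pages not reachable by links from known pages are never visited, and so can never be categorized.  The link graph below is hypothetical:

```python
# Toy illustration of link-based indexing: a crawler reaches only pages
# connected by links from its starting points, so isolated pages go
# unindexed and uncategorized. The link graph is hypothetical.
from collections import deque

LINKS = {
    "site-a.example": ["site-b.example"],
    "site-b.example": ["site-c.example"],
    "site-c.example": [],
    "isolated.example": [],   # no known page links here
}

def crawl(seeds):
    """Breadth-first traversal of the link graph from the seed pages;
    returns the set of pages the crawler can reach."""
    seen, queue = set(), deque(seeds)
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        queue.extend(LINKS.get(page, []))
    return seen

indexed = crawl(["site-a.example"])
print("isolated.example" in indexed)  # False: unreachable, hence never categorized
```

A filter built on such an index simply has no opinion about the unreached page, which is one mechanical source of the underblocking described above.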

The court heard testimony from three leading filtering companies, which explained the methods used to filter content.[55]  Typically, filtering software products separate appropriate and inappropriate content by compiling category lists such as:  adult/sexually explicit, arts, alcohol, business, chat, dating, education, entertainment, hate speech, health, illegal, news, religion, and violence.[56]  Users determine which content to block by selecting from pre-determined category lists.[57]

Additional testimony indicated that the filtering technologies are incapable of effectively blocking the majority of content defined by CIPA without also blocking a substantial amount of protected speech.[58]  As indicated by government witnesses, every filtering software product demonstrated at trial excluded between 6 percent and 15 percent of protected speech.[59]  The court evaluated why filtering software overblocked or underblocked material and concluded that:  filtering companies focus on reviewing fresh content or newly posted web addresses and spend little time reviewing the accuracy of websites previously categorized; inconsistencies exist between filtering definitions for pornography and CIPA's legal definitions of obscenity, child pornography, or content harmful to minors; community standards vary with regard to categorizing content; and the available technology is generally unable to meet CIPA's requirement that filters block visual depictions, but not text.[60]

Based on the comments, existing technology protection measures are helping to meet the concerns of educational institutions to protect children from inappropriate materials they may encounter while using the Internet.  The occurrence of overblocking and underblocking, however, has resulted in some dissatisfaction and frustration among users of the existing technology protection measures.

B.    Accessing Online Educational Materials with a Minimum Level of Relevant Content Being Blocked.

While existing technology protection measures, such as filtering software, are able to block much of what is deemed inappropriate material for children, the technology measures also sometimes block online educational content sought by teachers.  Commenters from both individual schools and associations representing schools discussed the difficulties that educators experience when planning lessons based on online content.  The Consortium for School Networking (CoSN) polled its members and found that filtering and blocking technologies often block lessons planned by teachers from home, including educational websites.[61]  For example, this experience caused frustration for a program in Missouri that furnishes teachers with laptops for the specific purpose of preparing lessons at home.  The technology in these schools often blocks access to web sites pre-selected by teachers.  Teachers in these schools usually discover the blocked web sites during a lesson, forcing them to react quickly and find new, suitable content.

One response to this situation is CoSN's June 2001 report, "Safeguarding the Wired Schoolhouse," which provides guidance to educators using the Internet to supplement their lessons with educational content and resources that evaluate web sites, search strategies, search engines, and web lessons.[62]  Two examples provided in the report are the Montgomery County Public Schools' and the Washington Library Media Association's development of websites about information literacy and the creation of web lessons.[63]

Congress included several "disabling provisions" within CIPA allowing administrators to disable technology for certain bona fide research or other lawful purposes.[64]  Although some claim that Congress intended these provisions to cure the overblocking tendencies of technology protection measures,[65] some commenters expressed concern that the provisions affect recipients of E-rate funds and recipients of Department of Education funds differently.[66]  For example, recipients of Department of Education funds may "disable for certain use"[67] while recipients of E-rate funds may "disable during adult use."[68]  The comments further explained that "disabling for certain use" permits administrators to supersede the technology for both adults and students, whereas "disabling during adult use" limits a school's flexibility to supersede the technology.[69]  Some schools noted that, by creating different standards based on the source of federal funds, these provisions generate confusion and reluctance within educational communities about using disabling technology to accommodate override requests, for fear of breaching CIPA.[70]  Some commenters perceived the override provision as failing to cure the overblocking concerns when educators or students desire immediate access to educationally related material.[71]

Based on the comments, some educators are having difficulties with existing technology protection measures in meeting their need to access online educational materials with a minimum level of relevant content being blocked.  The disabling provisions of CIPA do not appear to be a satisfactory answer for some educators.

c.                deciding on the local level how best to protect children from internet dangers.

several commenters stated that cipa�s provisions requiring educational institutions to install technology protection measures on computers removes local decision making from educators.[72]  comments from associations representing schools explained that schools often adopt locally-based internet solutions reflecting the unique circumstances of the community, such as:  faculty and staff familiarity with technology; level of patron and parental involvement; values of the community; funding resources; size of the community and educational institution; degree of supervision; education philosophy; and political will of library and school board members.[73]   further, schools prefer making decisions locally to reflect local resources (financial and human), values, and community concerns.[74] 

commenters also tended to disagree regarding the access to selection criteria developed by software companies for filtering products.  for example, educators argue that, without an understanding of how technology companies select blocking criteria, educators possibly subject themselves to non-educational standards and the ideas and policies of outside parties.[75]   yet, according to the comments submitted by a technology developer of blocking and filtering software, the company provides extensive information to users and publishes details about the categories of sites it blocks.[76]   several vendors� comments discussed their products� ability to allow users to type in a web address to learn more about a particular site�s blocking category.[77]  additionally, one vendor discussed its efforts to seek user feedback and to respond promptly to consumer requests to add, delete, or change a blocked web site.[78]  to that end, the company received over 60,000 requests between january 1, 2002 and august 15, 2002, and reviewed each request within two days.[79]  of these requests, twenty percent resulted in an addition, deletion or change.[80] 

on the other hand, two commenters noted that many technology companies choose not to release their blocked lists for a variety of reasons including:  the list�s proprietary nature and source code; the risk of abuse by competitors; the expense associated with a carefully created database; the harmful effect to children; the diminished value of a published list; and the general privacy policy of the company.[81]  in addition, the national education association�s comments stated that, generally, category descriptions vary in scope, detail, and helpfulness.[82]  one advocacy group claimed that the employees of filtering companies may apply their own subjective judgments or reflect the manufacturers' social and political views when reviewing content web sites.[83]   

in addition to preferring that technology companies release their lists of blocked sites, educational institutions questioned the process filtering companies use to develop and define blocking criteria.  the american library association expressed uneasiness with selecting technology tools to accommodate the wide-ranging values of their patrons when most libraries feel uncertainty about the blocking decisions made by companies.[84]  the center for democracy and technology agreed that technology users enjoy little input into blocking decisions, noting that, �in designing filtering tools, companies seek to meet the needs of diverse consumer groups and thus intentionally choose to block sites that may be undesirable or offensive to a particular audience or targeted consumer group but deemed appropriate by another.�[85] 

many commenters cited the u.s. district court decision to highlight the desire of educational institutions to make decisions locally, and the need to understand categories pre-selected by filtering companies.  additionally, commenters noted that the blocking categories defined by filtering companies rarely correspond with cipa's definitions and cannot be customized to comply with cipa.[86]

the comments underscored in a number of ways the belief by some educational institutions that existing technology measures fell short of meeting their need to decide locally how to protect the children in their community from internet dangers.

d.         understanding how to fully utilize internet protection technology measures.

the comments indicated that educators need training to fully understand how to use the technology protection measures in order to accommodate bona fide and other lawful research, as well as to meet other needs of their specific environment.  several comments noted the difficulty of adjusting a technology tool to override a blocked web site.[87]   many commenters acknowledged that overblocking of helpful educational material occurs with many filtering products and, consequently, teachers need training on how to disable filtering software for minors conducting educational searches or other legitimate research.[88]
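the override capability discussed above, in which an authorized educator temporarily disables blocking for bona fide research, can be sketched as a blocklist with time-limited exemptions.  this is an illustrative design under assumed names; the hosts, the `grant_override` helper, and the thirty-minute default are all hypothetical, not any product's actual mechanism.

```python
# minimal sketch of a filter override for bona fide research: the filter
# consults a blocklist, but an educator can grant a temporary exemption.
# all host names and defaults here are hypothetical.
import time

BLOCKED_HOSTS = {"example-blocked.test"}
_overrides: dict = {}  # host -> expiry timestamp (seconds since epoch)

def grant_override(host: str, minutes: int = 30) -> None:
    """an educator temporarily unblocks a site for legitimate research."""
    _overrides[host] = time.time() + minutes * 60

def is_allowed(host: str) -> bool:
    """allow a host if it is not blocked, or if an override is still active."""
    if host not in BLOCKED_HOSTS:
        return True
    return _overrides.get(host, 0) > time.time()
```

a design like this makes the delay problems reported in the comments concrete: the override either takes effect immediately at the point of use, or it must travel through a request-and-response process measured in minutes or days.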

the commenters also noted instances where educators experienced delays with an override.[89]  the consortium for school networking (cosn) asked its members to report their experience with override requests.  it found that the time it took to request an override and receive a response ranged from less than five minutes to as long as one week.[90]  some institutions lacked an override policy altogether.[91]   one association's comments summarized the end result of these issues as extremely frustrating for teachers who lack training on how to disable filtering technology.[92]

ntia also received a variety of responses discussing educators' experience with adjusting technology protection measures to accommodate all age groups and grades.  the comments indicate the need for training educators on how to adjust technology protection measures to accommodate different age groups.  one commenter stated that its filtering technology does not adjust blocking content based on the age of the child.[93]  yet, many technology products offer users the ability to customize.[94]  one technology vendor provided ntia with an example of its product's web site customization feature.[95]  specifically, the product gives the user the ability to add sites to a block list.[96]  another commenter described a software program that accommodates six age groups:  unfiltered access (adults); teen access (15 to 17); pre-teen (12 to 14); kid (8 to 11); and child (7 and under).[97]  while many products exist that adjust to different ages, some commenters questioned the effectiveness or ease of adjusting the technology to accommodate various ages or grades.[98]  one commenter noted that relying on age-specific categories works well for younger children, but varying maturity levels make it more difficult to cluster older children by age and rely upon the categories pre-selected by technology vendors.[99]
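the age-tiered scheme the commenter described can be sketched as a lookup from a user's age to a filtering profile.  the age cutoffs below follow the comment; what each profile actually blocks is left abstract, and the profile names are used here only as labels.

```python
# sketch of age-based filtering profiles, following the tiers described in
# one comment.  what each profile blocks is hypothetical and left abstract.

AGE_PROFILES = [
    (18, "unfiltered access"),   # adults
    (15, "teen access"),         # 15 to 17
    (12, "pre-teen"),            # 12 to 14
    (8,  "kid"),                 # 8 to 11
    (0,  "child"),               # 7 and under
]

def profile_for_age(age: int) -> str:
    """pick the most permissive profile whose minimum age the user meets."""
    for minimum, name in AGE_PROFILES:
        if age >= minimum:
            return name
    return "child"
```

individual log-ins, which several commenters requested, are what would carry the age information needed to select a profile like this per student.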

based on the comments, existing technology protection measures are capable of meeting a number of the needs of educational institutions.  however, some educators are unaware of the capabilities of these measures or lack the knowledge about how to use many features of the technology protection tools.

e.         considering a variety of technical, educational, and economic factors when selecting technology protection measures.

commenters listed several factors that educational institutions take into account prior to selecting technology.  most commenters cited cost as the primary factor.  one commenter mentioned that when institutions consider cost, they often choose cheaper and less sophisticated products.[100]  schools and libraries also noted that they obtain very little extra funding to pay for internet protection measures.[101]  the e-rate program, which gives schools and libraries discounts on telephone service, internet access, and internal connections, does not cover technology protection measures, such as filtering and blocking software.[102]  in addition to cost, comments from educational associations listed maintenance, effectiveness, ability to customize, network impact, and upgrades as important factors considered when selecting technology protection measures.[103]  in sum, the commenters noted that educational institutions consider a variety of economic, technical, and educational factors when selecting technology protection measures.

f.                 adopting an internet safety strategy that includes technology, human monitoring, and education. 

commenters responding to ntia�s request for comment described their experience with the use of technology protection measures within educational institutions.  many educational institutions discussed their use of filtering and blocking technology to protect children from inappropriate content.  others explained their use of a combination of technology and non-technical protection strategies, such as human monitoring or internet safety policies, to achieve this goal.

interestingly, the comments revealed that the measures adopted by educational institutions depend in part on their interpretation of cipa.  one commenter noted that educational institutions trying to comply with cipa interpret the language "technology protection measures" as a requirement to install only filtering software, and often do not explore other technical remedies.[104]   this commenter also stated that many educational institutions interpret cipa's "technology protection measure" language as limited to "commercial, proprietary-protected filtering software."[105]  a trade association noted that this narrow interpretation of cipa's technology protection measure requirement may inhibit schools and libraries from adopting more comprehensive solutions that encompass both technology and education.[106]  some commenters did discuss other technology measures, such as monitoring software,[107] but there were no comments from educational institutions regarding their experience as users of monitoring software.

the federal communications commission's (fcc) rules interpret cipa as encouraging educational institutions to adopt both technological and non-technological measures to protect children online.[108]   fcc regulations require schools and libraries to certify that they have adopted:

       an internet safety policy that blocks and filters certain visual depictions for both minors and adults;

       an internet safety policy that includes monitoring;

       an internet safety policy that addresses:  access to inappropriate material; email, chat, and other forms of electronic communications; hacking; disclosure of a minor's personal information; and measures restricting material that is harmful to minors.[109]

a report released in september 2002 by the u.s. department of education's national center for education statistics supports the conclusion that educational institutions rely on a combination approach to shield children from inappropriate online content.  the report documents that, in 2001, 96 percent of public schools used a variety of technologies or policies to protect children from inappropriate content.  of these schools, 91 percent relied on teacher or staff monitoring; 87 percent installed blocking or filtering software; 80 percent required parents to sign a written contract; 75 percent required students to sign a written contract; 44 percent adopted an honor code; and 26 percent confined school access to an intranet.[110]

notwithstanding some commenters' interpretation of cipa, the majority of comments indicated that most educational institutions prefer a combination of technology and education to ensure a safe online environment.[111]  members of the international society for technology in education (iste) adopted numerous methods to ensure that students had a safe, educational, and age-appropriate experience online, including acceptable use policies, software technologies, teacher monitoring and supervision, and student education programs.[112]  a trade association representing schools stated that most educational institutions adopt diverse internet protection solutions that correspond with the culture and resources of their community.[113]

several commenters indicated a preference for non-technological solutions or a need to supplement technology with non-technical measures to create a safe online environment.  for example, the state education department of the university of the state of new york relies on broadly written acceptable use policies as its protection method of choice.  it views technology-based solutions as geared toward content issues only, leaving the other challenges associated with public internet access unaddressed.  thus, it adopted written policies to manage a wide range of additional specific behaviors, such as patron access, noise levels, and computer tampering.[114]

a school in albuquerque, new mexico, took a different approach.  as an individual serving as a volunteer school technology coordinator explained, the school adopted student monitoring and pre-selected sites over filtering technology, not only because of the unreliability and cost of the technology, but also because of an inadequate budget to train staff.[115]  this commenter concluded that the creation of "yes" lists, or pre-selected child-appropriate content, serves to keep children protected from harmful content.[116]
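the "yes" list approach described above inverts the usual filtering model: rather than blocking known bad sites, it permits only sites an educator has pre-approved.  a minimal sketch follows; the list entries are hypothetical placeholders, not the school's actual selections.

```python
# sketch of a 'yes' list: access is denied by default and permitted only for
# pre-selected child-appropriate sites.  the entries are hypothetical.

YES_LIST = {
    "kids-encyclopedia.test",
    "school-library.test",
}

def allowed(host: str) -> bool:
    """permit access only to hosts an educator has pre-approved."""
    return host.lower() in YES_LIST
```

the trade-off is visible in the code: nothing outside the list can appear, which protects strongly but also requires ongoing educator effort to keep the list useful.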

some libraries are also emphasizing a non-technical approach to safeguarding children from harmful content.  the board of the evanston public library in illinois implemented a library use policy instead of filtering software for its computers.  the policy encourages parents to accompany their children and supervise their internet access.  additionally, librarians configure children's computers for "focused internet access," directing kids to pre-selected age-appropriate websites.[117]  the charles m. bailey public library in winthrop, maine utilizes a combination of bookmarks, web design, parental involvement, and technology education classes for children to create a safe online environment.[118]  the las vegas clark county library district (lvccld) uses an approach giving patrons numerous options to protect themselves online.  the library prefers an "empowerment" approach offering patrons the choice to control their internet access level with various educational and informational methods.[119]

 based on the comments, existing technology protection measures are capable of meeting the technology component of an approach that includes both technology and non-technical protection strategies.

in sum, ntia gleaned six distinct needs within educational institutions:  (1) balancing the importance of allowing children to use the internet with the importance of protecting children from inappropriate material; (2) accessing online educational materials with a minimum level of relevant content being blocked; (3) deciding locally how best to protect children from internet dangers; (4) understanding how to fully utilize internet protection technology measures; (5) considering a variety of technical, educational, and economic factors when selecting technology protection measures; and (6) adopting an internet safety strategy that includes technology, human monitoring, and education.  as articulated in the comments, existing technology protection measures, by themselves, are meeting most, but not all, of these needs.  below we discuss ways to foster the development of measures that would more fully meet these needs of educational institutions.

iii.       fostering the development of measures that meet the needs of educational institutions

in the comments, ntia found that educational institutions were frustrated that the marketplace has not developed new and more advanced technology protection measures.  ntia asked commenters to discuss the development of new technology features that would better meet the needs of educational institutions.  ntia received a variety of responses indicating the technology features that would best assist educational institutions today.

comments from four technology vendors explained how their technology blocks categorized content, and described features associated with their products.[125]  the vendors offer many of the features desired by the education community.

based on the four descriptions, these existing products offer features similar to those requested by educational institutions. 

through independent research, ntia also found that more companies are entering the market for internet content protection technology.  some analysts attribute the growth of the networking and protection market to increased internet access, the exponential growth of web pages, and the increasing desire of families, schools, and libraries to protect children from inappropriate content and interactions on the internet.[130]  some analysts predict that the market for these products will rise to over $600 million by 2004, growing at a rate of nearly 50 percent per year.[131]
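as a rough consistency check on the projection above: at nearly 50 percent annual growth, a market of roughly $180 million in 2001 would indeed pass $600 million by 2004.  the $180 million base is an assumed, back-computed figure used only for illustration, not a number taken from the comments.

```python
# compound-growth sketch for the market projection cited above.  the base
# figure (180, in millions of dollars) is an assumption for illustration.

def project(base: float, rate: float, years: int) -> float:
    """compound a starting market size forward at a fixed annual growth rate."""
    return base * (1 + rate) ** years

market_2004 = project(180.0, 0.50, 3)  # three years of ~50% growth from 2001
```
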

the more established internet content filtering companies appear to be increasing their investment in research and development.[132]  numerous venture capital firms invest in these internet-safety technology companies as well, both within the united states and abroad.[133]  in addition to u.s.-made internet content filters, international companies are developing filtering software.  currently, over fifty companies provide this technology.[134]  ntia found that, while a substantial number of technology companies invest in the research and development of technology protection measures, educational institutions are either unaware of the diverse array of products available to meet their needs or lack the training to fully utilize the products.

some commenters claim that cipa locked in filtering and blocking technology as the "technology protection measure" of choice, thereby stifling potential innovation of technology protection measures.[135]  according to several commenters, little incentive exists for the markets to develop more flexible technology products to meet the needs of educational institutions if investors or venture capitalists perceive the education community as demanding only one type of technology.[136]  the consortium for school networking writes that cipa "forced all of the companies competing in the market to define their product in terms of which best complies with cipa, rather than how they may serve the needs of different kinds of school districts."[137]

while some commenters encouraged technology vendors to develop new protection products to meet educators' needs, others believed that the focus of attention should not be on new technologies.[138]  rather, they believed that the focus should be on the development and implementation of a comprehensive education and supervision approach that protects children by preparing them to make safe and responsible choices.[139]

a.       ntia recommendations

           section 1703(a)(2) of cipa invited ntia to make any recommendations to congress on how to foster the development of measures that meet the needs of educational institutions.[140]  based on the comments, ntia has identified two recommendations:  (1) vendors should offer training services to educational institutions so the institutions can understand and fully use the capabilities of technology protection measures; and (2) congress should amend cipa's language to clarify the term "technology protection measures."

1.         recommendation #1:  training

the majority of comments from educational institutions noted that educators often lack the training necessary to fully use the available technology tools.  for example, although cipa includes several provisions giving adults the authority to override technology for certain bona fide or other legitimate research,[141] some educators do not know how to disable the technology.  commenters also indicated their desire that software perform specific tasks, such as scanning content rather than relying on key words; listing blocked sites by subject area; allowing individual log-ins to accommodate varying ages; and allowing editing and overriding of blocked sites in real time.[142]  ntia identified a disconnect between the specific needs listed by educational entities and their awareness of the current capabilities of available technology:  while commenters discussed the desire for certain technological capabilities, the vendors' comments explained that their technology already performs many of these tasks.

ntia recognizes that, as educational institutions become familiar with using technology protection measures, the need for training may decrease.  until that time, however, ntia agrees with commenters who expressed the importance of training as part of the solution to protect children from illicit online content.  ntia suggests that as part of promotional efforts to advertise products or as part of the initial orientation to their products, technology vendors should train and educate teachers, administrative personnel, librarians, and other educational personnel on the specific features of their product. 

2.         recommendation #2:  legislative language

commenters discussed the difficulty that some educational institutions have interpreting cipa's "technology protection measure" language.   some commenters claim that many educational institutions default to "filtering" technology only, without researching other types of technology protection options.  as a result, many believe that this reliance on mostly filtering products stifles the marketplace and serves as a disincentive for technology companies to invest in the research and development of newer and more sophisticated products.  moreover, as set forth above, filtering and blocking software has not overcome problems of overblocking, the inability to generate an updated index of the internet, and the lack of correspondence to statutory definitions and categories.  yet, other technology tools can, or have the potential to, better address the needs of educational institutions.  thus, ntia recommends that congress change the current legislation to clarify that the term "technology protection measure" encompasses not only filtering and blocking software, but also other current and future technology tools.  specifically, section 1703(3) of cipa currently reads as follows:

technology protection measure -- the term "technology protection measure" means a specific technology that blocks or filters internet access to visual depictions that are -- (a) obscene, as that term is defined in section 1460 of title 18, united states code; (b) child pornography, as that term is defined in section 2256 of title 18, united states code; or (c) harmful to minors.

ntia recommends replacing the above language with the following:

technology protection measures -- the term "technology protection measure" means a specific technology that prevents internet access to visual depictions that are -- (a) obscene, as that term is defined in section 1460 of title 18, united states code; (b) child pornography, as that term is defined in section 2256 of title 18, united states code; or (c) harmful to minors.

ntia believes this expanded definition using the word "prevents" will encourage educational institutions to utilize technology, in addition to blocking and filtering software, that may better meet their needs as outlined above.  a wider selection of products should give local decision makers more options to find the products that best meet their community's needs.

as an alternative to amending cipa, ntia recommends that the fcc and the u.s. department of education (doe) provide further guidance to recipients of e-rate or doe funds on the meaning of technology protection measures.

iv.       the development and effectiveness of internet safety policies

ntia found that educational institutions have engaged in discussions with their respective communities to create acceptable internet safety policies.[143]  (see appendix iv for examples.)  educational institutions tend to incorporate the values and needs of their community into their policy and, as a result, experience positive feedback about their policy's success as part of the solution to protect children online.[144]  most of the commenters expressed a great deal of satisfaction with the evolution and use of safety policies and praised cipa for giving educational institutions the autonomy to develop their own policies.[145]  the consortium for school networking (cosn) expressed appreciation that cipa allowed schools to draft policies reflecting the needs of the community and school environment.[146]  the state education department of the university of the state of new york credits its safety policies as the most effective strategy employed to keep patrons in conformance with library rules.[147]  the policy's success begins with staff-wide understanding of the policy's content, followed by consistent application, on-going review, and community involvement.[148]

several public libraries post internet safety policies that appear whenever a patron logs onto a public computer.[149]  in these instances, internet access requires each patron to click an acceptance indicating his or her agreement to abide by the terms of the policy.  the policy states that patrons may access constitutionally protected online material, and that patrons may not use the internet in a manner inappropriate for a public area.  the policy also lists specific prohibited behaviors, such as accessing obscene material, accessing materials harmful to minors, or engaging in offensive, intimidating, or hostile behavior.  in the two years since implementing the policy, these librarians indicate that they have witnessed only a few instances of inappropriate patron behavior, and credit their internet safety policy with contributing to a trouble-free environment and creating a safe online experience.[150]

educational institutions also consider internet safety policies an avenue to teach children online safety skills.[151]   suggested safety skills include teaching children to take appropriate action when harmful content appears online; teaching children to report threatening or disturbing correspondence online; and arming children with strategies to use if approached by a stranger.[152]  additionally, one commenter underscored that internet safety policies must be reviewed regularly to guarantee that they adequately reflect the views of the community and cover the appropriate technology.[153]

ntia found that internet safety policies are generally effective when educational institutions customize internet safety policies to the needs of the community.  many communities opt to keep their policies flexible to adapt to evolving technologies and the changing needs of the community.[154]

the national research council report studied acceptable use policies, which are similar to internet safety policies.   the report defined acceptable use policies as "a set of guidelines and expectations about how individuals will conduct themselves online."[155]  accordingly, these policies make young people responsible for their online behavior and encourage personal accountability for responsible internet use.[156]  the report describes effective policies as those that include sanctions for violations; solicit input from parents, community members, schools, libraries, and students; and use accidental violations as opportunities to educate users about how to avoid similar situations.[157]

ntia asked participants to discuss their experience with successful internet safety approaches or "best practices."  ntia grouped the responses into the following categories:  acceptable use policies, child media literacy, parental education and awareness, staff education and development, identification of appropriate content, and child-safe areas.  a summary of successful best practices provided by the comments is detailed below:

a.  best practices

       child media literacy

  b.   lessons learned from internet safety policies

ntia asked commenters for lessons learned from their experience with internet safety policies.  one internet safety expert told ntia that, in order to ensure successful policies governing children's internet use, drafters of such policies should address the following:  guidelines/purpose, sharing networks/resources, passwords, email, privacy, copyright and plagiarism, internet access, and safety.[171]  another group encouraged teachers and librarians to establish policies that:  give educators autonomy over classroom curriculum materials; address the different ages of students and different educational settings (classroom use, library use, after-school enrichment); and implement effective human and technical monitoring strategies.[172]  an education trade group wrote that internet safety policies should not be regarded as "just another form for parental signature"; rather, these policies must be given special status, and the policy's principles must be fully integrated into the school curriculum.[173]

other comments discussed the importance of incorporating clear violations and sanctions into safety policies.[174]  for example, a program in missouri encourages strong consequences for computer misuse.[175]  if a student intentionally misuses the computer, the student forfeits all computer privileges and the school informs the student's parent of the violation.[176]  if a student unintentionally misuses the computer, the student must immediately turn off the computer and raise his or her hand for the teacher to handle the situation.[177]  the program credits this policy with keeping violations to a minimum, largely due to the policy's clarity and consistent enforcement.[178]

commenters also noted several difficulties with employing technology without acceptable use policies to protect children.  first, commenters noted that technology protection measures are not the entire answer.[179]  these commenters emphasized that technology protection measures are most effective when teachers and educational institutions can customize technology and use it in connection with other strategies and tools.[180]  as one commenter stated, children need to be trained to think critically and use the internet safely; technology cannot replace education and judgment.[181]  second, one commenter noted that technology protection measures can give a false sense of protection.[182]  this commenter stated that children should be educated to avoid improper content in the same unfiltered environments children experience in their homes, libraries, and offices.  he argued that filtering provides an inauthentic atmosphere that thwarts teachers' efforts to prepare their students to deal with reality.[183]  alternatively, another commenter argued that acceptable use policies may give a false sense of protection.[184]  the commenter noted that acceptable use policies are a good protection measure, but they rest on the assumption that education alone will keep children away from offensive material.[185]  he also contended that time limits imposed by acceptable use policies have not been found to stop children from accessing inappropriate material online.[186]

            another difficulty commenters highlighted is the constraints of the school environment.  they noted that the classroom setting is not always amenable to monitoring.[187]  they also stated that teachers express uncertainty about their role as monitors watching children online, and that some teachers lack the requisite knowledge of and sophistication with technology.[188]

the national research council report also discussed several issues relating to acceptable use policies.  the council recommended that these policies should:  distinguish between adult and child use; distinguish between younger and older children; determine how to measure compliance; avoid overly broad wording and strive to list specific inappropriate behavior and material; protect against liability; and define a user's rights.[189]

           the best practices and lessons learned that are set forth above provide valuable information for communities to consider as they develop and implement internet safety policies. 

v.        conclusion

in summary, existing technology protection measures have met many of the needs of educational institutions.  while the education community has had success with technology measures, it also recognizes that comprehensive child protection solutions do not rest solely with technology.  commenters emphasized that technology protection measures are most effective when teachers and educational institutions can customize technology and use it in connection with other strategies and tools.  educational institutions prefer local decision making that gives leaders the flexibility to select the appropriate technology that fits best with their unique circumstances and to consider non-economic factors that may influence technology selection decisions.  commenters also recognized the need for more training within educational institutions.  based on our evaluation of how existing technology protection measures have met the needs of educational institutions, ntia made two recommendations:  (1) additional training on the full use of technology protection measures, and (2) new legislative language that would clarify cipa's existing "technology protection measure" language to ensure that technology protection measures include more than just blocking and filtering technology.  ntia believes this expanded definition will encourage educational institutions to utilize a wider range of technology that will better meet their needs.  with respect to internet safety policies, commenters reported overwhelming satisfaction with the development and effectiveness of these policies.

ntia also notes that the comments reveal the commitment of all interested parties -- educators, academics, technology vendors, and associations -- to protect children as they explore the online world.  ntia commends all the parties involved in this issue for their dedication and hard work.  our nation's children will be well served by the ongoing efforts toward effective solutions that best protect children while allowing them to reap the many benefits of the internet.




appendix i. federal register notice


37396     federal register/vol. 67, no. 103/wednesday, may 29, 2002/notices


department of commerce

national telecommunications and information administration

[docket no. 020514121‑2121‑01]

rin 0660-xx14

request for comment on the effectiveness of internet protection measures and safety policies

agency: national telecommunications and information administration, department of commerce.

action: notice; request for comments.

__________________________________

summary: the national telecommunications and information administration (ntia) invites interested parties to provide comments in response to section 1703 of the children's internet protection act (cipa), pub. l. no. 106-554, 114 stat. 2763, 2763a-336 (2000). section 1703 directs ntia to initiate a notice and comment proceeding to evaluate whether currently available internet blocking or filtering technology protection measures and internet safety policies adequately address the needs of educational institutions. the act also directs ntia to make recommendations to congress on how to foster the development of technology protection measures that meet these needs.

dates: written comments are requested to be submitted on or before august 27, 2002.

addresses: comments may be mailed to sallianne fortunato schagrin, office of policy analysis and development, national telecommunications and information administration, room 4716 hchb, 14th street and constitution avenue, nw., washington, dc 20230. paper submissions should include a diskette in html, ascii, word, or wordperfect format (please specify version). diskettes should be labeled with the name and organizational affiliation of the filer, and the name of the word processing program used to create the document. in the alternative, comments may be submitted electronically to the following electronic mail address: cipa‑study@ntia.doc.gov. comments submitted via electronic mail also should be submitted in one or more of the formats specified above.

for further information contact:

sallianne fortunato schagrin, office of policy analysis and development, ntia, telephone: (202) 482‑1880; or electronic mail: sschagrin@ntia.doc.gov.  media inquiries should be directed to the office of public affairs, national telecommunications and information administration: telephone (202) 482-7002.

of the schools with acceptable use policies, 94 percent reported having student access to the internet monitored by teachers or other staff; 74 percent used blocking or filtering software; 64 percent had honor codes; and 28 percent used their intranet. id. most schools (91 percent) used more than one procedure or technology as part of their policy: 15 percent used all of the procedures and technologies listed; 29 percent used blocking/ filtering software, teacher/staff monitoring, and honor codes; and 19 percent used blocking/ filtering software and teacher/staff monitoring. id. at 7, 8. in addition, 95 percent of schools with an acceptable use policy used at least one of these technologies or procedures on all internet‑connected computers used by students. id.

this trend appears to be reflected in the library community as well. a recent article in the library journal reports that of the 355 libraries responding to its budget report 2002, 43 percent reported filtering internet use, up from 31 percent in 2001, and 25 percent in 2000. norman oder, the new wariness, the library journal (jan. 15, 2002) (lj budget report 2002), available at http://libraryjournal.reviewsnews.com/index.asp?layout=articleprint&articleid=ca188739. of those libraries filtering internet use, 96 percent reported using filters on all children's terminals. id.

the e‑rate and cipa

section 254(h) of the communications act of 1934, as amended by the telecommunications act of 1996, provides a universal service support mechanism (commonly known as the "e‑rate program") through which eligible schools and libraries may apply for discounted telecommunications, internet access, and internal connections services. see 47 u.s.c. 254(h). the program is administered by the universal service administrative company (usac) pursuant to regulations promulgated by the federal communications commission. see federal communications commission, universal service for schools and libraries, available at http://www.fcc.gov/wcb/universal_service/schoolsandlibs.html.

according to usac, approximately 82 percent of public schools and 10 percent of private schools received e‑rate funding in the fiscal year (fy) 2000 funding cycle (july 1, 2000 through june 30, 2001) (using 1997 data base as denominator). see universal service administrative company, available at http://www.sl.universalservice.org. public libraries also rely heavily on e‑rate funding; 57 percent of main public libraries received e‑rate funding in fy 2000. id.; see also lj budget report 2002, supra.

supplementary information:

growing concern about children's exposure to inappropriate online content

a u.s. department of commerce report, released earlier this year, indicates that as of september 2001 more than half of the nation's population (143 million americans) were using the internet. a nation online: how americans are expanding their use of the internet, national telecommunications and information administration, u.s. department of commerce (feb. 2002), available at http://www.ntia.doc.gov/ntiahome/dn/index.html. children and teenagers use computers and the internet more than any other age group. id. at 1, 13. almost 90 percent of children between the ages of 5 and 17 (or 48 million) now use computers. id. at 1, 44. significant numbers of children use the internet at school or at school and home: 55 percent for 14‑17 year olds; 45 percent for 10‑13 year olds; and 22 percent for 5‑9 year olds. id. at 47. approximately 12 percent of 10 to 17 year olds use the internet at a library. id. at 52. noting the heightened interest regarding the possible exposure of children to unsafe or inappropriate content online, the department of commerce report notes that for the first time households were surveyed to determine the level of concern about their children's exposure to material over the internet versus their concern over exposure to material on television. the results indicated that 68.3 percent of households were more concerned about the propriety of internet content than material on television. id. at 54.

similarly, in its 2000 survey of public schools to measure internet connectivity, the department of education's national center for education statistics asked questions about "acceptable use policies" in schools in recognition of the concern among parents and teachers about student access to inappropriate online material. see internet access in u.s. public schools and classrooms: 1994-2000, nces 2001‑071, office of education research and improvement, department of education (may 2001), available at http://www.nces.ed.gov/pubs2001/internetaccess.

according to the nces survey, 98 percent of all public schools had access to the internet by the fall of 2000. id. at 1. the survey also indicated that almost all such schools had "acceptable use policies" and used various technologies or procedures (blocking or filtering software), an intranet system, student honor codes, or teacher/staff monitoring to control student access to inappropriate online material. id. at 7.


in october 2000, congress passed the children's internet protection act (cipa) as part of the consolidated appropriations act, 2001 (pub. l. no. 106‑554). under section 1721 of the act, schools and libraries that receive discounted telecommunications, internet access, or internal connections services under the e‑rate program are required to certify and adopt an internet safety policy and to employ technological methods that block or filter certain visual depictions deemed obscene, pornographic, or harmful to minors for both minors and adults.1 the federal communications commission implemented the required changes to the e‑rate program and the new cipa certification requirements became effective for the fourth e‑rate funding year that began on july 1, 2001, and ends on june 30, 2002. see federal‑state joint board on universal service, children's internet protection act, report and order, cc docket no. 96‑45 (march 30, 2001), available at http://www.fcc.gov/wcb/universal_service/schoolsandlibs.html.

section 1703(a) of cipa directs ntia to initiate a notice and comment proceeding to determine if currently available blocking and filtering technologies adequately address the needs of educational institutions, make recommendations on how to foster the development of technologies that meet the needs of schools and libraries, and evaluate current internet safety policies. section 1703(a) of cipa specifically provides:

sec. 1703. study of technology protection measures

(a) in general.--not later than 18 months after the date of the enactment of this act, the national telecommunications and information administration shall initiate a notice and comment proceeding for purposes of--

(1) evaluating whether or not currently available technology protection measures, including commercial internet blocking and filtering software, adequately address the needs of educational institutions;

(2) making recommendations on how to foster the development of measures that meet such needs; and

(3) evaluating the development and effectiveness of local internet safety policies that are currently in operation after community input.

internet blocking and filtering software and acceptable use policies

the computer industry has developed a number of technology protection measures to block or filter prohibited content in response to the growing amount of online content. among these measures are stand alone filters, monitoring software, and online parental controls.

_______________________________________

1 ntia notes that sections 1712 and 1721 of cipa are currently the subject of constitutional challenge. see american library ass'n v. united states, no. 01‑cv‑1303 (e.d. pa. march 20, 2001); multnomah county public library v. united states, no. 01‑cv‑1322 (e.d. pa. march 20, 2001). ntia is not seeking comment on the constitutionality of the statute or its provisions.

the pew internet and american life project reports that more than 41 percent (2 of every 5) of parents of children using the internet rely on monitoring software or use pre‑selected controls on their home computers. pew internet and american life project, the internet and education: findings of the pew internet and american life project, at 5 (september 2001), available at http://www.pewinternet.org/reports/toc.asp?report=36.

a consumer reports study indicated, however, that some technology protection companies refuse to disclose their method of blocking or filtering and their list of blocked sites, although users can submit web addresses to check against blocked lists in some cases. see digital chaperones for kids: which internet filters protect the best? which get in the way?, consumer reports at 2 (march 2001). another report indicates that technology protection tools can require a fair amount of technical expertise in order to be manipulated successfully, such as an understanding of how to unblock sites, adjust tools for different levels of access, and examine and interpret log files. trevor shaw, what's wrong with cipa, e‑school news (march 1, 2001), available at http://www.eschoolnews.com/features/cipa/cipa3.cfm.

the national research council (nrc) of the national academy of sciences recently released a report describing the social and educational strategies, technology‑based tools, and legal and regulatory approaches to protect children from inappropriate material on the internet. see youth, pornography, and the internet, committee to study tools and strategies for protecting kids from pornography and their applicability to other inappropriate internet content, national research council (nrc report) (may 2, 2002), available at http://bob.nap.edu/html/youth_internet/es.html.

among other things, the nrc report concludes that perhaps the most important social and educational strategy for ensuring safe online experiences for children is responsible adult involvement and supervision. id. at es‑7, 209. this strategy includes families, schools, libraries, and other organizations developing acceptable use policies to provide explicit guidelines about how individuals will conduct themselves online that will serve as a framework within which children can become more responsible for making better choices. id. at 218. the report notes that acceptable use policies are most effective when developed jointly with schools and communities. id. at 219.

the report suggests that acceptable use policies are not without problems, including how to avoid the "one size fits all" problem that may arise in trying to craft a policy that is appropriate for young children as well as teenagers. id. at 219‑220. the nrc report also discusses the ways that technology provides parents and other responsible adults with additional choices as to how best to protect children from inappropriate material on the internet. id. at es‑8, 255‑304. the report notes, however, that filtering/blocking tools are all imperfect in that they may "overblock" otherwise appropriate material or "underblock" some inappropriate material. id. at 259‑266.
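the "overblock"/"underblock" trade-off described above is what the "query sample of urls" method used in the published filter tests (see appendix iii) measures: a filter is run over a labeled sample of urls and its two kinds of errors are counted. the sketch below is purely illustrative; the site names, labels, and blocking decisions are hypothetical, not data from the report.

```python
# Illustrative sketch of a "query sample of URLs" effectiveness test.
# All site names and labels below are hypothetical examples.

def filter_error_rates(blocked, labels):
    """Compute (overblock_rate, underblock_rate) for a filter.

    blocked -- set of URLs the filter blocked
    labels  -- dict mapping each sampled URL to True if inappropriate
    """
    inappropriate = {url for url, bad in labels.items() if bad}
    appropriate = set(labels) - inappropriate
    # Overblocking: appropriate URLs the filter wrongly blocked.
    overblocked = appropriate & blocked
    # Underblocking: inappropriate URLs the filter failed to block.
    underblocked = inappropriate - blocked
    return (len(overblocked) / len(appropriate),
            len(underblocked) / len(inappropriate))

# Hypothetical four-URL sample: True means the URL is inappropriate.
labels = {"site-a": False, "site-b": True, "site-c": True, "site-d": False}
blocked = {"site-b", "site-d"}  # the filter's decisions on the sample
over, under = filter_error_rates(blocked, labels)
# site-d is wrongly blocked (overblock); site-c slips through (underblock).
```

a filter rated "effective" in such a test shows low rates on both measures; the nrc report's point is that no current tool drives both to zero.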

specific questions

in an effort to enhance ntia's understanding of the present state of technology protection measures and internet safety policies, ntia solicits responses to the following questions. ntia requests that interested parties submit written comments on any issue of fact, law, or policy that may provide information that is relevant to this evaluation. commenters are invited to discuss any relevant issue, regardless of whether it is identified below. to the extent possible, please provide copies of studies, surveys, research, or other empirical data referenced in responses.

evaluation of available technology protection measures

section 1703(a)(1) of the act requires ntia to evaluate whether or not currently available technology protection measures, including commercial internet blocking and filtering software, adequately address the needs of educational institutions.

1. discuss whether available technology protection measures adequately address the needs of educational institutions.

2. is the use of particular technologies or procedures more prevalent than others?

3. what technology, procedure, or combination has had the most success within educational institutions?

4. please explain how the technology protection products block or filter prohibited content (such as "yes" lists (appropriate content), "no" lists (prohibited content), human review, technology review based on phrase or image, or other methods). explain whether these methods successfully block or filter prohibited online content and whether one method is more effective than another.

5. are there obstacles to or difficulties in obtaining lists of blocked or filtered sites or the specific criteria used by technology companies to deny or permit access to certain web sites? explain.

6. do technology companies readily add or delete specific web sites from their blocked lists upon request? please explain your answer.

7. discuss any factors that were considered when deciding which technology tools to use (such as training, cost, technology maintenance and upgrades, or other factors).

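as context for question 4 above, the three automated methods it names can be sketched in a few lines: a "yes" list admits only known-appropriate sites, a "no" list blocks only known-prohibited sites, and phrase-based review inspects page text. everything below is a hypothetical illustration, not any vendor's actual product; the hosts and phrase are invented.

```python
# Minimal sketches of the three blocking methods named in question 4.
# The hosts and phrases here are invented placeholders.

YES_LIST = {"museum.example", "library.example"}   # known appropriate
NO_LIST = {"badsite.example"}                      # known prohibited
BANNED_PHRASES = ("prohibited phrase",)            # for content review

def allowed_by_yes_list(host):
    # Most restrictive: anything not explicitly approved is blocked.
    return host in YES_LIST

def allowed_by_no_list(host):
    # Least restrictive: only explicitly listed hosts are blocked.
    return host not in NO_LIST

def allowed_by_phrase_review(page_text):
    # Content-based: block a page whose text contains a banned phrase.
    text = page_text.lower()
    return not any(phrase in text for phrase in BANNED_PHRASES)
```

the methods fail differently: a "yes" list tends to overblock new legitimate sites, a "no" list tends to underblock sites not yet listed, and phrase matching can do either depending on the phrase list.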

fostering the development of technology measures

section 1703(a)(2) directs ntia to initiate a notice and comment proceeding to make recommendations on how to foster the development of technology measures that meet the needs of educational institutions.

1. are current blocking and filtering methods effectively protecting children or limiting their access to prohibited internet activity?

2. if technologies are available but are not used by educational institutions for other reasons, such as cost or training, please discuss.

3. what technology features would better meet the needs of educational institutions trying to block prohibited content?

4. can currently available filtering or blocking technology adjust to accommodate all age groups from kindergarten through grade twelve? are these tools easily disabled to accommodate bona fide and other lawful research? are these tools easily dismantled?

current internet safety policies

section 1703(a)(3) requires ntia to evaluate the development and effectiveness of local internet safety policies currently in operation that were established with community input.

1. are internet safety policies an effective method of filtering or blocking prohibited material consistent with the goals established by educational institutions and the community? if not, please discuss the areas in which the policies do not effectively meet the goals of the educational institutions and/or community.

2. please discuss whether and how the current policies could better meet the needs of the institutions and the community. if possible, provide specific recommendations.

3. are educational institutions using a single technology protection method or a combination of blocking and filtering technologies?

4. describe any best practices or policies that have been effective in ensuring that minors are protected from exposure to prohibited content. please share practices proven unsuccessful at protecting minors from exposure to prohibited content.

dated: may 22, 2002.

 kathy d. smith,

chief counsel, national telecommunications and  information administration.

 [fr doc. 02‑13286 filed 5‑28‑02; 8:45 am]

______________________________


appendix ii:  list of commenters

american center for law and justice (aclj)
american civil liberties union (aclu)
american library association
center for democracy and technology (cdt)
cleanweb.net
charles m. bailey public library, winthrop, maine
consortium for school networking
dobox, inc.
david duggan
e-mints
east brunswick public library, east brunswick, nj
electronic privacy information center (epic)
evanston public library, evanston, illinois
seth finkelstein
florida gulf coast university
fort morgan public library, fort morgan, colorado
free expression policy project (fepp)
grayson county public library
daniel s. hahn
international society for technology in education (iste)
jefferson-lewis boces
joseph mcclane
leo l. mosier
kidsnet, inc.
las vegas-clark county library district
meadowbrook high school library
mid-atlantic regional technology in education consortium (mar*tec)
morality in media (mim)
n2h2, inc.
national education association (nea)
palo alto unified school district, palo alto, california
rebecca ramsby
responsible netizen institute (nancy willard)
st. pius x school, urbana, iowa
vericept corporation
kristen wallace
wiredsafety.org (parry aftab)

appendix iii:  filtering effectiveness tests cited in n2h2 comments to the ntia

test                 date         product          effectiveness   method
pc week              4/7/1995     websense         mixed           query sample of urls
pc magazine          11/7/1995    cybersitter      mixed           query sample of urls
pc magazine          11/7/1995    net nanny        ineffective     query sample of urls
pc magazine          11/7/1995    surfwatch        mixed           query sample of urls
internet world       9/1/1996     cyber patrol     effective       query sample of urls
internet world       9/1/1996     cybersitter      mixed           query sample of urls
internet world       9/1/1996     intergo          effective       query sample of urls
internet world       9/1/1996     net nanny        ineffective     query sample of urls
internet world       9/1/1996     net shepherd     mixed           query sample of urls
internet world       9/1/1996     specs for kids   effective       query sample of urls
internet world       9/1/1996     surfwatch        mixed           query sample of urls
pc magazine          4/8/1997     cyber patrol     effective       query sample of urls
pc magazine          4/8/1997     cybersitter      effective       query sample of urls
pc magazine          4/8/1997     cybersnoop       effective       query sample of urls
pc magazine          4/8/1997     net nanny        effective       query sample of urls
pc magazine          4/8/1997     rated pg         effective       query sample of urls
pc magazine          4/8/1997     surfwatch        effective       query sample of urls
pc magazine          4/8/1997     x-stop           effective       query sample of urls
consumer reports     5/1/1997     cyber patrol     ineffective     query sample of urls
consumer reports     5/1/1997     cybersitter      ineffective     query sample of urls
consumer reports     5/1/1997     net nanny        ineffective     query sample of urls
consumer reports     5/1/1997     surfwatch        ineffective     query sample of urls
pc magazine          5/6/1997     little brother   effective       query sample of urls
pc magazine          5/6/1997     on guard         effective       query sample of urls
pc magazine          5/6/1997     smartfilter      effective       query sample of urls
pc magazine          5/6/1997     surfwatch        effective       query sample of urls
pc magazine          5/6/1997     websense         effective       query sample of urls
infoworld            8/18/1997    websense         effective       query sample of urls
pc world             10/1/1997    cyber patrol     mixed           query sample of urls
pc world             10/1/1997    cybersitter      effective       query sample of urls
pc world             10/1/1997    net nanny        mixed           query sample of urls
pc world             10/1/1997    net shepherd     mixed           query sample of urls
pc world             10/1/1997    surfwatch        effective       query sample of urls
computer shopper     11/1/1997    cybersitter      effective       query sample of urls
macworld             11/1/1997    cyber patrol     effective       query sample of urls
macworld             11/1/1997    surfwatch        effective       query sample of urls
macworld             11/1/1997    x-stop           effective       query sample of urls
internet magazine    12/1/1997    cyber patrol     effective       query sample of urls
internet magazine    12/1/1997    cyber snoop      effective       query sample of urls
internet magazine    12/1/1997    cybersitter      effective       query sample of urls
internet magazine    12/1/1997    n2h2             effective       query sample of urls
internet magazine    12/1/1997    safesurf         effective       query sample of urls
internet magazine    12/1/1997    surfwatch        effective       query sample of urls
internet magazine    12/1/1997    websense         effective       query sample of urls
internet magazine    12/1/1997    x-stop           effective       query sample of urls
infoworld            2/16/1998    cyber sentinel   effective       query sample of urls
pc magazine          3/24/1998    cyber patrol     effective       query sample of urls
pc magazine          3/24/1998    cyber sentinel   effective       query sample of urls
pc magazine          3/24/1998    cyber sitter     effective       query sample of urls

appendix iv:  sample acceptable use policies

  1. fairfax county public schools

 

acceptable use policy for network access

 

the information systems and internet access available through fcps are provided to support learning, enhance instruction, and support school system business practices.

fcps information systems are operated for the mutual benefit of all users. the use of the fcps network is a privilege, not a right.  users should not do, or attempt to do, anything that might disrupt the operation of the network or equipment and/or interfere with the learning of other students or work of other fcps employees. the fcps network is connected to the internet, a network of networks, which enables people to interact with hundreds of thousands of networks and computers.

all access to the fcps network shall be preapproved by the principal or program manager. the school or office may restrict or terminate any user's access, without prior notice, if such action is deemed necessary to maintain computing availability and security for other users of the systems. other disciplinary action may be imposed as stated in the fairfax county public schools student responsibilities and rights (sr&r) document.

 

respect for others

users should respect the rights of others using the fcps network by:

        using assigned workstations as directed by the teacher.

        being considerate when using scarce resources.

        always logging off workstations after finishing work.

        not deliberately attempting to disrupt system performance or interfere with the work of other users.

        leaving equipment and room in good condition for the next user or class.

 

ethical conduct for users

accounts on the fcps network, both school-based and central, are considered private, although absolute security of any data cannot be guaranteed.  it is the responsibility of the user to:

        use only his or her account or password.  it is a violation to give access to an account to any other user.

        recognize and honor the intellectual property of others; comply with legal restrictions regarding plagiarism and the use and citation of information resources.

        not read, modify, or remove files owned by other users.

        restrict the use of the fcps network and resources to the mission or function of the school system.  the use of the fcps network for personal use or for private gain is prohibited.

        help maintain the integrity of the school information system.  deliberate tampering or experimentation is not allowed, which includes the use of fcps network and resources to illicitly access, tamper with, or experiment with systems outside fcps.

respect for property

the only software, other than students' projects, to be used on school computers or the school network are those products that the school may legally use. copying copyrighted software without full compliance with terms of a preauthorized licensing agreement is a serious federal offense and will not be tolerated. modifying any copyrighted software or borrowing software is not permitted.

        do not modify or rearrange keyboards, individual key caps, monitors, printers, or any other peripheral equipment.

        report equipment problems immediately to teacher or program manager.

        leave workstations and peripherals in their designated places.

appropriate use

        do not use offensive, obscene, or harassing language when using any fcps network system.

        information may not be posted if it: violates the privacy of others, jeopardizes the health and safety of students, is obscene or libelous, causes disruption of school activities, plagiarizes the work of others, is a commercial advertisement, or is not approved by the principal or program manager.

        users will not change or delete files belonging to others.

        real-time messaging and online chat may only be used with the permission of the teacher or program manager.

        students are not to reveal personal information (last name, home address, phone number) in correspondence with unknown parties.

        users exercising their privilege to use the internet as an educational resource shall accept the responsibility for all material they receive.

        users are prohibited from accessing portions of the internet that do not promote the instructional mission of fcps.

        all student-produced web pages are subject to approval and ongoing review by the responsible teacher and/or principal. all web pages should reflect the mission and character of the school.

related documents: student responsibilities and rights; regulation 6410.2



  1. lake washington school district

lake washington school district

computer equipment appropriate use procedures

purpose

the lake washington school district provides a wide range of computer resources to its students and staff for the purpose of advancing the educational mission of the district. these resources are provided and maintained at the district's -- and, therefore, the public's -- expense and are to be used by members of the school community with respect for the public trust through which they have been provided.

the appropriate use procedures that follow provide details regarding the appropriate and inappropriate use of district computers. the procedures do not attempt to articulate all required or proscribed behavior by users. successful operation of the district computer network requires that all users conduct themselves in a responsible, decent, ethical, and polite manner while using the district computers. you, the user, are ultimately responsible for your actions in accessing and using district computers and the district computer network. as a user of district computers, you are expected to review and understand the guidelines and procedures in this document.

appropriate use procedures

scope

the following procedures apply to all district staff and students, and cover all district computer equipment, including any desktop or laptop computers provided to staff, the district computer network ("lwsdnet"), and any computer software licensed to the district ("district computers").

appropriate use

the district expects everyone to exercise good judgment and use the computer equipment in a professional manner. your use of the equipment is expected to be related to the district's goals of educating students and/or conducting district business. the district recognizes, however, that some personal use is inevitable, and that incidental and occasional personal use that is infrequent or brief in duration is permitted so long as it occurs on personal time, does not interfere with district business, and is not otherwise prohibited by district policy or procedures.

use of district software: district software is licensed to the district by a large number of vendors and may have specific license restrictions regarding copying or using a particular program. users of district software must obtain permission from the district prior to copying or loading district software onto any computer, whether the computer is privately owned or is a district computer.

use of non‑district software: prior to loading non‑district software onto district computers (including laptops, desktops, and lwsdnet), a user must receive permission from the district. the district will create a list of "authorized software" programs that may be loaded onto district laptops without specific permission. for example, a user will be able to load software onto a laptop that is necessary for a user to access a personal internet service for the purpose of remotely accessing the district's email network. all software must be legally licensed by the user prior to loading onto district equipment. the unauthorized use and/or copying of software is illegal:

"it is against lwsd practice for staff or students to copy or reproduce any licensed software on lwsd computing equipment, except as expressly permitted by the specific software license. unauthorized use of software is regarded as a serious matter and any such use is without the consent of lwsd."

lwsd directive 1/29/1990

remote access: the district provides remote access to its internal email network for the convenience of its staff. users may access the district's email network over a standard internet connection by using either a district laptop or a privately‑owned computer. district laptops also have the ability to use the district's email network "off‑line." a user's email folders are stored locally on the laptop. therefore, a user may read, delete, and reply to district email, and create new email, without a direct connection to the network. any reply or new email created by the user will be sent to the recipient the next time the user connects to the network. also, at the time of the direct connection to the network, email delivered while the user was off‑line will be immediately downloaded to the laptop.

prohibited uses: district computers may not be used for the following purposes:

-    commercial use: using district computers for personal or private gain, personal business, or commercial advantage is prohibited.

-    political use: using district computers for political purposes in violation of federal, state, or local laws is prohibited. this prohibition includes using district computers to assist or to advocate, directly or indirectly, for or against a ballot proposition and/or the election of any person to any office. the use of district computers for the expression of personal political opinions to elected officials is prohibited. only those staff authorized by the superintendent may express the district's position on pending legislation or other policy matters.

-    illegal or indecent use: using district computers for illegal, harassing, vandalizing, inappropriate, or indecent purposes (including accessing, storing, or viewing pornographic, indecent, or otherwise inappropriate material), or in support of such activities is prohibited. illegal activities are any violations of federal, state, or local laws (for example, copyright infringement, publishing defamatory information, or committing fraud). harassment includes slurs, comments, jokes, innuendoes, unwelcome compliments, cartoons, pranks, or verbal conduct relating to an individual that (1) have the purpose or effect of creating an intimidating, hostile, or offensive environment; (2) have the purpose or effect of unreasonably interfering with an individual's work or school performance; or (3) interfere with school operations. vandalism is any attempt to harm or destroy the operating system, application software, or data. inappropriate use includes any violation of the purpose and goal of the network. indecent activities include violations of generally accepted social standards for use of publicly‑owned and operated equipment.

-    non‑district employee use: district computers may only be used by district staff and students, and others expressly authorized by the district to use the equipment.

-    disruptive use: district computers may not be used to interfere with or disrupt other users, services, or equipment. for example, disruptions include distribution of unsolicited advertising ("spam"), propagation of computer viruses, distribution of large quantities of information that may overwhelm the system (chain letters, network games, or broadcasting messages), and any unauthorized access to or destruction of district computers or other resources accessible through the district's computer network ("cracking" or "hacking").

privacy

district computers, the internet, and use of email are not inherently secure or private. for example, the content of an email message, including attachments, is more analogous to a letter or official memo than to a telephone call, since a record of the contents of the email may be preserved by the sender, the recipient, any parties to whom the email may be forwarded, or by the email system itself. it is important to remember that once an email message is sent, the sender has no control over where it may be forwarded, and deleting a message from the user's computer does not necessarily delete it from the district computer system. in some cases, emails have also been treated as public records in response to a public records disclosure request. likewise, files such as internet "cookies" (explained more fully below) may be created and stored on a computer without the user's knowledge. users are urged to be caretakers of their own privacy and not to store sensitive or personal information on district computers. the district may need to access, monitor, or review electronic data stored on district computers, including email and internet usage records.

while the district respects the privacy of its staff and while the district currently does not have a practice of monitoring or reviewing electronic information, the district reserves the right to do so for any reason. the district may monitor and review the information in order to analyze the use of systems or compliance with policies, conduct audits, review performance or conduct, obtain information, or for other reasons. the district reserves the right to disclose any electronic message to law enforcement officials, and under some circumstances, may be required to disclose information to law enforcement officials, the public, or other third parties, for example, in response to a document production request made in a lawsuit involving the district or by a third party against the user or pursuant to a public records disclosure request.

discipline

the appropriate use procedures are applicable to all users of district computers and refer to all information resources, whether individually controlled, shared, stand‑alone, or networked. disciplinary action, if any, for students, staff, and other users shall be consistent with the district's standard policies and practices. violations may constitute cause for revocation of access privileges, suspension of access to district computers, other school disciplinary action, and/or appropriate legal action. specific disciplinary measures will be determined on a case‑by‑case basis.

care for district computers

users of district computers are expected to respect the district's property and to be responsible in using the equipment. users are to follow any district instructions regarding maintenance or care of the equipment. users may be held responsible for any damage caused by their intentional or negligent acts in caring for district computers under their control. the district is responsible for any routine maintenance or standard repairs to district computers. users are expected to notify the district promptly of any need for service.

users are not to delete or add software to district computers without district permission. due to different licensing terms for different software programs, it is not valid to assume that if it is permissible to copy one program, then it is permissible to copy others.

if a district laptop is lost, damaged, or stolen while under the control of a user, the user is expected to file a claim under his/her insurance coverage, where coverage is available. except in cases of negligent or intentional loss or damage, the district will cover out‑of‑pocket expenses.

using email and the internet wisely

using email wisely

•    email encourages informal communication because it is easy to use. however, unlike a telephone call, email creates a permanent record that is archived and often transmitted to others. remember that even when you delete an email from your mailbox, it may still exist in the system for some period of time.

•    be circumspect about what you send and to whom. do not say anything in an email that you would not want to see republished throughout the district, in internet email, or on the front page of the eastside journal. remember that email invites sharing; a push of a button will re‑send your message worldwide, if any recipient (or hacker) decides to do so. what you say can be republished and stored by others.

•    beware of the "reply all" button. often your message needs to go to only one individual: is the message really appropriate for (and should it really take the time of) everyone on the address list?

•    you can create liability for yourself and the district. for example, within or outside the district, if you "publish" (type or re‑send) words that defame another individual or disparage another individual or institution, if you upload, download, or re‑send copyrighted or pornographic material, if you use email to harass or discriminate against someone, or if you send private information or data about someone, you may violate applicable laws and district policy. make sure none of your activities violate any law or policy.

•    please keep in mind that because of intermediary server problems and other potential delays, internet email can sometimes take anywhere from five minutes to several days to arrive. it may not be the best means to send time‑sensitive information.

•    finally, beware of sending attachments. they may arrive garbled if the recipient is using a different email system.

•    email attachments can introduce viruses into the district system, and you can introduce a virus into a recipient's system by forwarding an infected attachment. this is especially likely if the attachment arrives from an unknown source via the internet. if you do not know the sender of internet email, consider routing the message to the mis staff, who can open the attachment for you on a computer isolated from the district network. while that should prevent activating a virus, it will not stop certain other infections (e.g., a logic bomb). please do not open attached files ending in ".exe," ".bat," or ".com," as these files may be viruses or programs designed to delete data from the computer.

using internet access wisely

•    be circumspect about where you go and what you do. do not visit any site or download or share any material that might cause anyone to question your professionalism, or the district's.

•    read the "license" or "legal" contract terms on every site. do not purport to bind the district to any license or other contract. if you make an agreement on your own behalf, do not violate that agreement using district equipment or a district internet account. do not assume that just because something is on the internet, you may copy it. as a general rule, assume that everything is copyrighted and do not copy it unless there is a notice on the site stating that you may do so. for example, if you see a clever cartoon, assume that you may not copy it. government documents are an exception (you may copy them), but you must confirm that the source is the "government" and not a government‑related entity such as the post office.

•    be aware of the "do you want a cookie?" messages (if you have configured your browser to display such messages). if you answer yes, whatever activity you are engaged in will be logged by the site owner to help it or its advertisers develop a profile about you or the district. it is possible that your browser is set to accept cookies without asking you each time.

•    you can create liability for yourself and the district. for example, if you "publish" (type or re‑send) words that defame or disparage another individual or institution, if you upload, download, or re‑send copyrighted or pornographic material, if you use the internet to harass or discriminate against someone, or if you provide private information or data about someone, you may violate applicable laws or district policy. make sure none of your activities violate any law or policy.

•    do not engage in "spamming" or other activities that could clog or congest internet networks.


lake washington school district

computer user agreement and release form

as a condition of using the lake washington school district ("district") computer equipment, including the computer network, desktop computers, and laptop computers ("district computers"), i understand and hereby agree to the following:

1.     awareness of rules. i have reviewed, understand, and agree to abide by the district computer appropriate use procedures, the internet code of conduct, and this agreement.

2.     district property. i understand that district computers are the property of the district and are devoted to the educational mission of the district. therefore, my use of the district computers, including the use of the internet and of the electronic mail systems, is a privilege and not a right.

3.   personal responsibility. i am responsible for my use of the district computers. i understand that my communications over the internet and through email may be traceable to the district or to me. although the district currently allows incidental and personal use of district computers that is infrequent or brief in duration, i will always use district computers in a professional manner. my privilege to use district computers may be revoked, suspended, or limited by the district at any time for any violation of the district computer appropriate use procedures, internet code of conduct, this agreement, or any other violation of district policies or federal, state, or local laws. the district will be the sole arbiter of what constitutes a violation of the above rules.

4.   privacy. while the district does not currently have a practice of regular monitoring or reviewing electronic information, the district reserves the right to do so for any reason, including (without limitation) to analyze district computer use, perform audits, review performance or conduct, and/or obtain information. i understand that the district has the right to review any material stored on or transmitted through district computers, including email, internet files (including web pages and usage logs), and software. the district may edit or remove any material which it, in its sole discretion, believes may be unlawful, indecent, obscene, abusive, or otherwise inappropriate.

5.     no warranty. i agree that my use of the district computers is at my own risk. the district does not guarantee or warrant in any way the performance or quality of district computers or any network accessible through district computers, nor does the district warrant that such networks or equipment will meet any specific requirements that i may have. the district will not be liable for any direct or indirect, incidental, or consequential damages (including lost or irrecoverable data or information) sustained or incurred in connection with the use, operation, or inability to use district computers.

6.   release. in consideration for the privilege of using district computers, i hereby release lake washington school district, its directors, employees, agents, and affiliates from any and all claims and damages of any nature arising from my use of, or inability to use, the district computers.

user title_________________________________________

user organization lake washington school district

school____________________________________________

user day phone____________________________________

  _____________________________________ 

  signature of user

  _____________________________________

  printed name of user

  _____________________________________

  date signed

  to be signed by a member of the building internet training team:

   i certify that the above person has completed the basic training necessary to qualify for an internet account.

  _________________________________                            __________________________________

 printed name                                                        signature



endnotes:

[1] children's internet protection act (cipa), pub. l. no. 106-554 (2000) (codified at 20 u.s.c. §§ 6801, 6777, 9134 (2003); 47 u.s.c. § 254 (2003)).

[2] cipa, supra note 1.

[3] u.s. department of commerce, national telecommunications and information administration, a nation online: how americans are expanding their use of the internet at 1, 13 (feb. 2002), available at http://www.ntia.doc.gov/ntiahome/dn/index.html.

[4] national center for education statistics, internet access in u.s. public schools and classrooms: 1994-2001 at 3 (september 2002), available at http://nces.ed.gov/pubs2002/2002018.pdf.

[5] corporation for public broadcasting, connected to the future (march 2003).

[6] see, e.g., u.s. department of commerce, national telecommunications and information administration, how access benefits children:  connecting our kids to the world of information (sept. 1999).

[7] the commission on child online protection act final report to congress at 1 (oct. 20, 2001).

[8] national research council, youth, pornography, and the internet, committee to study tools and strategies for protecting kids from pornography and their applicability to other inappropriate internet content at 387 (may 2002).

[9] id.

[10] the commission on child online protection act final report to congress (oct. 20, 2001).

[11] id. at 19.

[12] id. at 21.

[13] id. at 22.

[14] id. at 34.

[15] id. at 25-26.

[16] see digital chaperones for kids:  which internet filters protect the best?  which get in the way? consumer reports, mar. 2001, at 2.

[17] see, e.g., www.getnetwise.org and www.netsmart.org.

[18] the communications decency act of 1996, pub. l. no. 104-104, 110 stat. 56 (codified at 47 u.s.c. § 223 (2003)).

[19] reno v. american civil liberties union, 521 u.s. 844 (1997).

[20] child online protection act (copa), pub. l. no. 105-277, 112 stat. 2681-736 (codified at 47 u.s.c. § 231 (2003)).

[21] the american civil liberties union v. reno, 31 f. supp. 2d 473 (e.d. pa. 1999).

[22] the american civil liberties union v. reno, 217 f.3d 162 (3d cir. 1999).

[23] ashcroft v. the american civil liberties union, 535 u.s. 564 (2002).

[24] id. at 566.

[25] the american civil liberties union v. ashcroft, 322 f.3d 240 (3d cir. 2003) (holding that the terms �material harmful to minors� and �for commercial purpose,� as defined, were not sufficiently narrowly tailored).

[26] cipa, supra note 1.

[27] see american library association v. united states, no. 01-cv-1303 (e.d. pa. march 20, 2001); multnomah county public library v. united states, no. 01-cv-1332 (e.d. pa. march 20, 2001).

[28] american library association v. united states, 201 f. supp. 2d 401 (e.d. pa. 2002).

[29] united states v. american library association, 123 s. ct. 551 (2002).

[30] united states v. american library association, 123 s. ct. 2297 (2003).

[31] id.

[32] see in the matter of federal-state joint board on universal service, children's internet protection act, cc docket no. 96-45, order, fcc 03-188 (rel. july 24, 2003) (implementation timing modifications).

[33] request for comment on the effectiveness of internet protection measures and safety policies, 67 fed. reg. 37396 (may 24, 2002).

[34] see appendix ii for list of commenters.  see www.ntia.doc.gov/ntiahome/ntiageneral/cipacomments/index.html for copies of all comments.  comments are also on file at the national telecommunications and information administration.  page numbers refer to the location in the comments on file at ntia.

[35] cipa, supra note 1.

[36] comment by center for democracy and technology at 5 (no date) [hereinafter cdt]; comment by leo mosier at 1 (aug. 13, 2002); comment by melora ranney, charles m. bailey public library at 1 (aug. 10, 2002) [hereinafter ranney]; comment by cathy bosley, fort morgan public library, fort morgan, co at 1 (aug. 10, 2002) [hereinafter bosley]; comment by robert peters, morality in media at 1, 2 (aug. 14, 2002) [hereinafter mim]; comment by nancy ledeboer, las vegas-clark county library district at 1, 2 (aug. 21, 2002) [hereinafter ledeboer]; comment by american center for law and justice at 3 (aug. 26, 2002) [hereinafter aclj]; comment by nancy willard at 7 (aug. 27, 2002) [hereinafter willard].

[37] comment by shelly murray at 1 (aug. 1, 2002); bosley, supra note 36, at 1.

[38] comment by gina montgomery at 1 (june 4, 2002).

[39] bosley, supra note 36, at 1.

[40] id.

[41] comment by n2h2 at 13 (aug. 27, 2002) [hereinafter n2h2]; aclj, supra note 36, at 3. the results from these tests have been compiled into the report, "the facts on filters," authored by david burt. the labs conducting these tests include: zd net labs, consumer reports labs, camden associates, iw labs, eweek labs, the pc world test center, the info world test center, macworld labs, network world test alliance, and real-world labs.

[42] aclj, supra note 36, at 3.

[43] ntia's request for comment did not seek comments on the constitutionality of the cipa statute or its provisions. several commenters directed ntia to the national research council study, youth, pornography, and the internet, released in may 2002. comment by richard cate, state education department, university of the state of new york at 2 (aug. 22, 2002) [hereinafter cate]; cdt, supra note 36, at 3, 4; comment by parry aftab, wiredsafety.org at 2 (july 15, 2002) [hereinafter aftab]; comment by anita carter, palo alto unified school district at 1 (aug. 10, 2002); comment by american civil liberties union and electronic privacy information center at 1-2 (aug. 27, 2002) [hereinafter aclu]. among other things, the report studied the many existing ways to block content with technology. the section analyzing filters explains that filters are subject to two kinds of inevitable errors: overblocking and underblocking. national research council, supra note 8, at 51, 58.

[44] american library association v. united states, 201 f. supp.2d 401, 431-432 (e.d. pa. 2002).

[45] cdt, supra note 36, at 4. 

[46] comment by national education association at 4 (aug. 27, 2002) [hereinafter nea]; international society for technology in education at 4 (aug. 27, 2002) [hereinafter iste].

[47] the pew internet and american life project, the digital disconnect:  the widening gap between internet-savvy students and their schools (august 14, 2002), available at http://www.pewinternet.org/reports/toc.asp?report=67.

[48] american library association v. united states, 201 f. supp.2d 401 (e.d. pa. 2002).

[49] id.

[50] cdt, supra note 36, at 3; comment by consortium for school networking at 4 (aug. 16, 2002) [hereinafter cosn]; comment by american library association at 2 (aug. 26, 2002) [hereinafter ala]; aclu, supra note 43, at 1-2.

[51] american library association v. united states, 201 f. supp.2d at 418.

[52] id.

[53] id.

[54] id.

[55] id. at 436-437.

[56] id. at 442-443.

[57] id.

[58] id. at 446-448.

[59] id. at 442.

[60] id. at 446-448.

[61] cosn, supra note 50, at 14, 15.

[62] the consortium for school networking, safeguarding the wired schoolhouse (june 2001) at 11.

[63] id. at 27.

[64] cipa section 1711(a)(3) (codified at 20 u.s.c. § 6777(c) (2003)); cipa section 1712(a)(3) (codified at 20 u.s.c. § 9134(b) (2003)); cipa section 1721(a) (codified at 47 u.s.c. § 254(h)(5)(d) (2003)); cipa section 1721(b) (codified at 47 u.s.c. § 254(h)(6)(d) (2003)).

[65] american library association v. united states, 201 f. supp.2d 401, 484-486 (e.d. pa. 2002).

[66] iste, supra note 46, at 9; nea, supra note 46, at 8; cosn, supra note 50, at 10, 11.

[67] disabling during certain use, cipa section 1711(a)(3) (codified at 20 u.s.c. § 6777(c) (2003)) (stating that an administrator, supervisor, or person authorized by the responsible authority under paragraph (1) may disable the technology protection measure concerned to enable access for bona fide research or other lawful purposes); cipa section 1712(a)(3) (codified at 20 u.s.c. § 9134(b) (2003)) (stating that an administrator, supervisor, or other authority may disable a technology protection measure under paragraph (1) to enable access for bona fide research or other lawful purposes).

[68] disabling during adult use, cipa section 1721(a) (codified at 47 u.s.c. § 254(h)(5)(d) (2003)) (stating that an administrator, supervisor, or other person authorized by the certifying authority under subparagraph (a)(i) may disable the technology protection measure concerned, during use by an adult, to enable access for bona fide research or other lawful purpose); cipa section 1721(b) (codified at 47 u.s.c. § 254(h)(6)(d) (2003)) (stating that an administrator, supervisor, or other person authorized by the certifying authority under subparagraph (a)(i) may disable the technology protection measure concerned, during use by an adult, to enable access for bona fide research or other lawful purpose).

[69] iste, supra note 46, at 11.

[70] cosn, supra note 50, at 15.

[71] willard, supra note 36, at 2; aclu, supra note 43, at 1, 2; cdt, supra note 36, at 2.

[72] iste, supra note 46, at 5; nea, supra note 46, at 2.

[73] id.

[74] nea, supra note 46, at 2.

[75] comment by dr. patrick greene, florida gulf coast university at 1 (aug. 8, 2002) [hereinafter greene]; willard, supra note 36, at 1.

[76] n2h2, supra note 41, at 7.

[77] id.; comment by nicole toomey davis, dobox at 1 (july 25, 2002) [hereinafter dobox].

[78] n2h2, supra note 41, at 8.

[79] id. at 12.

[80] id.

[81] nea, supra note 46, at 5; n2h2, supra note 41, at 8.

[82] nea, supra note 46, at 5.

[83] comment by free expression policy project at 2 (aug. 26, 2002) [hereinafter fepp].

[84] ala, supra note 50, at 1.

[85] cdt, supra note 36, at 4.

[86] aclu, supra note 43, at 6; fepp, supra note 83, at 2; cosn, supra note 50, at 4.

[87] willard, supra note 36, at 6; nea, supra note 46, at 7.

[88] nea, supra note 46, at 4; iste, supra note 46, at 4.

[89] willard, supra note 36, at 3.  according to willard, when teachers direct students to use home computers, this practice impedes the education of students without home computers.  willard further argues that this situation results in an ineffective use of the expensive computer technology that has been installed in schools. 

[90] cosn, supra note 50, at 15.

[91] id.

[92] id.

[93] ledeboer, supra note 36, at 2. 

[94] dobox, supra note 77, at 3, 5; n2h2, supra note 41, at 31; comment by kidsnet at 3 (aug. 27, 2002) [hereinafter kidsnet].

[95] dobox, supra note 77, at 12.

[96] id. at 3.

[97] aclj, supra note 36, at 4.

[98] cate, supra note 43, at 3; iste, supra note 46, at 2.

[99] iste, supra note 46, at 2; willard, supra note 36, at 7.

[100] iste, supra note 46, at 9.

[101] previously, schools relied on a grant established in 1996 called the technology challenge fund.  the fund subsidized additional technology costs for schools not covered by the e-rate.  congress had allotted $200 million to the fund for the u.s. department of education to administer, but the fund expired on september 30, 2002.

[102] the federal communications commission, universal service for schools and libraries, available at http://www.fcc.gov/wcb/universal_service/schoolsandlibs.html.  the universal service administrative company (usac) administers the schools and libraries program, also called the e-rate program.  according to usac, approximately 82 percent of public schools and 10 percent of private schools received e-rate funding in the fiscal year (fy) 2000 funding cycle (july 1, 2000 through june 30, 2001) (using a 1997 database as the denominator).  public libraries also rely heavily on e-rate funding--57 percent of main public libraries received e-rate funding in fy 2000.  successful applicants receive discounts ranging from 20 percent to 90 percent, depending upon the household income level of students and whether the school or library is located in a rural or urban area.  the program is intended to assist local and state programs connecting schools and libraries to the internet.

[103] cosn, supra note 50, at 11; nea, supra note 46, at 5; iste, supra note 46, at 9.

[104] willard, supra note 36, at 6.

[105]  id.

[106] comment by mid atlantic regional technology in education consortium (mar*tec) at 1 (aug. 27, 2002) [hereinafter mar*tec].

[107] willard, supra note 36, at 7; comment by vericept at 1 (aug. 27, 2002) [hereinafter vericept].

[108] federal-state joint board on universal service, cc docket no. 96-45, report and order, fcc 01-120 (april 5, 2001).

[109] id. at 3, 4.

[110] national center for education statistics, supra note 4, at 10.

[111] see, e.g., mar*tec, supra note 106, at 1.

[112] iste, supra note 46, at 5.

[113] cosn, supra note 50, at 11.

[114] cate, supra note 43, at 1.

[115] comment by david duggan at 1, 3 (aug. 26, 2002) [hereinafter duggan].

[116] id. at 1.

[117] comment by janice bojda, evanston public library at 1 (aug. 27, 2002) [hereinafter bojda] (stating that libraries make web page selection choices using the same standards as they do to select materials in their hard copy selection).

[118] ranney, supra note 36, at 1.

[119] ledeboer, supra note 36, at 4.

[120] cate, supra note 43, at 2.

[121] id. at 3.

[122] cosn, supra note 50, at 15.

[123] iste, supra note 46, at 9.

[124] aclu, supra note 43, at 4 (stating that cipa requires technology protection measures to block images, yet image recognition technology is immature).

[125] ntia summarization of the four vendor comments should not be construed as an endorsement of any product. 

[126] dobox, supra note 77, at 1.

[127] kidsnet, supra note 94, at 1.

[128] n2h2, supra note 41, at 4.

[129] vericept, supra note 107, at 1.

[130] content filtering of the web gains foothold in corporate market, the wall street journal europe, april 11, 2001.

[131] id.

[132] ipo.com at http://www.ipo.com/ipoinfo/search.asp?p=ipo&srange=1900&pstart=1/1/1998 (last viewed march 2003; site no longer available).

[133] id.

[134] pep:  resources for parents, educators, and publishers, guide to parental controls/internet safety products, at http://www.microweb.com/pepsite/software/filters.html.

[135] iste, supra note 46, at 7.

[136] id.; willard, supra note 36, at 6; cosn, supra note 50, at 12.

[137] cosn, supra note 50, at 12; willard, supra note 36, at 6; dobox, supra note 77, at 1.

[138] willard, supra note 36, at 6.

[139]  id.

[140] cipa section 1703(a)(2), pub. l. no. 106-554 (2000).

[141] cipa, supra note 64.

[142] cate, supra note 43, at 2; cosn, supra note 50, at 15; iste, supra note 46, at 9.

[143] iste, supra note 46, at 5; comment by karen gillespie, grayson county public library at 1 (aug. 8, 2002); comment by janice friesen, emints at 2 (aug. 9, 2002) [hereinafter emints]; ranney, supra note 36, at 1; bosley, supra note 36, at 1; comment by jason stone, east brunswick public library, east brunswick, nj at 1 (aug. 14, 2002) [hereinafter ebpl]; ledeboer, supra note 36, at 2; bojda, supra note 117, at 1.

[144] iste, supra note 46, at 5.

[145] cosn, supra note 50, at 16; nea, supra note 46, at 2; mar*tec, supra note 106, at 1; ebpl, supra note 143, at 1; ledeboer, supra note 36, at 1; iste, supra note 46, at 13; cdt, supra note 36, at 3; cate, supra note 43, at 1.

[146] cosn, supra note 50, at 16.

[147] cate, supra note 43, at 1.

[148] id.

[149] ebpl, supra note 143, at 1; ledeboer, supra note 36, at 1.

[150] ledeboer, supra note 36, at 1.

[151] cosn, supra note 50, at 17; nea, supra note 46, at 2; aftab, supra note 43, at 19; iste, supra note 46, at 13; cdt, supra note 36, at 3.

[152] aftab, supra note 43, at 9-28.

[153] iste, supra note 46, at 13.

[154] id. at 5; nea, supra note 46, at 9.

[155] national research council, supra note 8, at 235.

[156] id.

[157] id. at 235-236.

[158] ebpl, supra note 143, at 1.

[159] mar*tec, supra note 106, at 1.

[160] aftab, supra note 43, at 19.

[161] nea, supra note 46, at 3.

[162] cdt, supra note 36, at 7.

[163] id. at 1.

[164] bojda, supra note 117, at 1.

[165] willard, supra note 36, at 8.

[166] id. at 4.

[167] emints, supra note 143, at 1.

[168] duggan, supra note 115, at 3.

[169] ledeboer, supra note 36, at 2.

[170] mim, supra note 36, at 1.

[171] aftab, supra note 43, at 35.

[172] mar*tec, supra note 106, at 1.

[173] nea, supra note 46, at 9.

[174] cosn, supra note 50, at 17;  emints, supra note 143, at 1.

[175] emints, supra note 143, at 1.

[176] id.

[177] some commenters expressed concern with sanctions that remove computer privileges from students.  such sanctions severely disadvantage students without home computers or internet access.

[178] emints, supra note 143, at 1.

[179] iste, supra note 46, at 2.

[180] id.

[181] id.

[182] greene, supra note 75, at 1.

[183] id.

[184] aclj, supra note 36, at 8.

[185] id.

[186] id.

[187] mar*tec, supra note 106, at 1.

[188] id.

[189] national research council, supra note 8, at 238-240.
