Revised Network Engineering Plan for NSF HPC Award #ANI-9810309

Kansas State University
April 2, 1999

C.3 Network Engineering Plan

The network engineering plan in the original HPC award application specified a connection to the vBNS. With the development of the Abilene Network, however, the Great Plains Network, and therefore Kansas State University, chose to connect to Abilene rather than to the vBNS. This revised engineering plan specifies how Kansas State University connected to the Abilene Network.

C.3.1 Current K-State Network Infrastructure

Network Core and Distribution

The core of Kansas State University's current network is a switched, full-duplex 100 Mbps collapsed Ethernet backbone connecting Cisco Systems, Inc. routers (see Figure 1).

The network is distributed to each campus building via three fiber stars, with multiple single-mode and multi-mode fiber cables running to each building. Each star is served by one or more routers connected to the backbone switch at 10 or 100 Mbps. Buildings with historically lower bandwidth requirements are connected to core routers at 10 Mbps, which are in turn connected to the backbone switch at 10 Mbps. Buildings with key campus servers and higher bandwidth needs are connected to high-performance core routers at full-duplex 100 Mbps, which are in turn connected to the core switch at full-duplex 100 Mbps. Network connections to each building are continually monitored to prioritize and anticipate the need for greater connectivity.
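As an illustration of how such monitoring can drive upgrade decisions, the short Python sketch below estimates a building link's utilization from two interface byte-counter samples of the kind an SNMP poller would collect from a router (ifInOctets). The counter values, poll interval, and link speed are hypothetical, and this is a sketch rather than the monitoring system actually deployed at K-State.

    COUNTER32_MAX = 2**32  # ifInOctets is a 32-bit counter and can wrap

    def utilization(octets_t0, octets_t1, interval_s, link_mbps):
        """Percent utilization of a link between two counter samples."""
        delta = (octets_t1 - octets_t0) % COUNTER32_MAX  # handle counter wrap
        bits_per_second = delta * 8 / interval_s
        return 100.0 * bits_per_second / (link_mbps * 1_000_000)

    # Hypothetical example: a building on a 10 Mbps router port, polled 60 s apart.
    print(f"{utilization(1_200_000, 41_200_000, 60, 10):.1f}% utilized")  # 53.3% utilized

Sustained readings near 100% on a 10 Mbps building connection would flag that building for an upgrade to a full-duplex 100 Mbps connection.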

Over the past two years, upgrades to the core and distribution network infrastructure have been made both in response to and in anticipation of high-speed connectivity needs on campus. Some upgrades were also initiated when K-State became a charter member of the Internet2 project. Upgrades included installation of the backbone switch, the addition of Cisco 7000 and 7513 routers in the core, and the replacement of shared hubs at building entry points with Ethernet switches capable of supporting multiple high-speed technologies (Fast Ethernet, 100VG, FDDI, and ATM in use, with Gigabit Ethernet planned). Most of these devices are Cisco Catalyst series switches.

Fiber optic cables connect several outlying campus centers, and leased T1 lines connect other remote sites, including the K-State campus in Salina, Kansas.

Building Infrastructure to the Servers and Desktops

Networks within buildings are a mixture of shared 10 Mbps Ethernet, switched 10 Mbps Ethernet, switched 100 Mbps Ethernet, 100VG, FDDI, and ATM. A Gigabit Ethernet network has been installed in the Department of Computing and Information Sciences to support research. The campus cable plant is a mixture of Category 3 and Category 5 twisted pair, coax, and fiber. All new installations are either Category 5 copper or, when distance is an issue, fiber optic cable.

Concurrent with the enhancements to the core and distribution segments of the campus network, infrastructure within buildings is being upgraded. Shared hubs are being replaced with Ethernet switches where performance problems have been identified, and all new installations use switches instead of hubs. As for the cable plant, a long-term project is underway to re-wire older buildings that have obsolete and problematic network wiring. Fiber infrastructure is being installed to reach new, state-of-the-art terminal rooms that distribute data throughout each building at speeds of at least 100 Mbps. Construction on the first two buildings is complete, with more planned for the near future. This design allows for flexible growth by distributing appropriate network technologies to the terminal rooms throughout the buildings in response to either localized or wide-area requirements.

Internet Connectivity

Wide-area networking to the commercial Internet and to other Kansas educational institutions, including the University of Kansas, is provided by the Kansas Research and Education Network (KANREN), a consortium of higher education institutions, K-12 school districts, and other non-profit organizations in the state of Kansas. KANREN's DS-3 and T1 backbone connects K-State to other KANREN members and provides redundancy for Internet connectivity.

C.3.2 Planned High-Speed Campus Infrastructure

Initially, the core campus network will retain the switched Ethernet backbone of routers, and additional buildings will be connected to the core routers at full-duplex 100 Mbps Ethernet. All buildings involved in the primary research projects described above in section C.2 will be connected to the core at full-duplex 100 Mbps within the next year and equipped with a high-performance switch to serve the needs of the researchers in those buildings. Installing high-performance modular switches such as the Cisco Catalyst models 3200, 4000, 5000, and 5500 provides the flexibility to connect buildings to the core with different high-speed technologies as needs change.

Gigabit Ethernet Backbone

Within the first year of the campus network enhancement project, the core switch and routers will be replaced with a switched Gigabit Ethernet backbone of Cisco 550X series switches. Older routers will be retired, and two of the newer routers (Cisco 7000 and 7513) will be retained for WAN connections and for connecting smaller buildings with lower performance requirements.

Within buildings, shared hubs will continue to be replaced by switching devices, and Category 3 cable will continue to be replaced by Category 5 and fiber optic cabling. K-State will initially provide at least a switched 10 Mbps connection to the desktop of each researcher with a meritorious HPC application. In cases where 10 Mbps is inadequate, switched 100 Mbps Ethernet connections will be provided. Concurrent with this will be switched 100 and 1000 Mbps connections to servers and 100 and 1000 Mbps backbones within buildings. Long-term plans call for Gigabit Ethernet to every major building and to selected desktops and servers, along with aggregation of multiple Gigabit Ethernet channels to provide greater bandwidth and more fault tolerance as needed.
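As a rough illustration of the bandwidth and fault-tolerance benefits of channel aggregation, the Python sketch below computes the capacity of a hypothetical four-channel Gigabit Ethernet group before and after a single link failure. The group size is an assumption for illustration only; the plan does not specify how many channels would be aggregated.

    LINK_MBPS = 1000  # one Gigabit Ethernet channel

    def group_capacity_mbps(links, failed=0):
        """Usable capacity of an aggregated group with `failed` links down."""
        return (links - failed) * LINK_MBPS

    print(group_capacity_mbps(4))            # 4000 Mbps with all links up
    print(group_capacity_mbps(4, failed=1))  # 3000 Mbps; traffic survives a link loss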

Plans are also underway to run fiber to K-State's Manufacturing Learning Center in an industrial park located several miles from the main campus. This will replace the leased T1 currently connecting the Center and extend high-speed access to that site. Related to this project is the replacement of the T1 serving the K-State campus in Salina, Kansas, with a T3 circuit. Both projects are expected to be completed during 1999.

While these improvements are at least partially motivated by the need to provide high-speed access to the Great Plains Network, the Abilene Network, the vBNS, and Internet2 sites for specified researchers, all users in the affected buildings will benefit from the increased bandwidth to their building and in the core. Initially, no restrictions will be placed on these connections that would limit commodity use by any faculty, staff, or student user at Kansas State University. In essence, Quality of Service (QoS; see section C.3.4) guarantees will be provided by over-provisioning the network. However, QoS developments from Internet2 and the research community will be monitored closely and deployed in our core and distribution networks in order to provide end-to-end guarantees to researchers.

C.3.3 High-Speed WAN/Abilene Connectivity

Kansas State University has connected to the Abilene Network through the Great Plains Network GigaPoP in Kansas City, Missouri. K-State connects to the Great Plains Network with a DS-3 ATM circuit managed by KANREN. The connection between the Great Plains GigaPoP and Abilene is an OC-12 circuit shared by other Great Plains Network HPC awardees.
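A quick capacity check shows why this shared arrangement is workable. The Python sketch below assumes, hypothetically, that all eleven awardee institutions reach the GigaPoP at DS-3 rates as K-State does; the DS-3 and OC-12 figures are the standard line rates.

    DS3_MBPS = 44.736   # standard DS-3 line rate
    OC12_MBPS = 622.08  # standard OC-12 line rate
    AWARDEES = 11       # GPN member institutions sharing the OC-12

    aggregate = AWARDEES * DS3_MBPS
    print(f"aggregate access: {aggregate:.1f} Mbps")              # 492.1 Mbps
    print(f"OC-12 headroom:   {OC12_MBPS - aggregate:.1f} Mbps")  # 130.0 Mbps

Under these assumptions, even if every member drove its DS-3 at line rate simultaneously, the OC-12 would not be oversubscribed, which is consistent with the over-provisioning approach to QoS described in section C.3.4.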

C.3.3.1 KANREN

The Kansas Research and Education Network (KANREN, http://www.kanren.net) is a consortium of institutions of higher education, K-12 school districts, and other non-profit organizations within the state of Kansas. KANREN provides commercial Internet service for both Kansas State University and the University of Kansas. KANREN currently has a backbone that traverses the state to connect the member institutions and provide redundancy for the commercial Internet connections.

The KANREN backbone consists of DS-3 circuits between the campuses at Manhattan (Kansas State University), Lawrence (University of Kansas), and Kansas City (University of Kansas Medical Center) (see Figure 2). T1 circuits link Wichita (Wichita State University) with Manhattan and Kansas City. Connectivity to the commercial Internet is provided through the Wichita hub for the southern Kansas sites and through the Great Plains Network in Kansas City for the northern KANREN customers, which include Kansas State University, the University of Kansas, and the University of Kansas Medical Center.

As major users of the KANREN backbone and Internet connectivity, Kansas State University and the University of Kansas provide much of the funding for the KANREN backbone. Kansas State University uses the new KANREN DS-3 backbone to connect to the Great Plains Network and, through it, to the Abilene Network. Both institutions have been intimately involved in KANREN since its inception and are instrumental in planning the upgrades. KANREN's network engineers have a long history of designing and building campus and regional networks; they were also instrumental in engineering MIDnet, one of the first regional NSFNET networks, and are currently network specialists associated with the University of Kansas.

KANREN is committed to developing a network that will support the protocols necessary to sustain connectivity for Kansas State University and the University of Kansas to the Abilene Network and the vBNS through the Great Plains Network. This includes QoS (either at the IP layer or within ATM), IPv6, multicasting, and whatever future technologies emerge. To accomplish this, KANREN will collaborate extensively with Kansas State University, the University of Kansas, and the staff and membership of both KANREN and the Great Plains Network.

C.3.3.2 The Great Plains Network

The Great Plains Network (GPN, http://www.greatplains.net) is a consortium of central plains universities, including Kansas State University, in the states of North Dakota, South Dakota, Nebraska, Kansas, Oklahoma, and Arkansas. Initial funding for the GPN comes from an EPSCoR/NSF grant awarded in August, 1997. This grant supports research in Earth Systems Science between the participating EPSCoR universities and the Earth Resources Observation Systems (EROS) Data Center in Sioux Falls, SD. In addition, the GPN functions as an Internet2 GigaPoP providing Abilene connections for eleven member institutions, including Kansas State University. The GPN became operational in August, 1998, and was one of the first GigaPoPs to connect to Abilene.

GPN Topology, Facilities, and Management

The Great Plains GigaPoP has two routing nodes: one in Kansas City and the other at the EROS Data Center in Sioux Falls, SD (see Figure 2). These collection points house a combination of routers and ATM switches along with monitoring and measurement equipment. Each of the participating states (KANREN in the case of Kansas) has at least a DS-3 connection to one of the routing nodes. Some connections are expected to be upgraded to OC-3 in the near future. A complete description of the GPN network design is available at http://www.greatplains.net/noc/network-design.html.

The locations for the two routing nodes were chosen for a variety of reasons: to minimize the lengths of the circuits, to maximize the potential for connecting to the national infrastructure, and to satisfy the requirements of the original EPSCoR/NSF award. The EROS Data Center already houses several agency network connections and increasingly serves as a focal point for connections in this region. A GigaPoP routing node located at EROS enhances the ability to bring agency networks to the campuses via the GigaPoP rather than through direct connections. EROS is also the source of much of the data and other resources relevant to the scientific investigations proposed in the EPSCoR/NSF grant. Kansas City was chosen for the other GigaPoP routing node because it is a telecommunications focal point and is central to the states participating in the Great Plains consortium. Initially, only the Kansas City routing node connected to Abilene. In the near future, the EROS Data Center routing node may also connect to Abilene, thereby providing redundant connections for GPN HPC institutions.

The GPN will monitor and manage all connections to the GigaPoP and coordinate communications with all connected networks, including KANREN and Abilene. The GPN and its member institutions are dedicated to maintaining the network in support of the research community throughout the region. As active participants in Internet2, the GPN will support the protocols necessary to bring advanced services such as QoS, multicast, and IPv6 to the network as quickly as possible. A major goal of those associated with the GPN is to remain on the cutting edge, working together with the major national networks in support of the high-performance network infrastructure. The membership is expected to participate actively in this process; Kansas State University and the University of Kansas are represented on the GPN management and technical teams. The expertise gained will be vital for implementing both local and state networks and for providing local understanding of the national infrastructure.

C.3.3.3 Abilene Network Connectivity

Kansas State University has been connected to the Great Plains Network GigaPoP in Kansas City (see Figure 2) since September, 1998. The GPN GigaPoP in Kansas City has in turn been connected to the Abilene Network since September, 1998, via an OC-12 circuit shared by the eleven member institutions that received NSF HPC awards. An additional connection to Abilene is expected in the very near future from the EROS Data Center GPN routing node; it will provide both redundancy for all GPN member institutions and primary connectivity for the HPC awardees in the northern part of the GPN (South Dakota and North Dakota). As an NSF HPNSP, the Abilene Network provides Kansas State University and the other GPN members with connectivity to the other institutions attached to Abilene and to the vBNS. Indeed, research activity using the GPN, Abilene, and the vBNS has already begun at many of the member institutions.

C.3.4 Quality of Service (QoS) Guarantees

Initially, QoS guarantees will be provided on the local campus by simple over-provisioning. Researchers who need high-performance network connections will have at least switched 10/100 Mbps to the end node, with progressively greater bandwidth toward the core provisioned amply enough not to limit performance. The Network Technologies team in Computing and Network Services at Kansas State University will work with the researchers to monitor performance and improve connectivity where necessary to ensure that the requirements of the meritorious applications are met.

For the short term on the wide-area connections, KANREN and the Great Plains Network are committed to providing QoS guarantees for Abilene/vBNS traffic, most likely by provisioning shared circuits. Again, Kansas State University will work closely with KANREN and the GPN to coordinate QoS efforts.

For the long term, implementation of QoS is expected to evolve quickly toward dynamic differentiation of service classes over shared circuits at the local, regional, and national levels. Whether this happens over IP with protocols like RSVP, with the inherent QoS properties of ATM, or with an unforeseen new technology, K-State will work with KANREN, the GPN, and the national infrastructure to implement service guarantees end-to-end and make them widely available.
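As one concrete example of the kind of mechanism such service differentiation builds on, the Python sketch below implements a token bucket, the classic rate-control element used in RSVP traffic specifications and ATM traffic shaping. The rate and burst values are illustrative assumptions, not parameters from this plan.

    class TokenBucket:
        """Token-bucket policer: admits traffic conforming to a contracted rate."""

        def __init__(self, rate_bps, burst_bytes):
            self.rate = rate_bps / 8.0  # refill rate in bytes per second
            self.burst = burst_bytes    # bucket depth (burst allowance)
            self.tokens = burst_bytes
            self.last = 0.0

        def conforms(self, packet_bytes, now):
            """True if the packet fits the contract; a policer would otherwise mark or drop it."""
            self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if packet_bytes <= self.tokens:
                self.tokens -= packet_bytes
                return True
            return False

    # A hypothetical 10 Mbps guaranteed class with an 8 KB burst allowance:
    bucket = TokenBucket(rate_bps=10_000_000, burst_bytes=8192)
    print(bucket.conforms(1500, now=0.001))  # True: within the contract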

C.3.5 Planning Process

Management of the campus data network at Kansas State University is the responsibility of the Network Technologies group in Computing and Network Services while the cable plant is the responsibility of the Department of Telecommunications. The two units meet regularly to coordinate and plan.

The network engineering plan described in this section was produced by a team representing computing, networking, telecommunications, and multimedia technologies, with input from representatives of the major application areas described in section C.2. The plan was then presented to the high-speed connectivity team working on this proposal for final approval. This team includes technical staff, central administration, and representatives from each of the four major application areas.

Kansas State University also collaborated extensively with representatives from the University of Kansas, KANREN, and the Great Plains Network consortium to develop the engineering plan and ensure consistency, cooperation, and compatibility among the respective high-speed networking efforts.

C.3.6 Project Management

Administrative and Technical Staff

Implementation of the local infrastructure upgrades is under the administrative management of Dr. Beth Unger, Vice Provost for Academic Services and Technology. Oversight of the technical and operational activities of the project will be carried out by the Network Technologies team in Computing and Network Services (CNS) under the leadership of Harvard Townsend, Interim Director of CNS, and Richard Becker, Network Manager. This staff will work closely with the management and technical staff of both KANREN and the Great Plains Network to implement and support the high-speed Abilene connection out of the Great Plains GigaPoP routing node in Kansas City. Researchers with meritorious applications requiring high-performance networking will be aided by J. Robert Caffey, Associate Director for Information Systems in the Division of Continuing Education, and by the Information Technology Assistance Center (iTAC) under the leadership of Dr. Jeanette Harold, Director of iTAC.

Project Schedule

The proposed schedule of implementation is as follows:

August, 1998 KANREN backbone upgraded to DS-3 ATM
August, 1998 Coordinate with other Great Plains HPC awardees to design the Abilene connection at the Kansas City GigaPoP
September, 1998 Starting date for the HPC project
September, 1998 Great Plains Network operational
September, 1998 Great Plains Network OC-12 connection to Abilene operational
Sept-Dec, 1998 Testing of Great Plains Network and Abilene connections
January, 1999 All buildings on the K-State campus housing primary meritorious applications connected to the backbone at 100 Mbps or faster
February, 1999 Routing of Abilene/vBNS traffic for meritorious applications
March, 1999 Upgrade campus network backbone to switched Gigabit Ethernet
August, 1999 Upgrade T1 connection to the K-State Salina campus to a T3

Evaluation and Dissemination of Results

The success of this project resides more in the support of meritorious research projects than in the network infrastructure itself. Consequently, dissemination of the results of this project will include the published results of the meritorious applications. Furthermore, the highly collaborative nature of both the research projects and the network connectivity ensures wide dissemination of results, especially among the membership of KANREN, the Great Plains Network, the Abilene Network and vBNS community, and Internet2 project members.

The text of this proposal, along with other documentation concerning the development of high-speed connectivity for Kansas State University, will be published on the World Wide Web at http://www.ksu.edu/Internet2. Computing and Network Services will evaluate and report quarterly during the grant period on the progress of the project relative to the proposed schedule. Prior to implementation of vBNS connectivity, benchmarks documenting the performance of the current campus and commodity Internet connections in relation to the meritorious applications will be recorded, and any problems with connectivity will likewise be noted. After vBNS connectivity is operational, the same benchmarks will be performed and the same researchers interviewed to document how connectivity has changed and to evaluate the effectiveness of the high-speed connectivity in supporting their research. This information will be summarized and distributed to researchers campus-wide, made available on the project web site listed above for dissemination to researchers nation-wide, and included in the final project report to NSF.
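As an indication of how such benchmarks might be run, the Python sketch below implements a minimal memory-to-memory TCP throughput probe. The host name, port, and transfer size are hypothetical; an actual benchmark would more likely use an established tool such as ttcp or netperf, with measurements repeated at different times of day.

    import socket
    import time

    def send_throughput(host, port, total_mb=64):
        """Push total_mb of data to a discard-style receiver; return Mbps achieved."""
        payload = b"\x00" * 65536
        sent = 0
        with socket.create_connection((host, port)) as sock:
            start = time.monotonic()
            while sent < total_mb * 1024 * 1024:
                sock.sendall(payload)
                sent += len(payload)
            elapsed = time.monotonic() - start
        return sent * 8 / elapsed / 1_000_000  # megabits per second

    # Hypothetical receiver at an Abilene-connected site:
    # print(f"{send_throughput('sink.example.edu', 9000):.1f} Mbps")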