Saturday, March 29, 2008


A web hosting service is a type of Internet hosting service that allows individuals and organizations to make their own website accessible via the World Wide Web. Web hosts are companies that provide space on a server they own for use by their clients, as well as providing Internet connectivity, typically in a data center. Web hosts can also provide data center space and Internet connectivity for servers they do not own that are located in their data center, an arrangement called colocation.

The scope of web hosting services varies widely. The most basic is web page and small-scale file hosting, where files can be uploaded via File Transfer Protocol (FTP) or a Web interface. The files are usually delivered to the Web "as is" or with little processing. Many Internet service providers (ISPs) offer this service free to their subscribers. People can also obtain web page hosting from other, alternative service providers. Personal website hosting is typically free, advertisement-sponsored, or cheap. Business website hosting is often more expensive.
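The basic FTP workflow mentioned above can be scripted in a few lines. The sketch below uses Python's standard ftplib; the hostname, credentials, and filenames are placeholders, and many modern hosts require FTPS or SFTP instead of plain FTP:

```python
from ftplib import FTP

def upload_page(host, user, password, local_path, remote_name):
    """Upload a single file to a web host over plain FTP."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as f:
            # STOR transfers the file in binary mode, byte for byte ("as is")
            ftp.storbinary(f"STOR {remote_name}", f)

# Example (placeholder credentials):
# upload_page("ftp.example-host.com", "user", "secret", "index.html", "index.html")
```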

Single-page hosting is generally sufficient only for personal web pages. A complex site calls for a more comprehensive package that provides database support and application development platforms (e.g., PHP, Java, Ruby on Rails, and ASP.NET). These facilities allow customers to write or install scripts for applications like forums and content management. For e-commerce, SSL is also required.
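Because an e-commerce site depends on a valid SSL certificate, a quick sanity check is to connect with Python's standard ssl module and read the certificate's expiry date. This is an illustrative sketch only, not a substitute for proper monitoring; the hostname is whatever site you want to check:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until(not_after):
    """Parse an OpenSSL-style 'notAfter' date string and return days remaining."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc)
            - datetime.now(timezone.utc)).days

def check_certificate(hostname, port=443):
    """Fetch a site's TLS certificate and return days until it expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return days_until(tls.getpeercert()["notAfter"])

# Example: check_certificate("www.example.com")
```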

Hosting services limited to the Web:

* Free web hosting service: free, (sometimes) advertisement-supported web hosting, often limited in features compared with paid hosting.
* Shared web hosting service: one's website is placed on the same server as many other sites, ranging from a few to hundreds or thousands. Typically, all domains share a common pool of server resources, such as RAM and CPU time. A shared website may also be hosted through a reseller.
* Reseller web hosting: allows clients to become web hosts themselves. Resellers could function, for individual domains, under any combination of these listed types of hosting, depending on who they are affiliated with as a provider. Resellers' accounts may vary tremendously in size, from their own virtual dedicated server to a colocated server.
* Virtual dedicated server: a server divided into virtual servers, where each user appears to have a dedicated server of their own but is actually sharing the physical machine with many other users. Users may have root access to their own virtual space. This is also known as a virtual private server (VPS).
* Dedicated hosting service: the user gets his or her own Web server and gains full control over it (root access on Linux, administrator access on Windows); however, the user typically does not own the server. Another type of dedicated hosting is self-managed or unmanaged, usually the least expensive of the dedicated plans. The user has full administrative access to the box, which means the client is responsible for the security and maintenance of the server.
* Managed hosting service: the user gets his or her own Web server but is not allowed full control over it (root access for Linux/administrator access for Windows); however, they are allowed to manage their data via FTP or other remote management tools. The user is disallowed full control so that the provider can guarantee quality of service by not allowing the user to modify the server or potentially create configuration problems. The user typically does not own the server. The server is leased to the client.
* Colocation web hosting service: similar to the dedicated hosting service, but the user owns the colocated server; the hosting company provides the physical space the server occupies and takes care of it. This is the most powerful and expensive type of web hosting service. In most cases, the colocation provider offers little to no support directly for the client's machine, supplying only the electrical power, Internet access, and storage facilities for the server. Typically, the client's own administrator visits the data center on site to perform any hardware upgrades or changes.
* Clustered hosting: having multiple servers hosting the same content for better resource utilization.
* Grid hosting: this form of distributed hosting uses a server cluster that acts like a grid, composed of multiple nodes.
* Home server: usually a single machine placed in a private residence, used to host one or more websites over a (usually consumer-grade) broadband connection. These can be purpose-built machines or, more commonly, old PCs.

Saturday, February 23, 2008

No Directions Required--Software Smartens Mobile Robots

DARPA initiative to develop self-navigating robots introduces a world of potential for the development of autonomous vehicles, but will the government take advantage of its research or let it wither on the vine?

By Peter Sergo

SMART ROBOT: DARPA's LAGR initiative awarded each of eight teams of scientists $2 million to $3 million to develop software that would give unmanned vehicles the ability to autonomously learn and navigate irregular off-road terrain.
Courtesy of Yann Lecun, N.Y.U.

Computer experts recently gathered in San Antonio, Tex., to test one last time how well their software programs enabled a mobile robot vehicle to think for—and steer—itself. The event wrapped up the Defense Advanced Research Projects Agency's (DARPA) three-year Learning Applied to Ground Robots (LAGR) initiative, which awarded each of eight teams of scientists $2 million to $3 million to develop software that would give unmanned vehicles the ability to autonomously learn and navigate irregular off-road terrain.

Autonomous maneuvering may not seem terribly difficult for a reasonably smart robot on wheels. But although some vegetation, such as short grass on a prairie, is easily traversable, obstacles such as dense bushes and tree trunks are not. To expediently reach point B, the robot must be able to quickly sort through a range of flora and decide which ones it can travel over—or through—and which are rigid, impenetrable barriers.

Researchers initially believed that visual learning—making basic sense of a surrounding based on changes in light—would be easy to implement in computer systems. But Eero Simoncelli, a principal investigator at the New York University's (N.Y.U.) Laboratory for Computational Vision, pointed out that humans take vision for granted and overlook its complexity. "For you to avoid an object in your path is trivial," he says. "What's visual input [to a computer]? It's a bunch of pixels. It's a bunch of numbers that tell you how much light fell on each part of the sensor. That's a long way from a description of a cup sitting on a table." Extracting symbolic definitions from a large set of numeric values, he adds, is much harder than anyone realized.
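Simoncelli's point, that visual input is "a bunch of numbers," can be made concrete with a toy example: a small grayscale patch is nothing but a grid of intensity values, and the crudest possible "detector" merely thresholds them. This is purely illustrative and has no connection to the actual LAGR software:

```python
# A 5x5 grayscale patch (0 = dark, 255 = bright) containing a bright blob.
patch = [
    [12,  10,  11,  13,  12],
    [11, 200, 210,  12,  10],
    [13, 205, 220,  11,  12],
    [12,  11,  13,  12,  11],
    [10,  12,  11,  13,  12],
]

def bright_pixels(image, threshold=128):
    """Return (row, col) coordinates whose intensity exceeds the threshold."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value > threshold]

print(bright_pixels(patch))  # -> [(1, 1), (1, 2), (2, 1), (2, 2)]
```

Going from "four bright pixels" to "a cup sitting on a table" is the hard part: the symbolic description is nowhere in the numbers themselves.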

Classifying natural obstacles was but one of myriad factors that DARPA researchers had to predict and implement in a software program to expand the capacity of a mobile robot to quickly analyze and travel through an environment. "Of course, no one [knew] how to design this," says Yann Lecun, professor of computer science at N.Y.U.'s Courant Institute of Mathematics who led the university's team. "So DARPA [was] interested in funding projects that advance the science of [robot] learning and vision."

Lecun, who has a knack for designing computer systems that pick out the key visual features in an environment, was an ideal candidate for the LAGR project. DARPA provided the funding and a standard test vehicle so Lecun and Urs Muller, CEO of software maker Net-Scale Technologies in Morganville, N.J., could focus on writing the software. They set out to push the realm of visual-based navigation forward—or to at least bring it up to speed.

A 2002 study by the Washington, D.C.–based National Research Council found that the increase in speed of unmanned ground vehicles was greatly outpaced by the rapid improvement in computer processing from 1990 to 2000, when the physical capability of a vehicle and course complexity are adjusted for. Muller points out that over the past decade there has been a 100-fold increase in computing power and a 1,000-fold gain in memory capacity, but developments in unmanned navigational systems have lagged far behind these advances and will continue to do so without new approaches to visual learning. "The limiting factor in software [design] is the human imagination," he says.

Until LAGR, most self-navigating mobile robots could only scan their immediate surroundings and plot a course over short distances. This made it difficult for robots to figure out an optimum route to any place farther than their own shortsighted universe of about 25 feet (7.6 meters), limiting them to a feel-as-you-go approach that often resulted in time-wasting, circuitous paths to a destination.
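The gap between feel-as-you-go navigation and longer-range planning can be sketched on a toy grid map: a planner that sees the whole map (here, plain breadth-first search) routes around obstacles directly instead of probing blindly. This is a hypothetical illustration, not the LAGR teams' code:

```python
from collections import deque

# 0 = traversable ground, 1 = obstacle (dense bush, tree trunk, ...).
GRID = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def shortest_path(grid, start, goal):
    """Breadth-first search; returns the list of cells from start to goal."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable
```

On this map the only shortest route from (0, 0) to (4, 4) is 8 moves along the top row and down the right edge; a short-horizon robot that wandered into the open pocket at (2, 1) and (2, 2) would have to back out again, which is exactly the circuitous behavior described above.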