
Saturday, March 29, 2008

WEB SPACE HOSTING

A web space hosting service is a type of Internet hosting service that allows individuals and organizations to make their own websites accessible via the World Wide Web. Web hosts are companies that provide space on a server they own for use by their clients, as well as Internet connectivity, typically in a data center. Web hosts can also provide data center space and Internet connectivity for servers they do not own that are located in their data center, an arrangement called colocation.

The scope of web hosting services varies widely. The most basic is web page and small-scale file hosting, where files can be uploaded via File Transfer Protocol (FTP) or a Web interface. The files are usually delivered to the Web "as is" or with little processing. Many Internet service providers (ISPs) offer this service free to their subscribers. People can also obtain Web page hosting, i.e. space on the Internet for their web page, from other, alternative service providers. Personal web site hosting is typically free, advertisement-sponsored, or cheap. Business web site hosting is often more expensive.
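As a rough sketch of the FTP upload route mentioned above, the snippet below uses Python's standard ftplib module; the hostname, credentials, and file name are placeholders rather than details of any particular host.

```python
# Minimal sketch of an FTP upload to a web host. The hostname, login details
# and file name below are hypothetical placeholders.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:                # hypothetical hostname
    ftp.login(user="username", passwd="password")  # placeholder credentials
    with open("index.html", "rb") as page:
        # Store the local file under the same name in the account's web root.
        ftp.storbinary("STOR index.html", page)
```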

Single-page hosting is generally sufficient only for personal web pages. A complex site calls for a more comprehensive package that provides database support and application development platforms (e.g., PHP, Java, Ruby on Rails, and ASP.NET). These facilities allow customers to write or install scripts for applications such as forums and content management systems. For e-commerce, SSL is also required.

Hosting services limited to the Web:

* Free web hosting service: free, (sometimes) advertisement-supported web hosting that is often limited when compared to paid hosting.
* Shared web hosting service: one's Web site is placed on the same server as many other sites, ranging from a few to hundreds or thousands. Typically, all domains may share a common pool of server resources, such as RAM and the CPU. A shared website may be hosted with a reseller.
* Reseller web hosting: allows clients to become web hosts themselves. Resellers could function, for individual domains, under any combination of these listed types of hosting, depending on who they are affiliated with as a provider. Resellers' accounts may vary tremendously in size: they may range from their own virtual dedicated server to a colocated server.
* Virtual Dedicated Server: divides a server into virtual servers, so that each user appears to have a dedicated server while actually sharing the machine with many other users. The users may have root access to their own virtual space. This is also known as a virtual private server or VPS.
* Dedicated hosting service: the user gets his or her own Web server and gains full control over it (root access for Linux/administrator access for Windows); however, the user typically does not own the server. Another type of Dedicated hosting is Self-Managed or Unmanaged. This is usually the least expensive for Dedicated plans. The user has full administrative access to the box, which means the client is responsible for the security and maintenance of his own dedicated box.
* Managed hosting service: the user gets his or her own Web server but is not allowed full control over it (root access for Linux/administrator access for Windows); however, they are allowed to manage their data via FTP or other remote management tools. The user is disallowed full control so that the provider can guarantee quality of service by not allowing the user to modify the server or potentially create configuration problems. The user typically does not own the server. The server is leased to the client.
* Colocation web hosting service: similar to the dedicated web hosting service, but the user owns the colo server; the hosting company provides physical space that the server takes up and takes care of the server. This is the most powerful and expensive type of the web hosting service. In most cases, the colocation provider may provide little to no support directly for their client's machine, providing only the electrical, Internet access, and storage facilities for the server. In most cases for colo, the client would have his own administrator visit the data center on site to do any hardware upgrades or changes.
* Clustered hosting: having multiple servers hosting the same content for better resource utilization.
* Grid hosting: a form of distributed hosting in which a server cluster acts like a grid and is composed of multiple nodes.
* Home server: usually a single machine placed in a private residence, used to host one or more web sites over a typically consumer-grade broadband connection. These can be purpose-built machines or, more commonly, old PCs.

Saturday, February 23, 2008

No Directions Required--Software Smartens Mobile Robots

DARPA initiative to develop self-navigating robots introduces a world of potential for the development of autonomous vehicles, but will the government take advantage of its research or let it wither on the vine?

By Peter Sergo



SMART ROBOT: DARPA's LAGR initiative awarded each of eight teams of scientists $2 million to $3 million to develop software that would give unmanned vehicles the ability to autonomously learn and navigate irregular off-road terrain.
Courtesy of Yann Lecun, N.Y.U.

Computer experts recently gathered in San Antonio, Tex., to test one last time how well their software programs enabled a mobile robot vehicle to think for—and steer—itself. The event wrapped up the Defense Advanced Research Projects Agency's (DARPA) three-year Learning Applied to Ground Robots (LAGR) initiative, which awarded each of eight teams of scientists $2 million to $3 million to develop software that would give unmanned vehicles the ability to autonomously learn and navigate irregular off-road terrain.

Autonomous maneuvering may not seem terribly difficult for a reasonably smart robot on wheels. But although some vegetation, such as short grass on a prairie, is easily traversable, obstacles such as dense bushes and tree trunks are not. To expediently reach point B, the robot must be able to quickly sort through a range of flora and decide which ones it can travel over—or through—and which are rigid, impenetrable barriers.

Researchers initially believed that visual learning—making basic sense of a surrounding based on changes in light—would be easy to implement in computer systems. But Eero Simoncelli, a principal investigator at New York University's (N.Y.U.) Laboratory for Computational Vision, pointed out that humans take vision for granted and overlook its complexity. "For you to avoid an object in your path is trivial," he says. "What's visual input [to a computer]? It's a bunch of pixels. It's a bunch of numbers that tell you how much light fell on each part of the sensor. That's a long way from a description of a cup sitting on a table." Extracting symbolic definitions from a large set of numeric values, he adds, is much harder than anyone realized.
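To make Simoncelli's point concrete, here is a small illustrative sketch (not from the LAGR project): a camera frame really is nothing but an array of numbers, and any notion of "cup", "bush" or "tree trunk" has to be inferred from them.

```python
import numpy as np

# Illustrative only: to a computer, a camera frame is just a grid of light
# measurements. This tiny fake 4x4 grayscale "image" is the entire visual
# input; nothing in the numbers themselves names an object until software
# infers it.
frame = np.array([
    [12,  15,  14,  13],
    [11, 200, 210,  12],
    [13, 205, 215,  14],
    [12,  13,  15,  13],
], dtype=np.uint8)

print(frame.shape, frame.dtype)  # (4, 4) uint8 -- shape and storage type
print(frame.mean())              # summary statistics are easy; "meaning" is not
```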

Classifying natural obstacles was but one of myriad factors that DARPA researchers had to predict and implement in a software program to expand the capacity of a mobile robot to quickly analyze and travel through an environment. "Of course, no one [knew] how to design this," says Yann Lecun, professor of computer science at N.Y.U.'s Courant Institute of Mathematics who led the university's team. "So DARPA [was] interested in funding projects that advance the science of [robot] learning and vision."

Lecun, who has a knack for designing computer systems that pick out the key visual features in an environment, was an ideal candidate for the LAGR project. DARPA provided the funding and a standard test vehicle so Lecun and Urs Muller, CEO of software maker Net-Scale Technologies in Morganville, N.J., could focus on writing the software. They set out to push the realm of visual-based navigation forward—or to at least bring it up to speed.

A 2002 study by the Washington, D.C.–based National Research Council found that the increase in speed of unmanned ground vehicles was greatly outpaced by the rapid improvement in computer processing from 1990 to 2000, when the physical capability of a vehicle and course complexity are adjusted for. Muller points out that over the past decade there has been a 100-fold increase in computing power and a 1,000-fold gain in memory capacity, but developments in unmanned navigational systems have lagged far behind these advances and will continue to without the development of new approaches to visual learning. "The limiting factor in software [design] is the human imagination," he says.

Until LAGR, most self-navigating mobile robots could only scan their immediate surroundings and plot a course over short distances. This made it difficult for robots to figure out an optimum route to any place farther than their own shortsighted universe of about 25 feet (7.6 meters), limiting them to a feel-as-you-go approach that often resulted in time-wasting, circuitous paths to a destination.
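The contrast is easier to see with a toy example. The sketch below (illustrative only, not the LAGR teams' software) plans a route over a small obstacle map with breadth-first search; a robot that can reason over the whole map routes straight out of the dead end, whereas one limited to a short sensing horizon would have to discover the wall by bumping along it.

```python
from collections import deque

# Toy sketch: breadth-first search over a small obstacle map.
# 0 = open ground, 1 = obstacle.
GRID = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 0, 1, 0],
]
START, GOAL = (2, 2), (0, 4)  # start inside a cul-de-sac, goal across the map

def neighbors(cell):
    """Yield the open cells adjacent to `cell` (4-connected)."""
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]) and GRID[nr][nc] == 0:
            yield nr, nc

def plan(start, goal):
    """Shortest path by breadth-first search, using knowledge of the whole map."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        for nxt in neighbors(cell):
            if nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # goal unreachable

print(plan(START, GOAL))  # walks out of the dead end and around the wall
```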

Scientists Tuning Very Large Array Radio Telescope for Deeper Exploration

The NSF's Very Large Array radio telescope is getting a digital makeover that will give it the sensitivity to pick up a cell phone signal on Jupiter, and to probe deeper into outer space

By Larry Greenemeier



SILENT VIGIL: The NSF's Very Large Array (VLA) radio telescope has become the Expanded VLA and will be 10 times more powerful when work is completed in 2012.
Courtesy of NRAO/AUI and Laure Wilson Neish

The National Science Foundation (NSF) is in the process of transforming its Very Large Array radio telescope into the—wait for it—Expanded Very Large Array, thanks to digital technology that will boost the Socorro, N.M., facility's already impressive ability to tune in on black holes, supernovae and the rest of the deep space menagerie.

Half of the Very Large Array's (VLA) 28 dish antennas—each weighing 230 tons—have already been upgraded so the array can collect eight simultaneous data streams at about two giga- (billion) hertz, up from the previous capability of four data streams at about 50 mega- (million) hertz. The rest of the 28 antennas—which made their debut on the silver screen in the 1997 movie Contact, starring Jodie Foster and based on the eponymous Carl Sagan sci-fi novel—will go digital by 2012, increasing the facility's power 10-fold. The makeover will also replace original components that had been in operation since the facility was built in the 1970s.

"Certain objects radiate over a wide range of frequency," says Mark McKinnon, project manager for the Expanded VLA. "Improving the sensitivity of the telescope boils down to its bandwidth."

Completed in 1980 but operational before then, the VLA was behind the discoveries of water ice on Mercury and of the complex region surrounding Sagittarius A*, the black hole at the core of the Milky Way galaxy; it also helped astronomers identify a distant galaxy already pumping out stars less than a billion years after the big bang.

The increased sensitivity and improved resolution of the EVLA will let scientists peer deep into star-forming clouds and spy on protoplanetary disks of dense gas surrounding young stars as well as track supernovae, fast-moving neutron stars and black holes, McKinnon says. The EVLA's receiving system will be sensitive enough to detect the weak radio transmission from a cell phone at the distance of Jupiter—half a billion miles away—at a projected cost of $94 million.

Data gathered by all 28 of the 82-foot- (25-meter-) diameter dish antennas are brought to a correlator—a central, special-purpose computer—which merges the input into a form that allows scientists to produce detailed, high-quality images of the astronomical objects under investigation. A new fiber-optic system replaces the older waveguide system that carried data collected by the receivers to the central control building; it increases the amount of data that can be delivered from the antennas to the new $17-million correlator, which is being built by Canadian scientists and engineers to handle the increased flow.
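For a sense of what "merging the input" means, the toy sketch below (illustrative only, not the EVLA correlator) multiplies and averages two simulated antenna signals: the shared sky signal survives the averaging while each antenna's independent receiver noise washes out.

```python
import numpy as np

# Illustrative sketch of the correlate-and-average idea behind a correlator.
# The shared sky signal survives the averaging; independent receiver noise
# cancels out, leaving one "visibility" per antenna pair for imaging software.
rng = np.random.default_rng(0)
sky = rng.normal(size=100_000)           # signal common to both antennas
ant_a = sky + rng.normal(size=100_000)   # antenna A: sky + its own noise
ant_b = sky + rng.normal(size=100_000)   # antenna B: sky + its own noise

visibility = np.mean(ant_a * ant_b)      # correlate and average
print(f"Correlated power estimate: {visibility:.2f}")  # close to 1.0, the sky signal's power
```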

In addition to its work for the NSF, the VLA site is also playing an important role in the development of another radio telescope, the Atacama Large Millimeter/submillimeter Array (ALMA). Started in 2003 and scheduled to be completed by 2012 in northern Chile's Atacama Desert at 16,500 feet (5,000 meters) above sea level, the facility will employ more than 64 40-foot (12-meter) antennas. Scientists have been using the VLA site to test the performance of the dishes before they are installed at ALMA.

"The observations we make with the EVLA will be complementary with what they do at ALMA and at other radio telescopes," McKinnon adds. "Trying to understand astrophysical phenomena requires a multiwavelength approach."

Wednesday, January 23, 2008

Life on Mars

REAL OR ILLUSION? The image has set the Internet abuzz that there really is life on Mars.



JUST LIKE US: The NASA images show a woman-like figure.
SO FAR: An arrow points towards the woman-like figure.

London: Life on Mars? Well, bizarre images have emerged showing a mystery female figure walking down a hill on the arid planet.

The photo of what looks like a naked woman with her arm outstretched was among several taken on the red planet and sent back to Earth by NASA's Mars explorer Spirit, the Daily Mail reported on Wednesday, citing an unnamed website.

Though no official confirmation has come from NASA as to whether the figure is an alien or an optical illusion created by the Martian landscape, it has set the Internet abuzz that there really is life on Mars.

As one enthusiast put it on the website, "These pictures are amazing. I couldn't believe my eyes when I saw what appears to be a naked alien running around on Mars."

The news of the mystery woman on Mars came just days after a team of French scientists claimed to have discovered proof that the red planet possesses high-level dense clouds of dry ice, which scud across its orange sky.

Using data obtained by the OMEGA spectrometer on board ESA's Mars Express, the team detected ice clouds that sometimes become so dense that they cast quite dark shadows on the dusty surface of the red planet.

"This is the first time that carbon dioxide ice clouds on Mars have been imaged and identified from above. This is important because the images tell us not only about their shape, but also their size and density."

"Previously, we had to rely on indirect information. However, it is very difficult to separate the signals coming from the clouds, atmosphere and surface," according to lead scientist Franck Montmessin of the Service d'Aeronomie at University of Versailles.

Saturday, January 19, 2008

Watt-hour meters

Watt-hour meters
The utility company is not too interested in how much power you're using with one appliance, or even how much power a single household is drawing, at any given time. By far the greater concern is the total energy that is used over a day, a week, a month or a year. Electrical energy is measured in watt hours, or, more commonly for utility purposes, in kilowatt hours (kWh). The device that indicates this is the watt-hour meter or kilowatt-hour meter.

The most often-used means of measuring electrical energy is a small electric motor device whose speed depends on the current, and thereby on the power at a constant voltage. The number of turns of the motor shaft, in a given length of time, is directly proportional to the number of kilowatt hours consumed. The motor is placed at the point where the utility wires enter the house, apartment or building. This is usually at a point where the voltage is 234 V. This is split into some circuits with 234 V, for heavy-duty appliances such as the oven, washer and dryer, and the general household lines for lamps, clock radios and television sets.

You've surely seen the little disk in the utility meter going around and around, sometimes fast, other times slowly. Its speed depends on the power you're using. The total number of turns of this little disk, every month, determines the size of the bill you will get—as a function also, of course, of the cost per kilowatt hour for electricity.

Kilowatt-hour meters count the number of disk turns by means of geared, rotary drums or pointers. The drum-type meter gives a direct digital readout. The pointer type has several scales calibrated from 0 to 9 in circles, some going clockwise and others going counterclockwise.

Reading a pointer-type utility meter is a little tricky, because you must think in whatever direction (clockwise or counterclockwise) the scale goes. An example of a pointer-type utility meter is shown in Fig. 3-11. Read from left to right. For each little meter, take down the number that the pointer has most recently passed. Write down the rest as you go. The meter in the figure reads 3875 kWh. If you want to be really precise, you can say it reads 3875-1/2 kWh.
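The billing arithmetic is straightforward. The sketch below uses made-up numbers: the meter constant (watt hours per disk turn) and the price per kilowatt hour are illustrative assumptions, not values from the text.

```python
# Sketch of the billing arithmetic described above, using made-up numbers.
WATT_HOURS_PER_TURN = 7.2      # a common residential meter constant (assumption)
PRICE_PER_KWH = 0.12           # dollars per kilowatt hour (assumption)

disk_turns = 125_000           # turns counted over the billing period (example)
energy_kwh = disk_turns * WATT_HOURS_PER_TURN / 1000   # watt hours -> kWh
bill = energy_kwh * PRICE_PER_KWH

print(f"{energy_kwh:.0f} kWh used, bill = ${bill:.2f}")  # 900 kWh, $108.00
```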

Wattmeters

Wattmeters
The measurement of electrical power requires that voltage and current both be measured simultaneously. Remember that power is the product of the voltage and current. That is, watts (P) equals volts (E) times amperes (I), written as P = EI. In fact, watts are sometimes called volt-amperes in a dc circuit.

You might think that you can just connect a voltmeter in parallel with a circuit, thereby getting a reading of the voltage across it, and also hook up an ammeter in series to get a reading of the current through the circuit, and then multiply volts times amperes to get watts consumed by the circuit. And in fact, for practically all dc circuits, this is an excellent way to measure power.

Quite often, however, it's simpler than that. In many cases, the voltage from the power supply is constant and predictable. Utility power is a good example. The effective voltage is always very close to 117 V. Although it's ac, and not dc, power can be measured in the same way as with dc: by means of an ammeter connected in series with the circuit, and calibrated so that the multiplication (times 117) has already been done. Then, rather than 1 A, the meter would show a reading of 117 W, because P = EI = 117 × 1 = 117 W. If the meter reading were 300 W, the current would be 300/117 = 2.56 A. An electric iron might consume 1000 W, or a current of 1000/117 = 8.55 A. And a large heating unit might gobble up 2000 W, requiring a current of 2000/117 = 17.1 A. This might blow a fuse or breaker, since these devices are often rated for only 15 A. You've probably had an experience where you hooked up too many appliances to a single circuit, blowing the fuse or breaker. The reason was that the appliances, combined, drew too much current for the house wiring to safely handle, and the fuse or breaker, detecting the excess current, opened the circuit.
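The same arithmetic in a few lines of Python (the 117 V line voltage and 15 A fuse rating are the figures used above; the appliance list is just an example):

```python
# Sketch of the arithmetic above: with the line voltage taken as a steady
# 117 V, power converts directly to current, and the combined current shows
# whether a 15 A fuse or breaker would trip.
LINE_VOLTAGE = 117.0   # volts (effective), as in the text
FUSE_RATING = 15.0     # amperes, the common rating mentioned above

loads_w = {"electric iron": 1000, "heating unit": 2000}   # power drawn, in watts
currents = {name: watts / LINE_VOLTAGE for name, watts in loads_w.items()}
total_current = sum(currents.values())

for name, amps in currents.items():
    print(f"{name}: {amps:.2f} A")        # 8.55 A and 17.09 A
verdict = "blows" if total_current > FUSE_RATING else "holds"
print(f"combined: {total_current:.1f} A -> {verdict} a {FUSE_RATING:.0f} A fuse")
```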
Specialized wattmeters are necessary for the measurement of radio-frequency (RF) power, or for peak audio power in a high-fidelity amplifier, or for certain other specialized applications. But almost all of these meters, whatever the associated circuitry, use simple ammeters as their indicating devices.
