Web Hosting - Redundancy and Failover
Among the more useful innovations in computing, actually invented decades ago, are the twin ideas of redundancy and failover. These fancy words name common-sense concepts: when one computer (or part) fails, switch to another. Doing that quickly and seamlessly, rather than slowly and with disruption, is one difference between good hosting and bad.
Network redundancy is the most widely used example. The Internet is just that: an interconnected set of networks. Between and within networks are paths that make possible page requests, file transfers and data movement from one spot (called a 'node') to the next. If there are two or more paths between a user's computer and the server, one becoming unavailable is not much of a problem. Closing one street is not so bad if you can drive down another just as easily.
Of course, there's the catch: 'just as easily'. When one path fails, the total load (the amount of data requested and by how many within what time frame) doesn't change. Now the same number of 'cars' are using fewer 'roads'. That can lead to traffic jams.
A very different, but related, phenomenon occurs when there are suddenly more 'cars', as happens in a massively widespread virus attack, for example. Then a large number of useless and destructive programs are running around flooding the network. Making the situation worse, at a certain point parts of the network may shut down to prevent further spread, producing more 'cars' on now-fewer 'roads'.
A related form of redundancy and failover can be carried out with servers, which are in essence the 'end-nodes' of a network path.
Servers can fail because of a hard drive failure, motherboard overheating, memory malfunction, operating system bug, web server software overload or any of a hundred other causes. Whatever the cause, when two or more servers are configured so that another can take up the slack from one that's failed, that is redundancy.
Server redundancy is more difficult to achieve than network redundancy, but it is still fairly common. Not as common as it should be, though, since many times a failed server is simply re-booted, repaired or replaced with another piece of hardware. But more sophisticated web hosting companies will have such redundancy in place.
And that's one lesson for anyone weighing which of two similarly priced web hosting companies offers superior service. Look at which company can offer competent assistance when things fail, as they always do sooner or later.
One company may have a habit of simply re-booting. Others may have redundant disk arrays (RAID): hardware containing multiple disk drives to which the server has access, allowing one or more drives to fail without bringing the system down. The failed drive is replaced and no one but the administrator is even aware there was a problem.
Still other companies may have even more sophisticated systems in place. Failover servers that take up the load of a crashed computer without the end user seeing anything are possible. In fact, in better installations, they're the norm. When they're in place, the user has at most to refresh his or her browser and, bingo, everything is fine.
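The failover idea described above can be sketched in a few lines of code. This is a minimal illustration, not a real hosting setup: the 'servers' here are simulated Python functions standing in for redundant machines, and the function names are hypothetical.

```python
def fetch_with_failover(sources):
    """Try each redundant source in turn; on failure, fail over to the next."""
    last_error = None
    for source in sources:
        try:
            return source()
        except Exception as err:
            last_error = err  # this node failed; move on to the next one
    # Every redundant source failed -- only now does the user see an error.
    raise ConnectionError(f"all servers failed: {last_error}")

def crashed_server():
    raise OSError("disk failure")  # simulated failed machine

def backup_server():
    return "page content"  # simulated healthy mirror

# The user never notices the first server's failure.
print(fetch_with_failover([crashed_server, backup_server]))  # → page content
```

Real failover systems do the same thing at the network or load-balancer level, so the switch happens before the request ever reaches a dead machine.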
The more a web site owner knows about redundancy and failover, the better he or she can understand why things go wrong, and what options are available when they do. That knowledge can lead to better choices for a better web site experience.
Web Hosting - Sharing A Server – Things To Think About
You can often get a substantial discount off web hosting fees by sharing a server with other sites. Or, you may have multiple sites of your own on the same system. But, just as sharing a house can have benefits and drawbacks, so too with a server.
The first consideration is availability. Shared servers get re-booted more often than stand-alone systems. That can happen for multiple reasons. Another site's software may produce a problem or make a change that requires a re-boot. While that's less common on Unix-based systems than on Windows, it still happens. Be prepared for more scheduled and unplanned outages when you share a server.
Load is the next, and more obvious, issue. A single pickup truck can only haul so much weight. If the truck is already half-loaded with someone else's rocks, it will not haul yours as easily.
Most websites are fairly static. A reader hits a page, then spends some time skimming it before loading another. During that time, the server has capacity to satisfy other requests without affecting you. All the shared resources - CPU, memory, disks, network and other components - can easily handle multiple users (up to a point).
But all servers have inherent capacity limitations. The component that processes software instructions (the CPU) can only do so much. Most large servers will have more than one (some as many as 16), but there are still limits to what they can do. The more requests they receive, the busier they are. At a certain point, your software request (such as accessing a website page) has to wait a bit.
Memory on a server functions in a similar way. It's a shared resource on the server and there is only so much of it. As it gets used up, the system lets one process use some, then another, in turn. But sharing that resource causes delays. The more requests there are, the longer the delays.
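The kind of measurement discussed here is something you can try yourself. The sketch below takes a rough snapshot of shared-resource usage using only the Python standard library; it assumes a Unix-like system (os.getloadavg is not available on Windows), and the saturation threshold is a simplified rule of thumb, not a hosting-industry standard.

```python
import os
import shutil

# Load average: roughly, how many processes are competing for the CPUs.
load_1m, load_5m, load_15m = os.getloadavg()  # Unix-only
cpu_count = os.cpu_count()

# Disk usage on the root filesystem, as a percentage of total capacity.
disk = shutil.disk_usage("/")
disk_pct = disk.used / disk.total * 100

print(f"1-minute load average: {load_1m:.2f} across {cpu_count} CPUs")
print(f"disk usage: {disk_pct:.1f}% of {disk.total // 10**9} GB")

# If the load average stays above the CPU count, requests are queueing --
# the 'waiting a bit' described above.
if load_1m > cpu_count:
    print("server looks saturated; fewer tenants or more capacity needed")
```

A hosting company's statistical reports aggregate exactly this sort of data over days or weeks, which is what makes projections about additional sharing possible.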
You may experience that as waiting for a page to appear in the browser or a file to download. Bottlenecks can appear in other places outside, but connected to, the server itself. Network components get shared among multiple users along with everything else. And, as with those others, the more requests there are (and the longer they tie them up), the longer the delays you notice.
The only way to get an objective look at whether a server and the connected network have enough capacity is to measure and test. All systems are capable of reporting how much of what is being used. Most can compile that information into some form of statistical report. Reviewing that data allows for a rational assessment of how much capacity is being used and how much is still available. It also allows a knowledgeable person to make projections of how much more sharing is possible, and with what level of impact. Request that information and, if necessary, get help in interpreting it. Then you can make a cost-benefit decision based on fact.
Copyright Law Plagiarism - Plagiarism Is Simply Unethical
Anyone who is a writer is concerned with plagiarism. Copyright laws protect copyright holders from having their works plagiarized. Many people think it is ironic that the word plagiarism derives from the Latin for 'kidnapper', but it is true. If a person uses another person's words without permission, they have indeed stolen or kidnapped something owned by another and are in violation of copyright law.
Plagiarism is a very bad word in the writing world. Crediting the author of a work will not keep someone immune from being in violation of copyright law: it is still infringement, even if the author is cited, when the author did not give permission for the work to be used. One of the most common areas in which copyright law and plagiarism are violated is the academic world.
Many students will copy and paste the information they need for their research papers and essays straight off the Internet and turn it in to their professors. However, this type of cheating is easily detected now with special programs that professors can use. Plagiarism is unethical, not only in the writing world, but in the academic world as well.
Did you know that you could plagiarize a work but not be in violation of copyright? Likewise, you can be in violation of a copyright and not have been plagiarizing. It is really not that hard to understand. Let's say you use Abraham Lincoln's exact words in a paper and you do not cite him as the source or give him credit. Lincoln's words aren't copyrighted because they are in the public domain, but you did plagiarize because you tried to pass off his words as your own. Alternatively, if you use an artist's picture in a book without gaining the artist's permission, you have violated copyright law even if you credit the artist, because permission was never given for the picture to be used.
If you are in school, the best way to avoid committing plagiarism is to simply list your sources. If you use someone's words, cite them in an endnote or a footnote, and list the resource in the bibliography. Another way around copyright and plagiarism violations is to take notes when you are reading. Take notes in your own words and put the resource away. Write your paper from your own words.
No one wants to be singled out for plagiarism, especially a student who is concerned about their reputation at school, or a writer who needs to keep their credibility in good standing. With today's technological advances, it is not too hard to pinpoint plagiarized work. Even webmasters who run websites are on to the plagiarism crowd. They can run their entire sites through a special program to see if their content has been stolen and duplicated elsewhere on the Internet.
If you are dealing in the written word, either academically or professionally, it is a good idea to use only your own words. It was probably easier to get away with plagiarism 100 years ago, but it is not that easy today. The chances are very high that if you violate copyright or plagiarize, you will be caught. Not only is it embarrassing, but it can cost you a bundle in a lawsuit.