Wednesday, September 24, 2008

Web Security Basics

There are several key concerns related to web security. How secure are the systems that control the exchange of information on the web? How secure is the information stored on the numerous computers across the web? It is a known fact that whatever can be used can also be misused. We should always remember that if organizational information is hacked, whether through the network or through other means, it can incur a heavy cost to the company. A failure in network security can also cost the organization in terms of goodwill and reputation: no other organization will be interested in doing business with a company that cannot protect its own information and security systems.
The following points need to be kept in mind:
1> Login Page Should Be Encrypted:
The number of times I have seen Web sites that only use SSL (with https: URL schemes) after user authentication is accomplished is really dismaying. Encrypting the session after login may be useful — like locking the barn door so the horses don’t get out — but failing to encrypt logins is a bit like leaving the key in the lock when you’re done locking the barn door. Even if your login form POSTs to an encrypted resource, in many cases this can be circumvented by a malicious security cracker who crafts his own login form to access the same resource and give him access to sensitive data.
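One way to make sure the barn door stays locked is to refuse to serve the login page over plain HTTP at all. The sketch below is a hypothetical WSGI middleware (the `/login` path and domain are illustrative, not from the original post) that redirects any unencrypted login request to its HTTPS equivalent:

```python
# A minimal sketch: hypothetical WSGI middleware that refuses to serve the
# login page over plain HTTP and redirects the client to the HTTPS version.
def force_https_login(app):
    def middleware(environ, start_response):
        if environ.get("PATH_INFO") == "/login" and environ.get("wsgi.url_scheme") != "https":
            host = environ.get("HTTP_HOST", "example.com")
            location = "https://" + host + environ["PATH_INFO"]
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        # Any other request (or an already-encrypted login) passes through.
        return app(environ, start_response)
    return middleware
```

This closes the crafted-form loophole described above: even an attacker-built form pointing at the login resource never gets a response over an unencrypted channel.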
2> Data Validation Should Be Done Server-Side:
Many Web forms include some JavaScript data validation. If this validation is meant to provide improved security, it means almost nothing. A malicious security cracker can craft a form of his own that accesses the resource at the other end of the Web page's form action without any validation at all. Worse yet, many cases of JavaScript form validation can be circumvented simply by deactivating JavaScript in the browser or using a Web browser that doesn't support JavaScript at all. In some cases, I've even seen login pages where the password validation is done client-side — which either exposes the passwords to the end user via the ability to view page source or, at best, allows the end user to alter the form so that it always reports successful validation. Don't let your Web site security be a victim of client-side data validation. Server-side validation does not fall prey to the shortcomings of client-side validation because a malicious security cracker must already have gained access to the server to be able to compromise it.
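The point above can be made concrete with a small sketch. The field names and rules below are illustrative assumptions, not from the original post; the principle is simply that the server re-checks everything it receives, no matter what JavaScript ran (or didn't run) in the browser:

```python
import re

# Illustrative server-side rules; a real site would tailor these to its needs.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,20}$")

def validate_login_form(form):
    """Re-validate submitted fields on the server.

    Returns a list of error messages; an empty list means the input passed.
    This runs regardless of any client-side JavaScript checks, so a crafted
    form or a browser with JavaScript disabled gains nothing.
    """
    errors = []
    username = form.get("username", "")
    password = form.get("password", "")
    if not USERNAME_RE.fullmatch(username):
        errors.append("invalid username")
    if len(password) < 8:
        errors.append("password too short")
    return errors
```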
3>Manage your Web site via encrypted connections:
Using unencrypted connections (or even connections using only weak encryption), such as unencrypted FTP or HTTP for Web site or Web server management, opens you up to man-in-the-middle attacks and login/password sniffing. Always use encrypted protocols such as SSH to access secure resources, using verifiably secure tools such as OpenSSH. Once someone has intercepted your login and password information, that person can do anything you could have done.
4>Use strong, cross-platform compatible encryption:
Believe it or not, SSL is not the top-of-the-line technology for Web site encryption any longer. Look into TLS, which stands for Transport Layer Security — the successor to Secure Socket Layer encryption. Make sure any encryption solution you choose doesn’t unnecessarily limit your user base, the way proprietary platform-specific technologies might, as this can lead to resistance to use of secure encryption for Web site access. The same principles also apply to back-end management, where cross-platform-compatible strong encryption such as SSH is usually preferable to platform-specific, weaker encryption tools such as Windows Remote Desktop.
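In Python, for example, the standard `ssl` module can be asked for a modern TLS configuration in a couple of lines; this is a minimal sketch of the "prefer TLS, reject the old stuff" advice above:

```python
import ssl

# create_default_context() enables certificate verification and sensible
# defaults; pinning the minimum version refuses SSLv3 and TLS 1.0/1.1.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# context.wrap_socket(sock, server_hostname="example.com") would then
# perform the TLS handshake with certificate and hostname checking.
```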
5>Connect from a secured network:
Avoid connecting from networks with unknown or uncertain security characteristics or from those with known poor security such as open wireless access points in coffee shops. This is especially important whenever you must log in to the server or Web site for administrative purposes or otherwise access secure resources. If you must access the Web site or Web server when connected to an unsecured network, use a secure proxy so that your connection to the secure resource comes from a proxy on a secured network. In previous articles, I have addressed how to set up a quick and easy secure proxy using either an OpenSSH secure proxy or a PuTTY secure proxy.

6> Prefer key-based authentication over password authentication:
Password authentication is more easily cracked than cryptographic key-based authentication. The purpose of a password is to make it easier to remember the login credentials needed to access a secure resource — but if you use key-based authentication and only copy the key to predefined, authorized systems (or better yet, to separate media kept apart from the authorized system until it's needed), you will use a stronger authentication credential that's more difficult to crack.

7>Maintain a secure workstation:
If you connect to a secure resource from a client system that you can’t guarantee with complete confidence is secure, you cannot guarantee someone isn’t “listening in” on everything you’re doing. Keyloggers, compromised network encryption clients, and other tricks of the malicious security cracker’s trade can all allow someone unauthorized access to sensitive data regardless of all the secured networks, encrypted communications, and other networking protections you employ. Integrity auditing may be the only way to be sure, with any certainty, that your workstation has not been compromised.

In order for anyone to attack your web site, there has to be a way in - an unguarded doorway into your server.
There are only three places where these doorways exist:
1. On your local computer that you use to upload your web pages.
2. Through any form field that you use on your web site to collect information from your visitors.
3. At the physical location where your server is located.

With virus and worm attacks there also has to be a doorway in, but the doorway exists primarily in the software your web site visitors use to view your site (Microsoft Internet Explorer), not in the site itself.
You can guard against the virus replicating itself on your mail server or the worm being downloaded to your local network or computer by using firewall software, but responding to virus and worm attacks consists mainly of educating yourself and your co-workers to prevent more infections and waiting for Microsoft to patch their software.

On the other hand, if you're running your own server, you need to stay on top of potential new security threats from viruses and worms daily to close any potential doorways into your server (covered in server security). The good news is that if you're just programming a web site (as the majority of webmasters are), there are many straightforward and relatively simple methods you can use to protect your site and close the doorways in.
Source: http://blogs.techrepublic.com.com/security/?p=424

Thursday, September 11, 2008

How Important are backlinks?

First, let me introduce the concept of backlinks.

Backlinks are links that are directed towards a webpage, popularly called inbound links. The number of backlinks to a webpage acts as a vote for that page, and also adds to the page's popularity.

Google's first published algorithm is based on PageRank. According to the PageRank algorithm, inbound links are counted as votes for a page, and outbound links as votes that the page gives to other pages. Since backlinks are an immense part of the PageRank algorithm, they are important from a search engine optimizer's point of view.
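The voting idea above can be sketched in a few lines. This is a toy illustration of the PageRank concept (the three-page graph and damping factor are made up for the example), not Google's actual implementation: each page splits its score evenly among its outbound links, and the scores are iterated until they settle.

```python
# Toy sketch of the PageRank idea: inbound links act as votes, and a page
# divides its own score among the pages it links out to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score, plus damped vote shares.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outbound in links.items():
            if not outbound:
                continue
            share = damping * rank[page] / len(outbound)
            for target in outbound:
                new_rank[target] += share
        rank = new_rank
    return rank

# Illustrative graph: B and C both "vote" for A, so A ends up ranked highest.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}
scores = pagerank(graph)
```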

Backlinks need to be quality backlinks for better ranking of a webpage.

According to search engines, a backlink is a quality backlink if it is relevant to the keyword for which it was created, and if the theme of the voting website is similar to the theme of the voted website. Thus we cannot be satisfied with merely getting inbound links; it is the quality of those inbound links that matters. An inbound link becomes more relevant if it comes from a site whose content is related to your site. Inbound links found on sites with unrelated content are considered less relevant. The higher the relevance of inbound links, the greater their quality.


Quality backlinks can only be created with time. While it is fairly easy to manipulate links on a web page to try to achieve a higher ranking, it is very difficult to influence a search engine with backlinks from other websites. This is also a reason why backlinks factor so heavily into a search engine's algorithm. Lately, however, search engines' criteria for quality inbound links have gotten even tougher, thanks to unscrupulous webmasters trying to achieve these inbound links through deceptive or sneaky techniques, such as hidden links or automatically generated pages whose sole purpose is to provide inbound links to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.

Tips for Quality Backlinks

1. Reciprocal linking:

Many times webmasters agree to reciprocal link exchanges in order to boost their websites' rankings. This is a kind of link exchange in which a webmaster places a link on his website pointing to another webmaster's website, and vice versa. It is possible that such links are not relevant. Major search engines like Google strongly oppose this kind of irrelevant linking and regularly update their algorithms to filter it out.


2. Keep track of backlinks:
While running your backlink building campaign, it is important to keep track of your backlinks and of how the anchor text of each backlink incorporates keywords relating to your site.
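A small sketch of that kind of tracking, using only Python's standard library: given the HTML of a page that links to you, collect the anchor text of every link pointing at your domain. The `example.com` domain and sample HTML are placeholders for your own site and a real backlink page.

```python
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Collect the anchor text of every link whose href mentions a domain."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.anchors = []            # anchor texts of links to our domain
        self._in_target_link = False
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if self.target_domain in href:
                self._in_target_link = True
                self._buffer = []

    def handle_data(self, data):
        if self._in_target_link:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_target_link:
            self.anchors.append("".join(self._buffer).strip())
            self._in_target_link = False

# Placeholder backlink page; in practice you would fetch the linking page.
html = '<p>Read this <a href="http://example.com/page">great article</a> now.</p>'
collector = AnchorTextCollector("example.com")
collector.feed(html)
```

Reviewing `collector.anchors` across your backlink pages shows whether the anchor text actually carries the keywords you care about.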

Recommended resource