Showing posts with label News.

North Pole of Mars Mysteries Revealed

North Pole of Mars (NASA/ Caltech/ JPL/ E. DeJong/ J. Craig/ M. Stetson)

VIVAnews - The giant trench-like spiral troughs at Mars' north pole, which resemble a huge windmill, are no longer a mystery. Scientists now have an explanation for this feature of the region known as Mare Boreum and its giant valleys.

As reported by Space.com on May 26, 2010, the valleys, whose spiral grooves resemble a snail's shell, formed through a very long process.

The giant valley formed through much the same process as the smaller valleys; known as Chasma Boreale, it already existed at the same location.

The process took place over a long period under the influence of the sun and wind. The Red Planet's climate is also thought to have made possible the formation of these previously mysterious valleys. Erosion over a very long time is another cause of the giant canyons' formation.

Initially, researchers thought the valleys were much older geological features, but they could not prove it and could only guess.
Isaac Smith, a planetary scientist at the University of Texas at Austin, said that until now that view could not be proven. "They were judging a book by its cover," said Smith.

Radar technology, however, opened scientists' eyes. It allows them to capture two-dimensional (2D) images, starting with cross sections of the layers in the trench walls. Radar also helps trace the geometry of underground structures, which can then be used to build three-dimensional views.

The steep-walled valleys between the cliffs run for a total of approximately 1,000 kilometers, and the ice there is about two miles thick. The radar results indicate that the steep valleys formed long before the shallower spiral troughs appeared.

Earlier research had suggested that the valleys were carved by large-scale melting of ice over a long period. The problem is that the radar studies found no evidence of any remains of melted ice.

Six Space 'Tourism' Companies

 (AP Photo/Kyodo News)

VIVAnews - A new era of commercial spaceflight has begun. A Falcon 9 rocket built by Space Exploration Technologies (SpaceX) was launched from Florida on Friday, June 4, 2010.

SpaceX is not alone in aiming to send cargo and astronauts into space.

The United States space agency, NASA, has asked SpaceX and another company, Virginia-based Orbital Sciences, to build unmanned rockets to send cargo to the International Space Station.

There are also plans to modify the design of Lockheed Martin's Orion capsule to serve as a space station lifeboat. Meanwhile, aerospace giant Boeing also hopes to provide commercial crewed spaceflight.

Several smaller companies also have ambitions to build rockets that can carry humans into space. Six companies are already on the list.

1. Space Exploration Technologies (SpaceX)

Spacecraft: Dragon and Falcon 9; can carry up to seven passengers, or fewer if cargo is also carried.

Founder: Elon Musk, co-founder of the online money-transfer service PayPal. Funding: roughly US$100 million of Musk's own wealth plus US$20 million from outside investors.

Location: Hawthorne, California, began operations in 2002

Flights: debut rocket launch in 2010; the first operational flights are planned for 2011.

SpaceX's two unmanned vehicles, the Falcon 9 rocket and the Dragon capsule, were originally intended to transport cargo to the International Space Station. Dragon could be ready to fly astronauts within three years of receiving a contract from NASA.

2. Orbital Sciences

Spacecraft: Cygnus and Taurus 2; Cygnus is designed as an unmanned vehicle.

Founders: David W. Thompson, Bruce W. Ferguson, and Scott L. Webster. Funding: approximately US$1 billion.

Location: Dulles, Virginia; began operations in 1982
Planned flight to space: 2011

Orbital Sciences has a US$1.9 billion contract with NASA to provide eight cargo delivery missions to the International Space Station using Cygnus and Taurus 2. The first launch is planned for 2011 from Wallops Island, Virginia.

Orbital has not announced any plans to convert Cygnus into a crewed vehicle.

3. Blue Origin

Spacecraft: New Shepard; can carry at least three astronauts
Founder: Jeff Bezos, who is also the founder of Amazon.com
Location: Kent, Washington, began operations in 2004
Planned rocket launch: mid-2010

The company keeps information about its human spaceflight plans tightly under wraps. However, Blue Origin has been testing a prototype vehicle, the New Shepard, in Texas.

Earlier this year, NASA chose Blue Origin to develop an astronaut escape system and build a prototype composite space capsule as part of its commercial crew program.

4. Bigelow Aerospace

Spacecraft: Sundancer, which can carry three crew, and BA-330, which can carry six passengers.
Founder: Robert Bigelow. Funding: about US$180 million of Bigelow's own wealth.
Location: North Las Vegas, Nevada, began operations in 1999
Planned launch: 2015

Sundancer and BA-330 are intended to serve as space stations rather than rockets. Company founder Robert Bigelow dreams of developing stations, even on the Moon, using inflatable modules that can be pumped up.

Although Bigelow does not have a rocket or spacecraft of its own to fly to its stations, the company has worked with Boeing on transporting crew members.

5. SpaceDev / Sierra Nevada Corp.

Spacecraft: Dream Chaser; can carry four passengers on suborbital flights and six on orbital flights.

Founder: Jim Benson (deceased); the company is now led by Fatih Ozmen
Location: Poway, California, began operations in 1997
Planned flight: under development

California-based SpaceDev is wholly owned by Sierra Nevada Corp. The company is developing the Dream Chaser, a craft that can carry crew and cargo into space atop an Atlas 5 rocket.

In February, Sierra Nevada won a US$20 million award from NASA to continue development of the Dream Chaser.

6. Virgin Galactic

Spacecraft: SpaceShipTwo; can carry six passengers and two pilots.

Owner: British billionaire Sir Richard Branson, who also provides the funding.

Location: London, England, and Spaceport America, New Mexico; began operations in 2004.

Planned flights: late 2011 or early 2012.

Virgin Galactic's craft is designed for tourist trips into space. Would-be space tourists are required to pay approximately US$200,000 per seat.

Meanwhile, its carrier aircraft, WhiteKnightTwo, can be modified to launch small rockets or satellites for NASA or other users.

SpaceShipTwo was designed by aerospace engineer Burt Rutan. It is a larger version of SpaceShipOne, which flew successful suborbital flights in 2004.

Arriving Soon: The Solar Storm Threat

Solar storms (BBC)

VIVAnews - Solar activity is set to intensify, and that will have negative effects on the Earth.

To prepare for the worst, leading solar scientists gathered in Washington, DC, on Tuesday, June 8, 2010, to discuss the best ways to protect satellites and the Earth's other vital systems from solar storms.

Solar storms occur when regions of the sun erupt and spew out particles that can cause damage. This activity follows an 11-year cycle.

"The sun has got up from bed length. And in the next few years we will see solar activity in the higher level," said the head of NASA's Heliophysics Division, Richard Fisher, like the Christian Science Monitor published pages.

"At the same time, technological society is developing a new sensitivity to face the storm the sun."

Twenty-first-century society relies heavily on high-tech systems in everyday life, and those systems are susceptible to solar storms.

GPS navigation, air travel, financial services, and emergency radio communications could all be knocked out suddenly by solar activity.

The economic damage caused by a severe solar storm could be twenty times greater than that of Hurricane Katrina, the National Academy of Sciences warned in a 2008 report.

Fortunately, much of the damage can be avoided if we know when a storm is coming. That is why a better understanding of solar weather, and a better ability to issue early warnings, is so important.

Satellites can be placed in 'safe mode' and transformers disconnected to protect electronics from damaging power surges.

"Space weather forecast is still under development, but we're making rapid progress," said Thomas Bogdan, director of the National Oceanic and Atmospheric Administration (NOAA).

NASA and NOAA are working together to manage a fleet of satellites that monitor the sun and help predict changes in solar activity.

A pair of spacecraft called STEREO (Solar Terrestrial Relations Observatory) is positioned on opposite sides of the sun, and together the two probes can view about 90 percent of the solar surface.

In addition, the Solar Dynamics Observatory (SDO), launched in February 2010, can produce new images of active regions on the solar surface.

An older satellite, the Advanced Composition Explorer (ACE), launched in 1997, is also still monitoring the sun.

"I believe we are on the verge of new era where space weather can affect our lives everyday like usual weather of the earth." Fisher said. "For us, this is very serious." (Hs)

Revealed: Alien Life May Once Have Existed on Titan

  Illustration of the lake on the surface of Titan (NASA)

VIVAnews - Titan, Saturn's largest moon, has long been suspected of harboring signs of life.

That suspicion may well be right. Scientists at NASA have revealed findings that point toward evidence of life on Saturn's largest moon.

NASA scientists believe they have found vital evidence indicating that primitive alien life may once have existed on Titan.

As reported by the Daily Telegraph on June 5, 2010, data from NASA's Cassini probe were used to analyze the complex chemistry on the surface of Titan, the only moon known to have a dense atmosphere.

The scientists found that any living beings on Titan, whatever they might be, would breathe the atmosphere and take their food from Titan's surface.

Previously, astronomers had argued that the moon is generally far too cold to support life, so cold that no liquid water can flow on its surface.

The studies of Titan are described in two separate papers.

The first report, published in the journal Icarus, reveals that hydrogen gas flowing through Titan's atmosphere disappears at the surface. This suggests that alien organisms could be breathing it.

Meanwhile, the second report, in the Journal of Geophysical Research, concluded that certain chemicals on Titan's surface are present only in minimal amounts. The shortfall is suspected to be due to those compounds being consumed by creatures on Titan.

Chris McKay, a NASA astrobiologist at Moffett Field, California, who led the study, said, "We concluded that hydrogen consumption is occurring. It is clear that the gas is being consumed on Titan, just as humans consume oxygen on Earth."

"If indeed this is a sign of life, it will be very interesting because it means that could prove the existence of these two life forms - which are different from life on Earth which sustained the water," said McKay.

Meanwhile, Professor John Zarnecki of the Open University said the conclusion that there is life on Titan is getting closer.

"We are convinced the chemicals that currently there is the composer's life. Just add the element of heat and warmth to begin that process.

"In the past four billion years, when the Sun turned into a red giant, Titan could be a paradise," he added.

However, scientists warn there may be other explanations for these findings.

Google's 'Spying' Sparks Strong Protests

(AP Photo/Jae C. Hong)

VIVAnews - Google's 'spying' on Internet users, carried out by vehicles tasked with collecting data for Google Street View, has drawn strong protests from various parties.

Germany's consumer protection minister, Ilse Aigner, protested strongly to Google over the privacy violations and urged the company to cooperate more fully with the authorities responsible for protecting consumer data.

"Based on the information we collect, Google has to penetrate into the private network, which is done by illegal ways," Aigner said, quoted from the AP. He also accused Google to withhold information requested by regulators in Germany.

About two weeks earlier, Google had told the German consumer protection authorities that its Street View vehicles collected only WiFi network names (SSIDs) and the unique addresses of WiFi routers (MAC addresses).

At the time, Google said that its Street View cars did not collect the information sent over those WiFi networks (the data payload).

However, in a recent post on the company's blog, Google admitted that, by mistake, it had also been collecting data traveling over the unencrypted WiFi networks it scanned.

"We are one of the samples had been collected the data payload from the open WiFi network (which is not protected by a password), although we also never use the data to Google's products," wrote Alan Eustace, Senior Vice President, Engineering Research Google .

This means that fragments of email or web pages accessed by people using open WiFi networks that Google's Street View cars passed could be read by Google.


Google Street View is a Google service that provides photographs of various places taken by Google's camera cars. The service has provoked controversy in Germany and several other countries, where privacy advocates fear that people will be caught on Google's cameras.

According to Eustace, the error was inadvertent. In 2006, a Google engineer working on an experimental project wrote a program capable of sampling all categories of data broadcast over WiFi.

A year later, when Google began collecting WiFi SSID and MAC address data for Street View, that experimental code was included in the new project's software, so payload data sent over open WiFi networks was also collected by Google.

Minister Aigner asked Google to follow the rules and declare its activities to consumer protection authorities around the world.

So far, Google has collected about 600 GB of data from WiFi networks in more than 30 countries, including the US. Google says its team will delete all of the data.

"Google's technical team really worked hard to earn your trust, and we are very aware we are failing in this regard. We are very, very sorry for this error and decided to learn from mistakes that we do." In addition Google has also decided to stop the activities of its Street View vehicle. (Hs)

10 Tips for Safe on Twitter

TEMPO Interactive, Jakarta - Twitter is growing very quickly in Indonesia. A report from ReadWriteWeb.com states that by early 2010 Indonesia had become the world's sixth-largest country by number of Twitter users.

That popularity has been accompanied by growing threats from hackers and cybercriminals. If even President Barack Obama's Twitter account can be hacked, so can yours.

"Only in four years, the popularity of Twitter increases rapidly around the world. The increase was made Twitter as an easy target cyber criminals," said Ema Linaker, Global Head of Online Engagement AVG, a computer security company, in his press release.

One type of threat that is quite common these days is the fake site, also known as phishing. Attackers typically hijack a victim's account and use it to send private messages, or Direct Messages (DMs), to that account's followers.

The DM contains a link that leads users to a fake site. From there, the theft of information and data begins.

Linaker offers ten tips to help Twitter users stay safe:

* Limit What You Tweet

Users often find it easy to tweet about their location and activities. Over time, these seemingly light comments can be used to work out a user's daily schedule and plans. Criminals are ready to use them to follow a victim.

* Beware of Opening Links

Many people use URL-shortening services on Twitter, so it is often hard to know where a link really leads. The LinkScanner feature in AVG Antivirus can be used to check these links. If you still have doubts, do not open the link.

* Stay Watchful

Be alert to suspicious activity in your timeline or inbox. If suspicious messages appear, or friends start sending spam-like messages, check with them to confirm whether their accounts have been compromised.

* Think Before You Tweet

A tweet can be read by anyone, even after it has been deleted. Think carefully about what you write.

* Do Not Be Too Trusting

You never know who your followers really are. Do not be quick to trust or befriend people who may have bad intentions.

* Check Third-Party Applications

There are hundreds of third-party Twitter applications. Before you use one, make sure it is secure. Look for applications that are widely reviewed on trusted sites. Remember that these applications always ask for your ID and password.

* Password Information

Use a different password and e-mail address for each of your social networking accounts. That way, when you close an account, you can easily delete the associated e-mail account as well.

* When Logging In

Check your browser settings and make sure your login information is not saved when you use a shared computer.

* Beware of Phishing Attacks

Beware of attempts to obtain personal information through a Tweet or DM.

* When Using Cell Phones

Be careful with your cell phone and with anyone who can use it. If you have a Twitter application installed, make sure you always log out after use.

Deddy Sinaga

Geologist Discovers Link Between Earth's Orbit and Climate

TEMPO Interactive, Jakarta - From an analysis covering the last 1.2 million years, University of California, Santa Barbara geologist Lorraine Lisiecki claims to have found an orderly pattern linking changes in the Earth's orbit to its climate cycles. The findings are reported in the journal Nature Geoscience.

Lisiecki analyzed marine sediment cores from 57 locations around the world. By analyzing the sediments, scientists can chart the Earth's climate millions of years into the past.

Lisiecki then linked that climate record to the history of the Earth's orbit. The data show that the shape of the Earth's orbit around the sun changes every 100,000 years, becoming more circular or more elongated over that interval.

This property is known as orbital eccentricity. A related aspect is the 41,000-year cycle in the tilt of the Earth's axis. Glaciations on Earth have also occurred every 100,000 years.

Lisiecki found that the timing of climate change and of changes in eccentricity coincide. "The clear correlation between the timing of changes in the Earth's orbit and changes in climate is strong evidence of a link between the two," Lisiecki concluded. "It is unlikely that these events would not be related to one another."

Beyond the link between changes in the orbit's shape and the onset of glaciation, Lisiecki found a surprising correlation: the largest glacial cycles occurred when the changes in the eccentricity of the Earth's orbit were weakest, and vice versa.

Strong changes in the Earth's orbit, in other words, were associated with weak climate change. "This may mean that the Earth's climate has internal instability in addition to its sensitivity to changes in the orbit," Lisiecki said.

She concluded that the pattern of climate change over the last million years likely involves complex interactions between different parts of the climate system, as well as three distinct orbital parameters: orbital eccentricity, axial tilt, and precession, the change in the orientation of the Earth's axis.

ScienceDaily | PURW

Is There an Alien Hiding in the Blood Glacier?



Antarctica, KOMPAS.com - In recent years, the spectacle of the 'Blood Glacier' has reappeared at a location on the Antarctic continent. The phenomenon lies in the McMurdo Dry Valleys, famous as a vast ice-free area and one of the most unusual regions of Antarctica.

Although the valleys lie near the South Pole, they are almost never covered in ice, because hurricane-force winds sweeping through them at up to 320 km/h strip away all the moisture.

Anyone walking up the valley, past the carcasses of penguins and other animals, eventually comes upon the 'Blood' glacier.

The blood-red glacier is said to have been discovered by Robert Scott's expedition team in 1911; its color was later shown to be caused by the oxidation of iron.

Reportedly, at certain intervals the glacier discharges a clear, iron-rich fluid that oxidizes on contact with the air and quickly turns a chilling dark red.

According to Discover Magazine, the liquid comes from a very salty lake trapped beneath 390 meters of ice.

Recent studies have found bacteria living in these harsh conditions, relying on sulfur and iron compounds for their survival.

According to researchers, when the glacier formed over the lake it created a cold, dark, oxygen-free ecological environment, and that community of bacteria has been isolated there for well over a million years.

The scientists also think that the bacteria producing the blood glacier point to the possibility of extraterrestrial life elsewhere in our solar system, for example beneath the polar ice of Mars or the icy shell of one of Jupiter's moons, where life may also exist.

The Standing Cat!

The cat family is known for its superb sense of balance. Cats are also said to have nine lives. But a cat that can stand on two legs is quite rare.

Except for this one cat!

Steps to Fix the Black Screen of Death in Windows 7

As the first party to flag the Black Screen of Death issue on Windows 7, Prevx Ltd. provides an automatic repair tool that can be downloaded from its site. Prevx also provides manual repair steps for users who have some technical computing knowledge. Follow these steps:


1. Restart the computer.

2. Log on and wait for the black screen to appear.

3. Make sure the computer is connected to the internet.

4. Press CTRL, ALT, and DEL simultaneously.

5. Click Start Task Manager.

6. In Task Manager, click the Application tab.

7. Then click New Task.

8. Enter the command: "C:\Program Files\Internet Explorer\iexplore.exe" "http://info.prevx.com/download.asp?GRAB=BLACKSCREENFIX".

9. Click OK and the download process begins immediately.

10. Run the newly downloaded program.

11. Restart the computer.

Mafia Wars: Special Energy Pack for Mafia Wars Toolbar

Special Energy Pack for Mafia Wars Toolbar

If you install the toolbar and keep it, you get one benefit: a Mini energy pack every 8 hours, which restores 25% of your total energy.




  • For the normal energy pack, you have to wait 23 hours, and it increases your energy by 125% of your maximum energy.
  • For the special energy pack, you only have to wait 8 hours, and it increases your energy by 25% of your maximum energy. It continues to give a special energy pack each time you use one.


To download the toolbar, click this link: Mafia Wars Toolbar.
Enjoy!

Year 2038 Problem: Unix Millennium bug, or Y2K38

What's wrong with Unix systems in the year 2038?
 
The typical Unix timestamp (time_t) stores a date and time as a 32-bit signed integer number representing, roughly speaking, the number of seconds since January 1, 1970; in 2038, this number will roll over (exceed 32 bits), causing the Year 2038 problem (also known as Unix Millennium bug, or Y2K38). To solve this problem, many systems and languages have switched to a 64-bit version, or supplied alternatives which are 64-bit.
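The short C program below is an illustrative sketch (not part of the original write-up); it mimics a 32-bit signed time_t with int32_t and prints the dates on either side of the critical second. Run on a machine whose native time_t is 64-bit, it simply shows what an affected 32-bit system would report:

#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    int32_t clock32 = INT32_MAX;   /* 2,147,483,647 seconds after the epoch = 03:14:07 GMT, Jan 19 2038 */
    time_t t = (time_t)clock32;
    printf("last valid second: %s", ctime(&t));   /* ctime prints in local time */

    /* Adding one more second no longer fits in 32 bits; on a two's-complement
       system the value wraps around to -2,147,483,648. */
    int32_t wrapped = (int32_t)((int64_t)clock32 + 1);
    t = (time_t)wrapped;
    printf("one second later : %s", ctime(&t));   /* reported as a date back in December 1901 */
    return 0;
}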

What is the year 2038 bug?

In the first month of the year 2038 C.E. many computers will encounter a date-related bug in their operating systems and/or in the applications they run. This can result in incorrect and grossly inaccurate dates being reported by the operating system and/or applications. The effect of this bug is hard to predict, because many applications are not prepared for the resulting "skip" in reported time - anywhere from 1901 to a "broken record" repeat of the reported time at the second the bug occurs. Also, leap seconds may make some small adjustment to the actual time the bug expresses itself. I expect this bug to cause serious problems on many platforms, especially Unix and Unix-like platforms, because these systems will "run out of time". Starting at GMT 03:14:07, Tuesday, January 19, 2038, I fully expect to see lots of systems around the world breaking magnificently: satellites falling out of orbit, massive power outages (like the 2003 North American blackout), hospital life support system failures, phone system interruptions (including 911 emergency services), banking system crashes, etc. One second after this critical second, many of these systems will have wildly inaccurate date settings, producing all kinds of unpredictable consequences. In short, many of the dire predictions for the year 2000 are much more likely to actually occur in the year 2038! Consider the year 2000 just a dry run. In case you think we can sit on this issue for another 30 years before addressing it, consider that reports of temporal echoes of the 2038 problem are already starting to appear in future date calculations for mortgages and vital statistics! Just wait til January 19, 2008, when 30-year mortgages will start to be calculated.

What causes it?

What makes January 19, 2038 a special day? Unix and Unix-like operating systems do not calculate time in the Gregorian calendar, they simply count time in seconds since their arbitrary "birthday", GMT 00:00:00, Thursday, January 1, 1970 C.E. The industry-wide practice is to use a 32-bit variable for this number (32-bit signed time_t). Imagine an odometer with 32 wheels, each marked to count from 0 to 1 (for base-2 counting), with the end wheel used to indicate a positive or negative integer. The largest possible value for this integer is 2**31-1 = 2,147,483,647 (over two billion). 2,147,483,647 seconds after Unix's birthday corresponds to GMT 03:14:07, Tuesday, January 19, 2038. One second later, many Unix systems will revert to their birth date (like an odometer rollover from 999999 to 000000). Because the end bit indicating positive/negative integer may flip over, some systems may revert the date to 20:45:52, Friday, December 13, 1901 (which corresponds to GMT 00:00:00 Thursday, January 1, 1970 minus 2**31 seconds). Hence the media may nickname this the "Friday the Thirteenth Bug". I have read unconfirmed reports that the rollover could even result in a system time of December 32, 1969 on some legacy systems!

What operating systems, platforms, and applications are affected by it?


A quick check with the following Perl script may help determine if your computers will have problems (this requires Perl to be installed on your system, of course):


#!/usr/bin/perl
#
# I've seen a few versions of this algorithm
# online, I don't know who to credit. I assume
# this code to be GPL unless proven otherwise.
# Comments provided by William Porquet, February 2004.
# You may need to change the line above to
# reflect the location of your Perl binary
# (e.g. "#!/usr/local/bin/perl").
# Also change this file's name to '2038.pl'.
# Don't forget to make this file +x with "chmod".
# On Linux, you can run this from a command line like this:
# ./2038.pl
use POSIX;
# Use POSIX (Portable Operating System Interface),
# a set of standard operating system interfaces.
$ENV{'TZ'} = "GMT";
# Set the Time Zone to GMT (Greenwich Mean Time) for date calculations.
for ($clock = 2147483641; $clock < 2147483651; $clock++)
{
    print ctime($clock);
}
# Count up in seconds of Epoch time just before and after the critical event.
# Print out the corresponding date in Gregorian calendar for each result.
# Are the date and time outputs correct after the critical event second?

I have only seen a mere handful of operating systems that appear to be unaffected by the year 2038 bug so far. For example, the output of this script on Debian GNU/Linux (kernel 2.4.22):
# ./2038.pl
Tue Jan 19 03:14:01 2038
Tue Jan 19 03:14:02 2038
Tue Jan 19 03:14:03 2038
Tue Jan 19 03:14:04 2038
Tue Jan 19 03:14:05 2038
Tue Jan 19 03:14:06 2038
Tue Jan 19 03:14:07 2038
Fri Dec 13 20:45:52 1901
Fri Dec 13 20:45:52 1901
Fri Dec 13 20:45:52 1901

Windows 2000 Professional with ActivePerl 5.8.3.809 fails in such a manner that it stops displaying the date after the critical second:
C:\>perl 2038.pl
Mon Jan 18 22:14:01 2038
Mon Jan 18 22:14:02 2038
Mon Jan 18 22:14:03 2038
Mon Jan 18 22:14:04 2038
Mon Jan 18 22:14:05 2038
Mon Jan 18 22:14:06 2038
Mon Jan 18 22:14:07 2038

So far, the few operating systems that I haven't found susceptible to the 2038 bug include very new versions of Unix and Linux ported to 64-bit platforms. Recent versions of QNX seem to take the temporal transition in stride. If you'd like to try this 2038 test yourself on whatever operating systems and platforms you have handy, download the Perl source code here. A gcc-compatible ANSI C work-alike version is available here. A Python work-alike version is available here. Feel free to email your output to me for inclusion on a future revision of this Web page. I have collected many reader-submitted sample outputs from various platforms and operating systems and posted them here.

For a recent relevant example of the wide-spread and far-reaching extent of the 2038 problem, consider the Mars rover Opportunity that had a software crash which resulted in it "phoning home" while reporting the year as 2038 (see paragraph under heading "Condition Red").
A large number of machines, platforms, and applications could be affected by the 2038 problem. Most of these will (hopefully) get decommissioned before the critical date. However, it is possible that some machines going into service now, or legacy systems which have never been upgraded due to budget constraints, may still be operating in 2038. These may include process control computers, space probe computers, embedded systems in traffic light controllers, navigation systems, etc. It may not be possible to upgrade many of these systems. For example, Ferranti Argus computers survived in service long enough to present serious maintenance problems. Clock circuit hardware which has adopted the Unix time convention may also be affected if 32-bit registers are used. While 32-bit CPUs may be obsolete in desktop computers and servers by 2038, they may still exist in microcontrollers and embedded circuits. For instance, when I last checked in 1999, the Z80 processor was still available as an embedded function within Altera programmable devices. Such embedded functions present a serious maintenance problem for all rollover issues like the year 2038 problem, since the package part number and other markings usually give no indication of the device's internal function. Also, I expect we've already forgotten how many devices are running strange mutations of embedded Microsoft Windows. I can recall encountering some telephony devices and printers running Windows CE and NT under the hood, just off the top of my head. And don't forget emulators that allow older code (both applications and operating systems) to run on newer platforms!
Many Intel x86 platforms have BIOS date issues as well. The Linux BIOS time utility hwclock has issues around the critical second in 2038 too (DO NOT try this on a production system unless you REALLY know what you're doing):
[root@alouette root]# hwclock --set --date="1/18/2038 22:14:06"
[root@alouette root]# hwclock --set --date="1/18/2038 22:14:07"
RTC_SET_TIME: Invalid argument
ioctl() to /dev/rtc to set the time failed.
[root@alouette root]# hwclock --set --date="1/18/2038 22:14:08"
date: invalid date `1/18/2038 22:14:08'
The date command issued by hwclock returned unexpected results.
The command was:
  date --date="1/18/2038 22:14:08" +seconds-into-epoch=%s
I performed this test on an Intel Celeron laptop which has a Toshiba BIOS. Note that trying to set the BIOS hardware to the "critical second" (relative to my time zone) resulted in a different error than setting the BIOS to one second later (although that failed too). Usually Linux systems do not rely too heavily on the BIOS clock and will try to synchronize themselves to an NTP server after boot anyway. Again I must emphasize, do not play with the date on a production server! This story may help illustrate why one should never try critical date testing on production machines...
"Three years ago, I had several servers timed to a single (note I say single) NTP located at a university that, unbeknownst to me, was checking for the Unix equivalent of the Y2K bug that will occur in the year 2038. Well, perhaps the university didn't realize that businesses such as mine actually rely on accurate data. During their test, they advanced their real time clock to the year 2038, and suddenly, without warning, each of my log files started writing the year 2038 instead of 1999. This caused massive problems for systems across the network. The lesson learned was that most NTP clients have the ability to reference two or more stratums, writing the average of difference to the software clock, and where the difference is [significantly] out of sync, the client will either compensate to the clock closest to the last known value or exit (1), leaving the local clock(s) unchanged." - The Importance of Choosing an Accurate NTP
I believe the year 2038 problem will more likely result in air traffic control disasters, life-support systems failure, and power grid meltdown than the year 2000 problem. The year 2000 problems often involved higher-level application programs, disrupting inventory control, credit card payments, pension plans, and the like. The 2038 problem may well cause more serious problems because it involves the basic system timekeeping functions from which most other time and date information is derived. Databases using 32-bit Unix time may survive through 2038, and care will have to be taken in these cases to avoid rollover issues. Some problems related to the year 2038 have already started to show themselves, as this quote from the Web site 2038bug.com illustrates:
The first 2038 problems are already here. Many 32-bit programs calculate time averages using (t1 + t2)/2. It should be quite obvious that this calculation fails when the time values pass 30 bits. The exact day can be calculated by making a small Unix C program, as follows:
echo 'long q=(1UL<<30);int main(){return puts(asctime(localtime(&q)));};' > x.c && cc x.c && ./a.out
In other words, on the 10th of January 2004 the occasional system will perform an incorrect time calculation until its code is corrected. Thanks to Ray Boucher for this observation.
The temporary solution is to replace all (t1 + t2)/2 with (((long long) t1 + t2) / 2)(POSIX/SuS) or (((double) t1 + t2) / 2) (ANSI). (Note that using t1/2 + t2/2 gives a roundoff error.)
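The following C program is an illustrative sketch (not from 2038bug.com) that compares the naive 32-bit average with the widened version recommended above, using timestamps from shortly after 10 January 2004, when 32-bit Unix time passed 2**30:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int32_t t1 = 1073741900;
    int32_t t2 = 1073742000;

    /* t1 + t2 = 2,147,483,900, which no longer fits in a signed 32-bit value. */
    uint32_t raw_sum = (uint32_t)t1 + (uint32_t)t2;
    int32_t  naive   = (int32_t)raw_sum / 2;    /* what (t1 + t2) / 2 effectively yields on typical
                                                   two's-complement systems: a negative, bogus time */
    int64_t  fixed   = ((int64_t)t1 + t2) / 2;  /* widen before adding, as suggested above */

    printf("naive 32-bit average: %d\n", (int)naive);
    printf("widened average     : %lld\n", (long long)fixed);
    return 0;
}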
Some Unix vendors have already started to use a 64-bit signed time_t in their operating systems to count the number of seconds since GMT 00:00:00, Thursday, January 1, 1970 C.E. Programs or databases with a fixed field width should probably allocate at least 48 bits to storing time values. 64-bit Unix time would be safe for the indefinite future, as this variable won't overflow until 2**63 or 9,223,372,036,854,775,808 (over nine quintillion) seconds after the beginning of the Unix epoch - corresponding to GMT 15:30:08, Sunday, December 4, 292,277,026,596 C.E. This is a rather artificial and arbitrary date, considering that it is several times the average lifespan of a sun like our solar system's, the very same celestial body by which we measure time. The sun is estimated at present to be about four and a half billion years old, and it may last another five billion years before running out of hydrogen and turning into a white dwarf star.
A recent example of the 2038 problem documented on Wikipedia:
In May, 2006, reports surfaced of an early Y2038 problem in the AOLServer software. The software would specify that a database request should "never" timeout by specifying a timeout date one billion seconds in the future. One billion seconds after 21:27:28 on 12 May, 2006 is beyond the 2038 cutoff date, so after this date, the timeout calculation overflowed and calculated a timeout date that was actually in the past, causing the software to crash.

What can I do about it?

If you are a programmer or a systems integrator who needs to know what you can do to fix this problem, here is a checklist of my suggestions (which come with no warranty or guarantee):
Consider testing your mission-critical code well ahead of time on a non-production test platform set just before the critical date, or with utilities such as the FakeTime Preload Library. FTPL "...intercepts various system calls which programs use to retrieve the current date and time. It can then report faked dates and times (as specified by you, the user) to these programs. This means you can modify the system time a program sees without changing the time system-wide" [emphasis theirs].
An organization called The Open Group (formerly X/Open), which maintains the Unix specification and trademark, has a number of programming recommendations which should be followed by developers to deal with the year 2038 and other problematic dates.
Also, see this article regarding Solutions to the Year 2000 Problem by my colleague Steve Manley. Many of his suggestions can be applied to the 2038 problem too. I would also suggest this very concise and well-written essay on the 2038 problem by a programmer named Roger M. Wilcox.
If you are working with Open Source code, this free library may be a useful reference for patching existing code for high-accuracy longterm time calculation: "libtai is a library for storing and manipulating dates and times. libtai supports two time scales: (1) TAI64, covering a few hundred billion years with 1-second precision; (2) TAI64NA, covering the same period with 1-attosecond precision. Both scales are defined in terms of TAI, the current international real time standard." An attosecond, defined in U.S. usage, is one quintillionth (10**-18) of a second (it takes a very fast stopwatch for a researcher to clock those pesky photons). This is the kind of good timekeeping one might need for deep-space probes or high-reliability systems.
For more general applications, just using large types for storing dates will do the trick in most cases. For example, in GNU C, 64-bits (a "long long" type) is sufficient to keep the time from rolling over for literally geological eons (I can hear Carl Sagan in my head saying "beeeelions... and beeeelions..."). This just means any executables the operating systems runs will always get the correct time reported to them when queried in the correct manner. It doesn't stop the executables from having date issues of their own.
"The best way to predict the future is to engineer it." There will always be stuff we won't foresee, or simply didn't know about. Even this most exhaustive list of critical dates by Dr. Stockton may miss some critical dates - perhaps you work with emerging, proprietary, or closed source code. Check Dr. Stockton's list twice if you need to worry about "five-nines" reliability: you'd be surprised how many issues there are just within two centuries of date calculation. A lot of modern computers do a considerable amount of work within these years, since it covers a couple of recent human generations of calculable human demographics. Software for genealogy, mortgage calculation, vital statistics, and many other applications may need to frequently and reliably peer a century or two forward or backward.
Good luck, and I hope no one's flying car breaks down in 2038! :-)

How is the 2038 problem related to the John Titor story?

If you are not familiar with the name John Titor, I recommend you browse the site JohnTitor.com. I understand that my site's URL appears quoted a number of times in the discussion of this apparently transtemporal Internet celebrity. I don't know John Titor, and I have never chatted with anyone on the Internet purporting to be John Titor. The stories have not convinced me so far that he has traveled from another time. I also have some technical issues with his rather offhanded mention of the 2038 problem. Furthermore, John Titor has conveniently returned to his time stream and can not answer email on the subject. Having said that, I think it fair to say that I find John Titor's political commentary insightful and thought-provoking, and I consider him a performance artist par excellence.

Sources: The Project 2038

Complete SEO for blogs and sites

SEO steps you need to take after you download a template and upload it to your blog:
After uploading your template successfully, stay on the same Dashboard>Layout>Edit HTML page and analyze the template code. Find the META tags provided in the opening lines of the code. META tags start with 'meta'. Then do the following:




  • Provide your blog description in the Meta Description tag by replacing "Your Blog description..."
  • Provide words relevant to your blog in the Meta Keywords tag by replacing "Your keywords..."
  • Provide your name/company in the Meta Author tag by replacing "Yourname/company"
After you have entered the description, keywords, and your name/company, click the "PREVIEW" button to make sure you've entered the information properly in your blog. Click "SAVE TEMPLATE" when you're done.

The second step is to submit your blog/site to the three major search engines [Google | Yahoo | Bing] so it gets indexed and attracts traffic. You can do this by submitting only the index/main page of your site using their submission links below:



That`s it!

If you want to customize and search engine optimize your blog/site more deeply, you can follow my complete Customization and SEO guide provided below. You will need some HTML, XML, and CSS skills to work through it.


COMPLETE CUSTOMIZATION AND SEO GUIDE FOR BLOG/SITE

HTML size
 

Page size matters because search engines limit the size of a cached page. For example, Google will only cache a full page if the size of its HTML is less than 101 Kb (images and external scripts are not included). Yahoo! caches text of up to 500 Kb per page. This means if your HTML page is too large, search engines will not cache the full page, and only the top part of the text will be searchable.


Same color text and background
If the color of the text on a page is close to the background color, the text becomes almost invisible. As a rule, this technique is employed to populate a page with keywords without damaging its design. Since it is considered as spam by most search engines, I suggest that you do not try it.


Tiny text
If a page uses Cascading Style Sheets and there are fonts smaller than 4 pixels, they are reported as tiny texts. Most search engines consider tiny texts as an abusive practice - this is why you should avoid using them.

Immediate keyword repeats
The same keyword repeated one after the other a few times, for example air tickets on-line, air tickets, air tickets, air tickets, air tickets in Hong Kong is a questionable trick. For this example, there will be three repetitions reported, because the keyword was placed three times in a row after it was used first. Such repetitions are considered as spam by most search engines.

Controls
Try to avoid too many controls on your page, especially in the top area, since it may decrease your keyword prominence and result in low rankings.

Frames
Not all search engines support frames, i.e. can follow from a frameset page to content frames and index texts. If your website consists of frames, and you cannot redesign it, you can solve this problem by putting the content of an optimized page with links to other pages into a "NOFRAMES" HTML tag.

External and Internal JavaScript
Do not use too many embedded scripts on the page, because your keyword prominence will be reduced, and thus your page will be ranked lower on search engines. We advise putting the script in an external file or move it as close to the closing Body tag as possible.

External and Internal VBScript

Please note that excessive use of scripts in the top area of the page dilute keyword prominence and therefore affect your rankings. Put the script in an external file or move it as close to the closing Body tag as possible.

File robots.txt allows spidering
Robots.txt is a text file placed in the root directory of a website/blog to tell robots on how to spider the website. Only robots that comply with the Robots Exclusion Standard will read and obey the commands in this file. Robots.txt is often used to prevent robots from visiting some pages and subdirectories not intended for public use. However, if you want search engine robots to spider your site, there should not be disallowing commands included within this file for all or particular search engine robots.
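For illustration only (the sample below is not part of the original guide), a robots.txt that lets all robots spider everything except one private directory would look like this:

User-agent: *
Disallow: /private/

An empty Disallow line (Disallow: with nothing after it) places no restrictions at all, while Disallow: / would block the whole site.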

Head area
Each HTML document should have a HEAD tag at the beginning of each document. The information contained inside the head tag (...) describes the document, but it doesn't show up on the page returned to the browser. The Title tag and meta tags are found inside the Head tag.

An HTML tag within the Head tag is used to define the title of a Web page. The content of the Title tag is displayed by browsers on the Title bar located at the top of the browser window. Search engines use the Title tag to provide a link to the site matching the user's query. The text in the Title tag is one of the most important factors influencing search engine ranking algorithms. By populating your most important keywords in the Title tag, you dramatically increase the search engine ranking of the page for those keywords. Your title tag should be the first tag in the HEAD tag.

Stop Words
To save space and speed up searching, some search engines exclude common words from their index, therefore these words are ignored when searches are carried out.

'The', 'or', 'in', 'it' are examples of such words. These words are known as "stop words." To make your pages search engine-friendly, you should avoid using stop words in the most important areas of your page like title, meta tags, headings, alternative image attributes, anchor names, etc.

Besides, stop words have no contextual meaning - using them in short areas such as a title, headings, and anchor texts will reduce weight, prominence and the frequency of keywords.

Keyword frequency
Frequency is the number of times your keyword is used in the analyzed area of the page.

Example: If the page's first heading is 'Get the best XYZ services provided by XYZ Company', frequency of keyword 'XYZ' in the heading will be two. Frequency relates only to the exact matches of a keyword. Therefore, frequency of key phrase 'XYZ services' will be one, because as exact match, this keyword is used only once.

Search engines use frequency as a measure of keyword importance.

Search engines rate pages with more keywords as more relevant results, and score them higher. However, you should not use too many keywords, since most search engines will penalize you for this practice for being seen as an attempt to artificially inflate rankings.

Keyword weight (density)
Keyword weight is a measure of how often a keyword is found in a specific area of the Web page like a title, heading, anchor name, visible text, etc. Unlike keyword frequency, which is just a count, keyword weight is a ratio.

Keyword weight will depend on the type of keyword, that is if the keyword is a single word or phrase. If the keyword includes two or more words, for example, 'XYZ services', every word in the key phrase (i.e. both 'XYZ' and 'services') contributes to the weight ratio in the weight formula, and not as one keyword ('XYZ services').

Keyword weight is calculated as the number of words in the key phrase multiplied by frequency and divided by the total number of words (including the keyword).

Example: The title of a Web page is 'Get Best XYZ Services'. Keyword weight for 'XYZ services' is 2*1/4*100%=50%. If you reduce the number of words in the title by removing the word 'get', so the title becomes 'Best XYZ Services', then the keyword weight will be larger: 2*1/3*100%=67%. Finally, if you only keep 'XYZ Services' in the title, the keyword weight will become 100% -- 2*1/2*100%.

So, to increase the keyword weight, you should either add some more keywords or reduce the number of words in the page area. The proportion of the keywords to all words will become larger, so will the keyword weight.
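To make that arithmetic explicit, here is a small illustrative C sketch (my own example, not part of the original guide) that applies the formula exactly as stated above and reproduces the three title examples:

#include <stdio.h>

/* Keyword weight = (words in the key phrase * frequency) / (total words in the area) * 100% */
static double keyword_weight(int keyphrase_words, int frequency, int total_words)
{
    return 100.0 * keyphrase_words * frequency / total_words;
}

int main(void)
{
    /* Key phrase "XYZ Services" (2 words) appearing once in each title. */
    printf("Get Best XYZ Services: %.0f%%\n", keyword_weight(2, 1, 4));  /* 50%  */
    printf("Best XYZ Services    : %.0f%%\n", keyword_weight(2, 1, 3));  /* 67%  */
    printf("XYZ Services         : %.0f%%\n", keyword_weight(2, 1, 2));  /* 100% */
    return 0;
}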

Many search engines calculate keyword weight when they rank pages for a particular keyword. Normally, high keyword weight tells search engines that the keyword is extremely important in the text; however, a weight that is too high can make search engines suspect you of spamming, and they will penalize your website's rankings.

Keyword Prominence
Keyword Prominence is another measure of keyword importance that relates to the proximity of a keyword to the beginning of the analyzed page area. A keyword used at the beginning of the Title, a Heading, or the top of the visible text of the page is considered more important than other words. Prominence is a ratio that is calculated separately for each important page area such as a title, headings, visible text, anchor tags, etc.

HTML pages are written in a document-like fashion. The most important items of a document's visible text are placed at the top, and their importance is gradually reduced towards the bottom. This idea can be also applied to keyword prominence. Normally, the closer a keyword to the top of a page and to the beginning of a sentence, the higher its prominence is. However, search engines also check if the keyword is present in the middle and at the bottom of the page, so you should place some keywords there too.

The prominence formula takes the following factors into account:
 
1) Keyword positions in the area,
2) Number of words in the keyword, and
3) Total number of words in the area.

100% prominence is given to a keyword or keyphrase that appears at the beginning of the analyzed page area.
Example 1: Let's take the page title 'Daily horoscopes on your desktop' and analyze prominence of keyphrase 'daily horoscopes'. The title word order will be: 'Keyword1, keyword2, word3, word4, word5'. Prominence will be 100% here as the keyphrase is present at the beginning of the sentence.

The keyword/keyphrase in the middle of the analyzed area will have 50% prominence.

Example 2: The anchor name is 'Find here the daily horoscope for your sign'. The keyword prominence of the phrase 'daily horoscope' in this case will be 50% as the keyphrase is located in the middle of the sentence -- 'Word1, word2, word3, keyword4, keyword5, word6, word7, word8'.

As a keyword appears farther back in the area, its prominence will be counted from zero and it will depend on how close to the end it is. If the keyword appears at the end of the area, its prominence will be close to 0%. If the keyword appears at the beginning of the area and then is repeated in the middle or at the end, its prominence will be 100%, because the prominence of the first-used keyword prevails over the repeated keywords.

META Description
Syntax: < META name="Description" content="Web page description">

This is a Meta tag that provides a brief description of a Web page. It is important the description clearly describes the purpose of the page. The importance of the Description tag as an element of the ranking algorithm has decreased significantly over years, but there are still search engines that support this tag. They log descriptions of the indexed pages and often display them with the Title in their results.

The length of a displayed description varies per search engine. Therefore you should place the most important keywords at the beginning of the first sentence -- this will guarantee that both users and search engines will see the most important information about your site.

META Keywords
Syntax: < META name="Keywords" content="keyword1, keyword2, keyword3">

This is a Meta tag that lists words or phrases describing the contents of the Web page. This tag provides some additional text for crawler-based search engines. However, because of frequent attempts to abuse their systems, most search engines ignore this tag. Please note that none of the major crawler-based search engines except Inktomi provides support for the Keywords Meta tag.

Similar to the description tag, there is a limit in the number of captured characters in Keywords meta tag. Ensure you've chosen keywords that are relevant to the content of your site. Avoid repetitions as search engines can penalize your rankings. Move the most important keywords to the beginning to increase their prominence.

META Refresh 
Syntax: < META http-equiv="refresh" content="0;url=http://newURL.com/">

This HTML META tag also belongs in the Head tag of your HTML page.

The META Refresh tag is often used as a way to redirect the viewer to another Web page or refresh the content of the viewed page after a specified number of seconds. The META Refresh tag is also sometimes used as a doorway page optimized for a certain search engine, which is accessed first by users, who then are redirected to the main website. Some search engines discourage the use of this META tag, because it is an opportunity for webmasters to spam search engines with similar pages that all lead to the same page. In addition, this also clutters the search engines databases with irrelevant and multiple versions of the same data. Try to avoid doorways and redirects altogether in your Web/blog  building.

META Robots
Syntax: < META name="Robots" content="INDEX,FOLLOW">

The robots instructions are normally placed in a robots.txt file that is uploaded to the root directory of a domain. However, if a webmaster does not have access to /robots.txt, then instructions can be placed in the Robots META tag. This tag tells the search engine robots whether a page should be indexed and included in the search engine database and its links followed.

The content of the robots meta tag is a comma separated list that may contain the following commands:

  • ALL (equivalent to INDEX,FOLLOW) -- there are no restrictions on indexing the page or following its links;
  • NONE (equivalent to NOINDEX,NOFOLLOW) -- robots must ignore the page;
  • a combination of INDEX, FOLLOW, NOINDEX, and NOFOLLOW -- if you want a search engine robot to index a page but not follow its links, specify 'INDEX,NOFOLLOW'; if you want it to follow links without indexing the page, specify 'NOINDEX,FOLLOW'.

BODY area
The body tag identifies the beginning of the main section of your Web page, the main content area. The whole of the Web page is designed between the opening and closing body tags (...), including all images, links, text, headings, paragraphs, and forms.

The recommendations on how to use keywords in the BODY tag are the same as in other important areas. Your primary keywords should be placed at the top of your body tag (first paragraph) and as close to the beginning of a sentence as possible. Do not forget to use them again in each paragraph. Keywords should not be repeated one after another. For search engines that check keyword presence at the bottom of the body tag, you should use your most important keywords within the last paragraph before the closing body tag.

Visible text
The content of the Body tag includes both visible and invisible text. The term 'Visible text' refers to the portion that is displayed by the browser.

Extra emphasis by search engines is put on keywords when you underline them or make them bold, thus helping higher rankings for these keywords.

Keyword in the Heading
It is important that the keyword is present in the very first heading tag on the page, regardless of its type. If the keyword is also used as the first word, you will raise its prominence.

All headings

There are standard rules for the structure of HTML pages. They are written in a document-like fashion. In a document, you start with the title, then a major heading that usually describes the main purpose of the section. Subheadings highlight the key points of each subsection. Many search engines rank the words found in headings higher than the words found in the text of the document. Some search engines incorporate keywords by looking at all the heading tags on a page.

Links
Anchor tags on the page can also carry keyword-rich text as their anchor text. This text can be important to some search engines, and therefore also to the rankings of the destination pages. Create anchored links with keywords in them to link the pages of your website.
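For instance, an internal link whose anchor text carries the keyword (the file name is hypothetical) might be written as:

<!-- keyword-rich anchor text pointing at another page of the same site -->
<a href="/blue-widgets-guide.html">blue widgets buying guide</a>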

Text in links including ALTs
Images like buttons, banners, etc. may include Alt attributes as a text comment describing the graphic image. If the image is used as a hyperlink, the Alt attribute is interpreted as the link text by some search engines, and the destination page will get a significant boost in rankings for the keyword in the Alt attribute. Use graphic links with keyword-rich Alts to link the pages of your website.
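A sketch of such a graphic link, again with hypothetical file names, could look like this:

<!-- the Alt text acts as the link text for engines that cannot read the image -->
<a href="/blue-widgets-guide.html">
  <img src="/images/guide-button.gif" alt="blue widgets buying guide">
</a>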

ALT image attributes
Optimization of Alt image attributes gives you another opportunity to use keywords. It is advantageous if the page is designed with large graphics and very little text. Include the target keyword in at least the first three Alt attributes.
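Continuing the same hypothetical example, a graphics-heavy page might carry the target keyword in its first few Alt attributes:

<img src="/images/banner.jpg" alt="blue widgets online store">
<img src="/images/photo-1.jpg" alt="blue widgets in three sizes">
<img src="/images/photo-2.jpg" alt="handmade blue widgets">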

Comments
This tag lets webmasters write notes about the page code; these notes are only for their guidance and are invisible in the browser. Most search engines do not read the content of this tag, so Comments optimization will not be as helpful as Title optimization. The Comment tags should be populated with keywords only if the design of the Web page does not allow more efficient and search engine-friendly methods.
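If you do resort to it, a Comment is simply a note hidden from visitors; a sketch with a made-up keyword list would be:

<!-- blue widgets, blue widget reviews, buy blue widgets online -->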

Keyword in URL
Having a keyword in your domain name and/or in folder and file names increases your chances of gaining top positions for these keywords. If you aren't a brand-oriented business, it is recommended that you purchase a domain name that contains your keyword. If your keyphrase consists of more than one keyword, the best way to separate them in the URL is with a hyphen "-":
www.my-keyphrase-here.com
If it seems impossible to get such a domain name, or your site is already well established on a keyword-poor domain, try to compensate by using keywords in the folder and file names of your site's file system on the server.
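For illustration, with a made-up domain, the same keyphrase can be worked into the path instead:

www.example-store.com/blue-widgets/blue-widgets-reviews.html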

Link popularity
This is the number of links from other website pages to your page that search engines are aware of.
Each search engine only lists links embedded on the sites that are preindexed by that particular search engine. So, the presence of certain links in Google's index will not guarantee that Inktomi has also indexed the same sites. Therefore the number of links shown will be different from engine to engine.

In general, the more links that point to your page, the better your page will rank.

However, a large number of links is not the deciding factor that gets your site to the top of the results pages; the quality of those links matters more. If a link to your site is placed on a page of very little importance (that is, a page that is itself linked to by only a few other pages, or none at all), it will do little to improve your page's popularity. The links to your pages should also be subject-relevant, because theme-based search engines check how closely the content of the referring and referred pages matches: the closer they are, the more relevant your page looks for the searcher's query on your keyword. Avoid reciprocal linking with sites that have a low weight, a questionable reputation, or a subject matter different from yours. As part of their anti-spam measures, search engines can penalize your site's rankings if you ignore these pitfalls.

Theme
To deliver spam-free and relevant results, search engines evaluate a site as a whole to find the main theme covering all of its pages. Most major search engines have become theme-based.

Search engines extract and analyze words on all pages of a website to discover its theme. The more keywords found on your website that relate to the user's query, the more points you get for the theme. Therefore, if your Web business includes many products or services, try to find the theme that covers them all.

Open Directory Project listing (dmoz.org)
The ODP (also known as DMOZ) is the largest human-edited directory on the Web. Many major search engines use the ODP data to provide their directory results. This works because sites put forward for inclusion in the ODP are reviewed by real people who care about the quality of their directory.

It is still good for a website to be listed in the ODP. For new sites it is an excellent starting point, because Google regularly spiders the ODP to update its own directory based on the ODP listings; if your site is included, you get a link that Google considers important enough to start crawling your site from.

Beyond the weight a link from the ODP carries, it is even better if the site is listed in the most topic-specific category, making the link not only important but also content-relevant.

Yahoo! Directory listing
This is similar to the ODP -- Google relationship. The Yahoo! directory is regularly crawled by the Yahoo! robots.

A new site has a greater chance of being included quickly in the Yahoo! search engine if there is a link to it from the Yahoo! directory. If you get your site listed within the Yahoo! category closest to your site's theme, this particular link will help your site move up.

Search Engine Bot
A search engine bot is a type of web crawler that collects web documents to generate and maintain an index for a search engine.

Google PageRank
Google PageRank is the measure of a page’s importance in Google’s opinion. PR calculations are based on how many quality and relevant sites across the Web link to this page. The higher the PageRank of the referring page, the more weight this link has.
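The simplified formula from the original PageRank paper illustrates the idea: each page divides its own PageRank among the pages it links to, so a link from a strong page passes on more weight than a link from a weak one.

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1...Tn are the pages linking to page A, C(T) is the number of outgoing links on page T, and d is a damping factor, usually set to 0.85.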

Alexa Traffic Rank
Alexa Traffic Rank is a combined measure of page views and users (reach). This information is gathered with the help of the Alexa Toolbar used by millions of Web surfers. First, Alexa calculates the reach and the number of page views for all sites on the Web on a daily basis. Then these two quantities are averaged over time.

Backlinks Theme
To determine your site's rankings, search engines take into consideration the theme relevance of the sites linking to you. If the linking sites have something in common with yours (keywords in the BODY, titles, descriptions of the linked pages, etc.), your website has a better chance of gaining high positions for these keywords.

PR Statistics for linking sites
PR statistics for linking sites is statistical information about the PageRank of the pages linking to you. Statistics are presented in both numerical and percentage terms. The higher the PR of a referring site, the better the chance your own Web page has of getting a high PR.

Five Best Antivirus Applications



Computer viruses are increasingly sophisticated and pervasive, so you can't afford to run your computer without some sort of antivirus software installed. Check out these five popular options to protect your PC. Photo by Cushing Memorial Library and Archives, Texas A&M University Archives.
Earlier this week we asked you to share your favorite antivirus application; now we're back with the five most popular apps to help you secure your PC and keep it running virus free.
Note: For each entry, we reviewed the lowest cost option available from the company in question. Most companies offer premium packages of varying cost and with varying additional bells and whistles, save for the always-free Microsoft Security Essentials. For the purpose of this comparison, we stuck to the free/entry level options.

Avast! (Windows/Linux, Basic: Free, Premium $39.95 per year)


Having just celebrated its 21st birthday, Avast! is an old player in the antivirus market. Avast! has built up a solid following based on their philosophy of offering dependable and effective antivirus protection for free to home users. In addition to standard antivirus scanning, Avast! offers a variety of resident protection modules that cover different aspects of your computer like instant messaging, email, P2P applications, and more.

Microsoft Security Essentials (Windows, Free)


Microsoft Security Essentials is the newest addition to Microsoft's computer protection software. It replaces the Windows Live OneCare subscription service and Windows Defender by providing more comprehensive coverage than either of the two originally provided. Microsoft Security Essentials is free for all Windows users and provides protection against a variety of threats including viruses, malware, adware, and spyware.

Avira (Windows, Basic: Free, Premium: $30 per year)


Avira is another antivirus app available for free, although the free version of Avira doesn't offer as many bells and whistles as some of the other free offerings in today's Hive. Nonetheless, you still get dependable antivirus scanning and protection from malware and rootkits. In addition to the free antivirus software, Avira also offers a Linux Live CD recovery disc loaded with Avira and other free system recovery tools to help you get back on your feet if fighting the virus infection from within Windows just isn't cutting it.

ESET NOD32 Antivirus (Windows, $39.99 per year)

NOD32 has built a large base of users over the years by having a low number of false positives and a high rate of early detection, thanks to its community-sourced ThreatSense detection system. As a fun bit of trivia, American users may know the application as "NOD" and pronounce the acronym as an actual word, but the name is actually an acronym that hails from the Eastern European origins of the application. From the Wikipedia entry on NOD32:
The acronym NOD stands for Nemocnica na Okraji Disku ("Hospital at the end of the disk"), a pun related to the Czechoslovak medical drama series Nemocnica na okraji mesta (Hospital at the End of the City).

AVG (Windows, Basic: Free, Premium: $54.99 per year)


The free offering from AVG is one of the lightest, feature-wise, among the nominations in this Hive Five. That said, if you're looking for a basic antivirus application that will scan your computer, keep an eye out for spyware, and keep you from visiting malware- and virus-laden websites (via their LinkScanner protection), AVG is a solid free offering.
