What is a Gigahertz?

By dwirch | 2008-01-14

One gigahertz (GHz) is equal to 1,000 megahertz (MHz), or 1,000,000,000 hertz (Hz). It is commonly used to measure computer processing speeds. For many years, computer CPU speeds were measured in megahertz, but after personal computers eclipsed the 1,000 MHz mark around the year 2000, gigahertz became the standard measurement unit. After all, it is easier to say “2.4 gigahertz” than “2,400 megahertz.”
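As a quick illustration of the conversion, here is a small Python sketch; the constants come straight from the definitions above, and the 2,400 MHz figure is just the example used in this article.

    # Convert a clock frequency given in megahertz to gigahertz and hertz.
    MHZ_PER_GHZ = 1_000          # 1 GHz = 1,000 MHz
    HZ_PER_GHZ = 1_000_000_000   # 1 GHz = 1,000,000,000 Hz

    def mhz_to_ghz(mhz: float) -> float:
        """Return the frequency in gigahertz for a value given in megahertz."""
        return mhz / MHZ_PER_GHZ

    cpu_mhz = 2_400  # the article's example: 2,400 MHz
    print(f"{cpu_mhz} MHz = {mhz_to_ghz(cpu_mhz)} GHz")   # 2400 MHz = 2.4 GHz
    print(f"{mhz_to_ghz(cpu_mhz) * HZ_PER_GHZ:,.0f} Hz")  # 2,400,000,000 Hz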

While gigahertz is most commonly used to measure processor speed, it can also measure the speed of other parts of the computer, such as the RAM and backside cache. The speed of these components, along with other parts of the computer, also impacts the computer’s overall performance. Therefore, when comparing computers, remember that the number of gigahertz is not the only thing that matters.
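If you want to check your own processor’s clock speed, one option (an assumption here, not something the article prescribes) is the third-party psutil library, which must be installed separately with pip install psutil and reports CPU frequency in megahertz on supported platforms.

    import psutil  # third-party package: pip install psutil

    freq = psutil.cpu_freq()  # values are reported in MHz; may be None on some systems
    if freq is not None:
        print(f"Current: {freq.current / 1000:.2f} GHz")
        print(f"Maximum: {freq.max / 1000:.2f} GHz")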

Author: dwirch

Derek Wirch is a seasoned IT professional with an impressive career dating back to 1986. He brings a wealth of knowledge and hands-on experience that is invaluable to those embarking on their journey in the tech industry.
