In most computer systems, a byte is the unit used to represent a character such as a letter, number, or typographic symbol. Each byte holds a string of eight bits, and bytes are grouped into larger units, kilobytes, megabytes, and so on, for practical purposes. A bit has a single binary value, either 0 or 1. In other words, a bit is either on (1) or off (0), and it is the combination of eight bits, each on or off, that represents a letter, number, or symbol to a computer.
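
If you'd like to see that for yourself, here is a quick sketch in Python (any language would do; nothing in this article requires it) that prints the eight bits behind a single letter:

```python
# A minimal sketch: one character stored as a byte of eight bits.
letter = "A"
byte_value = ord(letter)           # the character's numeric code (65 for "A")
bits = format(byte_value, "08b")   # the same value written as eight bits

print(f"{letter!r} is stored as the byte value {byte_value}, bits {bits}")
# 'A' is stored as the byte value 65, bits 01000001
```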

A group of four bits, or half a byte, is sometimes called a nibble or nybble. This unit is most often used in the context of hexadecimal number representation, since one hexadecimal digit stands for exactly one nibble. If you understood that last sentence, you belong with the folks in ponytails and Einstein tee shirts working in a basement lab at IBM or another computer company.
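
For the rest of us, here is a small Python sketch (again, just an illustration, not anything you need to know) showing how one byte splits into two nibbles, each written as a single hex digit:

```python
# A minimal sketch: splitting a byte into its two nibbles.
byte_value = 0b01000001            # the byte for "A" from the example above
high_nibble = byte_value >> 4      # the upper four bits
low_nibble = byte_value & 0x0F     # the lower four bits

print(format(byte_value, "02X"))   # 41 -- one hex digit per nibble
print(format(high_nibble, "X"), format(low_nibble, "X"))   # 4 1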

In telecommunication, the bit rate is the number of bits transmitted in a given time period, usually one second. So if your home network says it's running at 150 Mb (megabits) per second, that means it's transmitting 150 million bits per second, or nearly nineteen million letters or numbers per second. Not too shabby if you were around agonizing over a dial-up connection back in the day.
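
The arithmetic is easy to check in a couple of lines of Python, assuming "150 megabits" means 150 million bits and one character takes one byte (eight bits):

```python
# A minimal sketch of the arithmetic above.
bits_per_second = 150_000_000            # 150 megabits per second
bytes_per_second = bits_per_second / 8   # eight bits per byte (character)

print(f"{bytes_per_second:,.0f} characters (bytes) per second")
# 18,750,000 characters (bytes) per second -- "nearly nineteen million"
```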

More Info: en.wikipedia.org