Memory Hierarchy
Memory can be organized into a hierarchy of five levels based upon intended use and speed. If what the processor needs isn't in one level, it looks in the next level down. The first three levels, registers, cache, and main memory, are volatile: once power is cut, they lose their data. The last two are non-volatile and are considered "permanent storage".
Registers
Typical access time: One clock cycle.
Registers are typically static RAM inside the processor, each holding one data word, which on modern processors is typically 32 or 64 bits. The most important register, found in all processors, is the program counter, which holds the address of the next instruction to be fetched. Most processors also have a status word register, whose flag bits are used for decision making (notably, MIPS processors don't have one), and an accumulator, which stores the result of an arithmetic operation. Complex instruction set computers usually have around a dozen or so registers, since their instructions are geared more toward operating on main memory directly. Reduced instruction set computers have considerably more, commonly 32 general-purpose registers.
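As a rough illustration, the sketch below models the registers just described (program counter, status word, accumulator, plus a RISC-style bank of general-purpose registers) as plain C fields. The names and sizes are illustrative only and don't correspond to any real instruction set.

    /* A minimal sketch (not any real ISA): the registers named above,
     * modeled as a toy CPU state in C. Field names are illustrative. */
    #include <stdint.h>
    #include <stdio.h>

    struct cpu_state {
        uint64_t pc;          /* program counter: address of the next instruction */
        uint64_t status;      /* status word: flag bits (zero, carry, overflow, ...) */
        uint64_t acc;         /* accumulator: result of the last arithmetic operation */
        uint64_t gpr[32];     /* general-purpose registers, RISC-style count */
    };

    int main(void) {
        struct cpu_state cpu = { .pc = 0x1000 };

        cpu.acc = 40 + 2;              /* pretend an ALU result lands in the accumulator */
        cpu.status = (cpu.acc == 0);   /* set a "zero" flag the way a status word might */
        cpu.pc += 4;                   /* advance to the next (fixed-width) instruction */

        printf("pc=0x%llx acc=%llu zero=%llu\n",
               (unsigned long long)cpu.pc,
               (unsigned long long)cpu.acc,
               (unsigned long long)cpu.status);
        return 0;
    }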
Cache
Typical access time: A few to tens of clock cycles.
Cache is also usually found within the processor, though occasionally it may be a separate chip. Divided into levels, cache holds frequently used chunks of data from main memory. The more often a piece of data is used, the more likely it is to stay in cache, so its effective access latency approaches cache speed. Single-core processors rarely have more than two levels of cache. Modern multi-core processors typically have three: two levels private to each core and a third level shared among all the cores.
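The effect of cache is easy to see from software. The sketch below sums the same matrix twice, once walking memory sequentially (cache-friendly) and once jumping across rows (cache-hostile). The matrix size and the exact timings are illustrative, but the second pass is typically several times slower on common hardware.

    /* A rough sketch of why cache matters: the same work done with good and
     * bad spatial locality. Size and timings are illustrative only. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 4096

    static double sum_row_major(const double *m) {   /* walks memory sequentially */
        double s = 0.0;
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                s += m[i * N + j];
        return s;
    }

    static double sum_col_major(const double *m) {   /* jumps N*8 bytes every step */
        double s = 0.0;
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                s += m[i * N + j];
        return s;
    }

    int main(void) {
        double *m = malloc((size_t)N * N * sizeof *m);
        if (!m) return 1;
        for (size_t i = 0; i < (size_t)N * N; i++) m[i] = 1.0;

        clock_t t0 = clock();
        double a = sum_row_major(m);
        clock_t t1 = clock();
        double b = sum_col_major(m);
        clock_t t2 = clock();

        printf("row-major: %.0f in %.3fs, col-major: %.0f in %.3fs\n",
               a, (double)(t1 - t0) / CLOCKS_PER_SEC,
               b, (double)(t2 - t1) / CLOCKS_PER_SEC);
        free(m);
        return 0;
    }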
Main Memory
Typical access time: Hundreds of clock cycles.
Most commonly called RAM, main memory is relatively fast and holds most of the data and instructions needed by currently running programs. It was a precious resource up until the mid-to-late 2000s. Today, memory is plentiful enough that modern operating systems use otherwise idle portions of it to cache the files of frequently used programs (file caching), so those programs appear to load much faster the next time they're opened.
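That file caching can be observed with a short program: read a file once cold, then again immediately, and the second pass is usually served from main memory rather than the disk. The sketch below is a rough illustration only; it assumes a C11 compiler, takes the file path from the command line, and the exact timings will vary from system to system.

    /* A small sketch of the OS file cache at work: read the same file twice
     * and compare wall-clock times. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static double now_s(void) {
        struct timespec ts;
        timespec_get(&ts, TIME_UTC);        /* C11 wall-clock timestamp */
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    static double read_all(const char *path) {
        char buf[1 << 16];
        double t0 = now_s();
        FILE *f = fopen(path, "rb");
        if (!f) { perror("fopen"); exit(1); }
        while (fread(buf, 1, sizeof buf, f) == sizeof buf)
            ;                               /* drain the whole file */
        fclose(f);
        return now_s() - t0;
    }

    int main(int argc, char **argv) {
        if (argc < 2) { fprintf(stderr, "usage: %s <file>\n", argv[0]); return 1; }
        printf("first read (cold):                 %.3fs\n", read_all(argv[1]));
        printf("second read (warm, from memory):   %.3fs\n", read_all(argv[1]));
        return 0;
    }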
Secondary Memory
Typical access time: Millions of clock cycles.
Secondary memory is where data is stored permanently, usually on a hard disk drive (HDD) or solid state drive (SSD). The unfortunate thing is the huge latency gap between secondary memory and main memory, and a great deal of work goes into narrowing it. Aside from the bandwidth difference (roughly 2 GB/s versus 25.6 GB/s for dual-channel DDR3-1600 RAM), latency remains the bigger issue (~100,000 ns versus ~10 ns).
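To put those latencies in terms of the clock cycles used throughout this section, the sketch below converts them, assuming a 3 GHz core; the clock speed and the ~10 ms HDD seek figure are assumptions for illustration, while the ~10 ns and ~100,000 ns figures come from the comparison above.

    /* Back-of-the-envelope conversion of latency into clock cycles. */
    #include <stdio.h>

    int main(void) {
        double clock_ghz = 3.0;              /* assumed core clock (cycles per ns) */
        double dram_ns   = 10.0;             /* ~10 ns figure from the text */
        double ssd_ns    = 100000.0;         /* ~100,000 ns figure from the text */
        double hdd_ns    = 10000000.0;       /* ~10 ms HDD seek, a common ballpark */

        printf("DRAM: ~%.0f cycles\n", dram_ns * clock_ghz);
        printf("SSD:  ~%.0f cycles\n", ssd_ns * clock_ghz);
        printf("HDD:  ~%.0f cycles (millions of clock cycles)\n", hdd_ns * clock_ghz);
        return 0;
    }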
Removable Memory
Typical access time: Tens of millions of clock cycles.
Data that's intended to move between machines resides on removable memory. Examples include floppy disks, CDs and DVDs, and USB flash drives. The biggest drawback, of course, is that they're horrendously slow. However, external hard drives can retain near-native speeds over eSATA or USB 3.0.
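As a rough comparison, the sketch below estimates how long a 4 GB file takes to transfer at the nominal signaling rate of each interface. Real-world throughput is noticeably lower than the nominal rate, and the file size and the 3 Gbit/s eSATA figure are assumptions for illustration.

    /* Rough transfer-time comparison at nominal link speeds. */
    #include <stdio.h>

    int main(void) {
        double file_gb = 4.0;                        /* example file size */
        struct { const char *name; double gbit_per_s; } links[] = {
            { "USB 2.0", 0.48 },                     /* 480 Mbit/s */
            { "USB 3.0", 5.0  },                     /* 5 Gbit/s   */
            { "eSATA",   3.0  },                     /* 3 Gbit/s (SATA II-era eSATA) */
        };

        for (int i = 0; i < 3; i++) {
            double seconds = file_gb * 8.0 / links[i].gbit_per_s;
            printf("%-8s ~%6.1f s for a %.0f GB file\n",
                   links[i].name, seconds, file_gb);
        }
        return 0;
    }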