Well, I did overclock my older system (a week after building it - that was over 6 years ago). By default it was 2 x 1.8GHz and I wanted it at 2 x 2.4GHz (at the time they sold that chip at 2.4GHz, but it was something like £400 vs £180). I did a lot of reading and it turned out I actually had the 2.4GHz CPU (looked up the serial code), but the multiplier had been limited back internally. So I upped the FSB to 240, put a memory divider on (so the memory ran at 200MHz x 2, DDR1) and even ran lower voltage than stock. It only ever ran slightly hotter than it did at stock 1.8GHz, thanks to the under-volting. That system is still running fine now at 2.4GHz - a 33% o/c improvement (on each core).
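For anyone curious about the arithmetic behind that kind of o/c: core clock is just FSB x multiplier, and the memory divider scales the FSB back down so the DDR1 stays near its rated 200MHz. A minimal sketch below - the 10x multiplier and 5:6 divider are my own illustrative assumptions, not figures from my board:

```python
# Rough sketch of FSB overclock arithmetic (multiplier and divider are assumed values).

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    """Core clock is simply the FSB/reference clock times the CPU multiplier."""
    return fsb_mhz * multiplier

def memory_clock_mhz(fsb_mhz: float, divider_ratio: float) -> float:
    """Memory clock = FSB scaled down by the divider, so the DDR stays in spec."""
    return fsb_mhz * divider_ratio

# With a 240MHz FSB and an assumed locked 10x multiplier -> 2400MHz (2 x 2.4GHz).
print(core_clock_mhz(240, 10))        # 2400.0

# An assumed 5:6 divider keeps the DDR1 at ~200MHz (400MHz effective)
# even though the FSB is running at 240MHz.
print(memory_clock_mhz(240, 5 / 6))   # 200.0
```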
However, with the new series of CPUs, is there any need to o/c now? I upgraded to a 2600K so I could overclock later (the K means an unlocked multiplier).. however it runs everything fine. Benchmarks look great at first - you see a few more marks after an o/c - but if you mostly play games on it, are those extra marks really going to make a lot of difference? So I've played it safe with it.. for now.
Modern CPUs do a degree of overclocking themselves. I found that if I manually set 3.5GHz, then all cores only ever run at 3.5GHz max.. however if I leave the chip to manage it, it can boost itself up to 3.9GHz depending on how many cores are in use (the more cores used, the lower the o/c).
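That per-core behaviour (higher boost with fewer active cores) is easy to watch for yourself. A minimal sketch, assuming a Linux box that exposes the cpufreq sysfs files (paths and availability vary by kernel/driver); it polls each core's current clock so you can see the boost drop as you load more cores:

```python
# Quick way to watch per-core boost clocks on Linux via the cpufreq sysfs interface.
import glob
import time

def core_freqs_mhz():
    """Return {core_name: current frequency in MHz} read from sysfs."""
    freqs = {}
    for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
        core = path.split("/")[5]                   # e.g. "cpu0"
        with open(path) as f:
            freqs[core] = int(f.read()) / 1000.0    # sysfs reports kHz
    return freqs

# Sample a few times: with one busy thread you should see a single core sitting
# at its top boost bin; load all cores and the per-core clocks settle lower.
for _ in range(5):
    print(core_freqs_mhz())
    time.sleep(1)
```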
However, I think the days are gone where we saw a 20-40% increase from overclocking (unless you go extreme) - most of the manufacturers have already jumped on the bandwagon.. even gfx cards (such as nVidia's Kepler series) overclock themselves now. It's much easier to buy a gfx card that's already o/c'd - and that's where the biggest gains from o/c are felt in gaming.