Will consume ~100-150 watts (assume 125W) and yield ~20 kH/s. Those figures could be off by as much as 200%, since this CPU isn't listed on the Litecoin Mining Hardware Comparison wiki page and I'm not familiar with Intel CPUs. Assume you pay $0.10/kWh.
30 days @ 20 kH/s @ current difficulty will net you ~$1.68 worth of LTC. Electricity: (30 × 24 × 125) / 1000 = 90 kWh consumed; 90 × $0.10 = $9.00 per thirty days. $1.68 < $9.00.
My figures are probably way off, but even if you didn't pay for electricity, the wear on your CPU probably isn't worth it.
Edit: TDP of the i5 750 is 95W. However, on the mining hardware comparison page, CPUs appear to draw significantly more than their TDP while mining, so my power figure is probably roughly accurate (likely closer to 100W than 150W). Benchmarks show the 2500K is ~20% faster than the 750, which may or may not carry over to hashing. According to the wiki page, a 2500K hashes at ~31 kH/s, so estimate 31 × 0.8 = 24.8 kH/s for the 750.
Soo... optimistically, 30 days @ 24.8 kH/s @ current difficulty = ~$2.09 in revenue. Electricity: (30 × 24 × 95) / 1000 = 68.4 kWh; 68.4 × $0.10 = $6.84 in expenses. Still a loss.
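For anyone who wants to plug in their own numbers, here's a quick sketch of the arithmetic above. The revenue rate (~$0.084 per kH/s per 30 days) is just back-derived from the $1.68 @ 20 kH/s figure at current difficulty, so it'll drift as difficulty and price change; the function name and everything else here is just illustrative.

```python
# Rough LTC CPU-mining profitability estimate.
# Assumption: revenue scales linearly with hash rate at fixed difficulty,
# using ~$0.084/kH/s per 30 days (derived from $1.68 @ 20 kH/s above).

def mining_profit(hash_khs, watts, usd_per_kwh=0.10, days=30,
                  revenue_per_khs_30d=1.68 / 20):
    """Return (revenue, electricity_cost, net) in USD over `days` days."""
    revenue = hash_khs * revenue_per_khs_30d * (days / 30)
    kwh = watts * 24 * days / 1000          # total energy consumed
    cost = kwh * usd_per_kwh
    return revenue, cost, revenue - cost

# Pessimistic: 20 kH/s at 125 W -> ~$1.68 revenue vs $9.00 electricity
print(mining_profit(20, 125))
# Optimistic: 24.8 kH/s at 95 W -> ~$2.08 revenue vs $6.84 electricity
print(mining_profit(24.8, 95))
```

Either way the net is negative, which is the whole point: you'd lose money before even counting hardware wear.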