There are still concerns about scalability. Up to about 20 TB you are probably fine. Beyond that, you really need to look at how the product is architected, how it manages performance and scalability, and even how it handles data destruction on the back end. How does the product remove expired data from the index? How is the index rebuilt over time? How do you add more capacity? It becomes a much more complex proposition at the enterprise level.
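To make the expired-data question concrete, here is a minimal sketch of the idea behind it: a hypothetical, reference-counted deduplication index (not any vendor's actual implementation). Each unique chunk is stored once; when the last file referencing a chunk expires, the chunk is purged from the index and its space reclaimed. At enterprise scale, doing this purge without rebuilding or locking the whole index is exactly where products differ.

```python
import hashlib

# Illustrative sketch only: a toy deduplication index that reference-counts
# chunks so expired data can be removed from the back end.

class DedupIndex:
    def __init__(self):
        self.store = {}     # chunk hash -> chunk bytes
        self.refcount = {}  # chunk hash -> number of files referencing it

    def add_file(self, chunks):
        """Ingest a file as a list of chunks, deduplicating repeats."""
        hashes = []
        for chunk in chunks:
            h = hashlib.sha256(chunk).hexdigest()
            if h not in self.store:
                self.store[h] = chunk          # new unique chunk
            self.refcount[h] = self.refcount.get(h, 0) + 1
            hashes.append(h)
        return hashes

    def expire_file(self, hashes):
        """Expire a file; chunks with no remaining references are purged."""
        for h in hashes:
            self.refcount[h] -= 1
            if self.refcount[h] == 0:
                del self.refcount[h]
                del self.store[h]              # reclaim space on the back end

idx = DedupIndex()
f1 = idx.add_file([b"alpha", b"beta"])
f2 = idx.add_file([b"beta", b"gamma"])   # "beta" is deduplicated
print(len(idx.store))                    # 3 unique chunks stored
idx.expire_file(f1)
print(len(idx.store))                    # "alpha" purged; "beta" kept via f2
```

The toy version purges synchronously; real products typically defer this to a background garbage-collection pass so that expiring data does not stall ingest, which is one reason the question is worth asking a vendor directly.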
So, these issues are still there, and companies need to be aware of them. On the plus side, this technology is hot right now, and vendors are throwing resources at it to bring their products up to snuff. They know that companies want this, and they know it's a huge problem. Expect a lot of changes in the next 12 months.
Check out the entire Data Deduplication FAQ.