Hi folks,
I'm not a proper sysadmin, so excuse the layman question.
Back in the day, the common wisdom was never to run file systems at more than 90% full, and that you'd start seeing performance issues above 80%. Approaching 100%, performance nosedives and the data-loss gremlins arrive.
Does the same still hold these days? Take an example Linux NAS with a 60T XFS filesystem shared out over Samba and NFS. At 95% used it still has 3T free. With a dozen people hitting it hard and another dozen hitting it lightly, would it be OK? Might it see performance issues? Data loss?
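For concreteness, the arithmetic I'm doing is roughly the following (a minimal Python sketch; /mnt/nas is a made-up mount point, substitute whatever the share is actually mounted as):

import shutil

# Percent used and absolute free space for the filesystem at a given path.
# shutil.disk_usage() wraps os.statvfs() on Linux, so "free" here is the
# space available to ordinary users, which is what df reports as "Avail".
total, used, free = shutil.disk_usage("/mnt/nas")
print(f"{100 * used / total:.1f}% used, {free / 1e12:.2f} TB free")

So "95% full" on a 60T volume still leaves about 3T of headroom in absolute terms, which is why I'm unsure whether the old percentage rules of thumb still apply.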
How about at 80%? At 99% are the klaxons sounding?
If it only occasionally creeps up to 95%, versus sitting there for a couple of weeks, does that make a difference to the likelihood and severity of problems?
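(If it helps, the way I'd tell "occasionally creeps" apart from "sits there for weeks" is just to log usage on a schedule and look at the history. A rough sketch, same assumptions as above, with a hypothetical log path:

import shutil, time, datetime

# Sample usage hourly and append to a log so sustained high usage is
# visible, rather than only catching whatever df shows at the moment.
while True:
    total, used, free = shutil.disk_usage("/mnt/nas")
    stamp = datetime.datetime.now().isoformat()
    with open("/var/log/nas_usage.log", "a") as f:
        f.write(f"{stamp} {100 * used / total:.1f}%\n")
    time.sleep(3600)

In practice this would live in cron or a monitoring system rather than a while loop, but you get the idea.)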
Apologies for the 20 questions. I'd like to offer some risk-assessment-type advice to management sorts, but recognise I may be woefully out of date.
Thanks!
Rangi