As I understand it, it is - but the fire brigade made them cut power to the whole site. We have punters in the place as well.
Imagine all the millions spent in ‘disaster scenario’ planning and then the fire brigade turn up and just cut all power and tell you no generator backup allowed....

I’m sure the phrase ‘unprecedented’ is going to be used a lot if you work there...
 
Assume nothing... assume it's gone, no access, no nothing. I would not want to be that DR manager - I hope he has an up-to-date CV.
 
Not the first time I've seen that happen. This is why I don't believe in Tier 3 and Tier 4 DCs - if you want diversity, put it 100 km away.

We all have backup network operations centres for a major disaster that were useless in dealing with COVID!
 
Exactly - very surprising if there was no proper long-distance external site mirror type scenario. This is one thing I was pleased about in my old life when the colleges in Leeds merged: many more physical locations to mirror servers to, so unless you take out a full city, the shuddering words of ‘system down’ were unlikely 😎🥳
 
Kilometre?...pfft miles is better.
 
oh but "yadda yadda" I don't need power in a tier 3/4 I have my own generator. then some asshats fly two planes into a building and the port authority stops anyone delivery any diesel for a while...
 
Cough...process...process...all well and good...then you find the holes in the processes.
 
Oh dear, sidetracked thread... techie anoraks discussing the joys of DR (which is not the same as business continuity), just sayin'.
 
If it's this place:


....it should worry EVERYONE. This seems to be their slogan:

Your Data: If You Have Nothing to Hide, You Have Nothing to Fear


They go on:

What Data We Collect


Every day, people leave a digital trail of electronic breadcrumbs as they go about their daily routine. They go to work using electronic fare cards; drive through intersections with traffic cameras; walk down the street past security cameras; surf the internet; pay for purchases with credit/debit cards; text or call their friends; and on and on.


There is no way to predict in advance which crucial piece of data will be the key to revealing a potential plot. The standard operating procedure for the Domestic Surveillance Directorate is to "collect all available information from all available sources all the time, every time, always".


Even if it's NOT this place (there seem to be many data centres in Utah), it is a solid lesson in NOT trusting your information solely to "the cloud" - what a joke of a name anyway.

I keep all of my stuff on cloned drives refreshed once a week. One is kept at the factory and one is taken home, so I can only ever lose one week of info, worst case. If I kept the info "in the clouds" I could lose the lot overnight with no chance of ever getting anything back. How ridiculous that people have been conned into believing this is actually a safe storage method.
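
For what it's worth, here is a minimal sketch of that weekly two-drive routine in Python. The paths, folder names and the idea of mirroring a directory tree (rather than cloning whole disks, as the post describes) are all assumptions for illustration, not anything the poster specified:

```python
#!/usr/bin/env python3
"""Weekly refresh of two offline copies of a working folder.

Illustrative sketch only: the post describes cloning whole drives, but
mirroring a directory tree gives the same "lose at most a week" bound
for the files it covers. All paths are hypothetical.
"""
import shutil
from pathlib import Path

SOURCE = Path("/data/factory_files")      # hypothetical working data
MIRRORS = [
    Path("/mnt/clone_factory"),           # drive that stays at the factory
    Path("/mnt/clone_home"),              # drive that goes home each week
]

def refresh_mirror(source: Path, mirror_root: Path) -> None:
    """Replace the drive's clone folder with a fresh copy of the source."""
    clone = mirror_root / "clone"
    if clone.exists():
        shutil.rmtree(clone)              # the old copy is at most a week stale
    shutil.copytree(source, clone)
    print(f"Refreshed {clone} from {source}")

if __name__ == "__main__":
    for mirror in MIRRORS:
        if mirror.exists():               # skip a drive that isn't plugged in
            refresh_mirror(SOURCE, mirror)
        else:
            print(f"Skipping {mirror}: not mounted")
```

Run something like this from a weekly cron job or scheduled task and the worst case stays at one week of lost work, as described above.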
 
LOL @Homepin

that website is a joke parody website - it's not the real thing.

I had some clown telling me yesterday that every bit of internet traffic in the UK goes to the US for "vetting". Absolute nonsense.
 
“Hello Everyone,

Now that we have a better understanding of what happened we would like to give everyone an update.

One of our old generators that has worked for years and was recently load tested had a mechanical failure and caught fire, resulting in power being cut to our core routers and the fire suppression system controlling the fire. Unfortunately, the fire department opted to cut power to the rest of the building as a precaution even though the power systems were independent. We are currently waiting for an emergency inspector to arrive to give the all-clear so we can bring most of the servers in Ogden back online. Some servers will have an extended outage as they may require rebuilds due to some water damage. Those builds have a high probability that data is intact.

We would like to thank you for your patience and know that we are doing everything we can to get everyone back online.

If you have a Server at our LA location and need help please email us at lawebnx@webnx.com

WebNX”


Erm, ok then..... 😳
 
Water damage? How can that be if it was just the genny? Surely it’s not that close to the servers!
Depends on the building. They likely put in fire breaks and water blankets, and damping down (basically soaking any area with fire damage, or adjoining the fire damage) begins as soon as the fire is confirmed out and an FI has inspected the seat/cause of the fire.
 
Not the first time I've seen that happen. This is why I don't believe in Tier 3 and Tier 4 DCs - if you want diversity, put it 100 km away.

We all have backup network operations centres for a major disaster that were useless in dealing with COVID!

And test it!

Many years ago, I did some consultancy for a VERY large bank, after having spent years working for their workflow supplier. I knew their 'redundant' workflow/imaging system couldn't work (lack of that functionality had been a customer complaint for years), but they wouldn't believe me - nothing I did could convince them, as the big external dev that put the system in and took millions for doing so told them it worked fine. In the end, I took the directors to the bunker which had the redundant systems and asked one to point to an optical disk - I then just ripped it out of the tower. Shocked/angry faces, but when I turned it over to show them it was blank, and then removed several more at random, I think I got my point across... (The IT Director lasted three weeks after that, and wasn't my biggest fan...)
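
That blank-disk check is easy to automate. Below is a hedged sketch of the same idea in Python: sample a few files from the primary store and confirm the 'redundant' mirror holds byte-identical copies. The paths and sample size are made-up placeholders, not anything from the thread:

```python
#!/usr/bin/env python3
"""Spot-check that a DR mirror actually contains the primary's data.

Illustrative sketch only: sample a handful of files from the primary
store and verify the mirror holds byte-identical copies - the software
equivalent of pulling a random disk and checking it isn't blank.
Paths and sample size are hypothetical.
"""
import hashlib
import random
from pathlib import Path

PRIMARY = Path("/data/primary")    # hypothetical live store
MIRROR = Path("/mnt/dr_mirror")    # hypothetical 'redundant' copy
SAMPLE_SIZE = 20

def sha256(path: Path) -> str:
    """Hash a file in chunks so large files don't need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def spot_check() -> bool:
    """Compare a random sample of primary files against the mirror."""
    files = [p for p in PRIMARY.rglob("*") if p.is_file()]
    healthy = True
    for path in random.sample(files, min(SAMPLE_SIZE, len(files))):
        twin = MIRROR / path.relative_to(PRIMARY)
        if not twin.is_file():
            print(f"MISSING on mirror: {twin}")
            healthy = False
        elif sha256(twin) != sha256(path):
            print(f"MISMATCH: {twin}")
            healthy = False
    return healthy

if __name__ == "__main__":
    print("Mirror spot check:", "passed" if spot_check() else "FAILED")
```

It proves nothing about the failover process itself, but like pulling a disk out of the tower, it catches the "redundant copy is actually empty" case before the directors have to find out the hard way.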
 
Depends on the building. They likely put in fire breaks and water blankets, and damping down (basically soaking any area with fire damage, or adjoining the fire damage) begins as soon as the fire is confirmed out and an FI has inspected the seat/cause of the fire.
I wouldn’t be happy with that kind of risk in the design - surely they have plenty of space out there to keep the gennies at a safe distance.
Must be close to the servers then.
 