TYOC038 Update - The people left with a broken service were not directly involved in the disk issue itself, but were most likely affected by it: the system stuttered during creation and didn't provision efficiently. It looks like a provisioning issue similar to the one on the other node earlier (I forget the name); these services never created properly. They date back from 04/11/2021 until now. We're going to set them back to pending, adjust the due date again to make up for the lost time, and add them to the pending queue. These should still be created first when the script runs again, but I want to be very clear that it's not a guarantee. That said, I don't see any reason why they shouldn't be, since it's the same script that selected them first the last time.
So if yours never provisioned correctly, today you will see it go back to pending with its due date moved later.
Tokyo Update
Then, we're going to schedule maintenance with Tokyo to add a new disk to TYOC038, as well as TYOC035. These nodes are both missing 2TB and are currently under 65% CPU usage, so after the maintenance, creations will run again.
There will be a total of 3.5TB of space for services: roughly 10GB for the 384MB plan and 50GB for the 2.5GB plan, for a rough average of 30GB per service, or 116 services. There are currently 72 services we know did not provision correctly, so those will fit, plus about 44 others.
TYOC033 has about 1.5TB of space, so up to another 50 here.
TYOC036 already received another disk, for another 1.75TB, or 58 more services.
Total room for up to 152 services after TYOC038 is taken care of. Probably closer to 100 in practice, due to CPU constraints and because the larger plans are left until the end thanks to the previously discussed issue with the script.
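The capacity math above can be sketched as follows (all figures come from this update; the 30GB average is the stated rough blend of the ~10GB and ~50GB per-plan footprints, and the grouping of TYOC038/TYOC035 into one 3.5TB pool is an assumption for illustration):

```python
# Rough headroom estimate for the Tokyo nodes after maintenance.
AVG_GB_PER_SERVICE = 30  # blended average of the 384MB (~10GB) and 2.5GB (~50GB) plans

free_space_gb = {
    "TYOC038+TYOC035": 3500,  # ~3.5TB combined after the new disks are added
    "TYOC033": 1500,          # ~1.5TB free
    "TYOC036": 1750,          # ~1.75TB from the disk already added
}

# How many services each pool can hold at the blended average.
capacity = {node: gb // AVG_GB_PER_SERVICE for node, gb in free_space_gb.items()}
# -> TYOC038+TYOC035: 116, TYOC033: 50, TYOC036: 58

known_failed = 72  # services that never provisioned correctly

# The failed services land on TYOC038/035 first; the leftover slots there
# plus the other two nodes give the "up to 152" headroom figure.
headroom = (capacity["TYOC038+TYOC035"] - known_failed
            + capacity["TYOC033"] + capacity["TYOC036"])
print(headroom)  # 152
```

This is only the disk-space ceiling; as noted above, CPU constraints push the realistic number closer to 100.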
The other nodes still need to cool off or may already be at capacity. That leaves 50-100 services at the end without an immediate home. We're going to go through another round of requested refunds and see if we can make it all work.
TYOC037 has 1.5TB of space, but it needs to calm down; we did just recently create a good amount on this one.
TYOC034 has about 1.2TB of space, but same story, it needs to calm down.
TYOC039 has 660GB of space and it's pretty calm but I don't want to put anyone else here.
TYOC040 has 690GB of space that can maybe be used but I'll have to monitor it.
(edit) Actually, a lot of this space issue can be resolved quickly. I'm going to go through everyone who has a ticket open about TYOC038, which has an active network status, and start cancelling/refunding them, since we request that tickets not be opened in these cases. An email was already sent out as well.
Can someone tell me what he means? The translation looks strange.