Message boards : Number crunching : Server Thread
Joined: 12 Aug 17 · Posts: 21 · Credit: 58,957,280 · RAC: 0
about time to hire a nanny for this kindergarten!
Joined: 30 Oct 16 · Posts: 183 · Credit: 18,395,933 · RAC: 0
"about time to hire a nanny for this kindergarten!"
You missed the capital A. I guess you failed kindergarten. I would have been spanked for writing such poor English.
Joined: 18 Jul 17 · Posts: 138 · Credit: 1,379,173,617 · RAC: 0
Anyone have an idea of how often the "top computers" update runs? Thank you.
A proud member of the OFA (Old Farts Assoc.)
Joined: 12 Aug 17 · Posts: 21 · Credit: 58,957,280 · RAC: 0
"Zipping the files together would reduce the network overhead, and wouldn't cost a lot of spare cash :). Could that be done with reasonable ease?"
Yes - implemented since BOINC 5.8: https://boinc.berkeley.edu/trac/wiki/JobTemplates
<gzip_when_done/> - "use this to compress the output file before uploading, see FileCompression"
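A minimal sketch of where that tag would sit, assuming a standard result (output) template as described on the JobTemplates page linked above; the size limit and open_name are placeholders, not values from this project:

<file_info>
    <name><OUTFILE_0/></name>
    <generated_locally/>
    <upload_when_present/>
    <max_nbytes>10000000</max_nbytes>
    <url><UPLOAD_URL/></url>
    <gzip_when_done/>    <!-- client compresses the finished file before upload -->
</file_info>
<result>
    <file_ref>
        <file_name><OUTFILE_0/></file_name>
        <open_name>out.dat</open_name>
    </file_ref>
</result>

With this in place the client gzips the output file after the task finishes and uploads the compressed copy, so the science application itself should not need changes.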
Joined: 4 Feb 15 · Posts: 847 · Credit: 144,180,465 · RAC: 0
And our files are already zipped (using <gzip_when_done/>). But this works for single files, not for a whole result consisting of 6 files, as I already described a few posts back...
Krzysztof 'krzyszp' Piszczek
Member of Radioactive@Home team
My Patreon profile
Universe@Home on YT
Joined: 12 Aug 17 · Posts: 21 · Credit: 58,957,280 · RAC: 0
To zip multiple files in a "mix & match" fashion, you can use the boinc_filelist function provided. Basically, it's crude pattern matching of files in a directory, but it has been useful for us on the CPDN project. Just create a ZipFileList instance, and then pass this into boinc_filelist as follows:

bool boinc_filelist(
    const std::string directory,
    const std::string pattern,
    ZipFileList* pList,
    const unsigned char ucSort = SORT_NAME | SORT_DESCENDING,
    const bool bClear = true
);

// requires <cstdio>, <cstring>, <zlib.h> and the BOINC API headers
// (boinc_api.h / filesys.h) for boinc_fopen() and boinc_delete_file()
int do_gzip(const char* strGZ, const char* strInput) {
    // take an input file (strInput) and turn it into a compressed file (strGZ)
    // get rid of the input file after
    FILE* fIn = boinc_fopen(strInput, "rb");
    if (!fIn) return 1;                    // error opening input
    gzFile fOut = gzopen(strGZ, "wb");
    if (!fOut) { fclose(fIn); return 1; }  // error opening output
    fseek(fIn, 0, SEEK_SET);               // go to the top of the files
    gzseek(fOut, 0, SEEK_SET);
    unsigned char buf[1024];
    long lRead = 0, lWrite = 0;
    while (!feof(fIn)) {
        // read 1KB at a time until end of file
        memset(buf, 0x00, 1024);
        lRead = (long) fread(buf, 1, 1024, fIn);
        lWrite = (long) gzwrite(fOut, buf, lRead);
        if (lRead != lWrite) break;
    }
    gzclose(fOut);
    fclose(fIn);
    if (lRead != lWrite) return 1;         // error -- read bytes != written bytes
    // if we made it here, it compressed OK; can erase strInput and leave
    boinc_delete_file(strInput);
    return 0;
}
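To make the "mix & match" part concrete, here is a rough sketch (not from the post above) of how a task's several output files might be gathered with boinc_filelist and packed into one archive using boinc_zip from the BOINC zip library; the pattern and archive name are made-up placeholders:

// sketch only: collect output files matching a pattern and zip them into one upload file
// assumes boinc_zip.h from the BOINC zip library (ZipFileList, boinc_filelist, boinc_zip, ZIP_IT)
#include <string>
#include "boinc_zip.h"

int pack_results() {
    ZipFileList files;                                    // list of matching file names
    // hypothetical pattern: every file in the slot directory whose name contains ".result"
    if (!boinc_filelist("./", ".result", &files)) return 1;
    if (files.empty()) return 1;                          // nothing to pack
    // zip all of them into a single archive (name is a placeholder)
    return boinc_zip(ZIP_IT, std::string("results.zip"), &files);
}

The upside of one archive per result is a single upload and a single compressed stream instead of six separate transfers, which is the trade-off debated in the following posts.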
Joined: 25 Sep 15 · Posts: 23 · Credit: 6,587,067 · RAC: 0
Weather forecast helping the universe!! (I'm standing by, ready for anyone who takes that literally ;) )
Joined: 18 Jul 17 · Posts: 138 · Credit: 1,379,173,617 · RAC: 0
Hi, I am getting random tasks running 6-8+ hours. How long before these will "time out"?
Tom M
A proud member of the OFA (Old Farts Assoc.)
Joined: 15 Aug 20 · Posts: 38 · Credit: 8,554,094,169 · RAC: 0
But how much processing overhead will this put on the server, having to unzip ~200,000 results/day? It seems like it would just trade one bottleneck for another.
Joined: 12 Aug 17 · Posts: 21 · Credit: 58,957,280 · RAC: 0
Next to nothing compared to the havoc we had.
Joined: 15 Aug 20 · Posts: 38 · Credit: 8,554,094,169 · RAC: 0
"Next to nothing compared to the havoc we had."
Yeah, but you only have to do it for 100-200 results per day, because CPDN has much longer tasks and less work available. Universe is handling 1000x that amount of results daily - they're getting back 2-3 results per SECOND. The overhead will be massive.
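For scale, and as a rough illustration only (not code from the thread): the server-side cost being argued about is essentially the mirror of the do_gzip() posted earlier - a streaming gunzip per uploaded file. A sketch, with an arbitrary function name and buffer size:

// sketch only: decompress a gzip file (strGZ) into a plain file (strOut);
// roughly the inverse of do_gzip() above, using plain fopen since this
// would run server-side rather than inside the BOINC client API
#include <cstdio>
#include <zlib.h>

int do_gunzip(const char* strOut, const char* strGZ) {
    gzFile fIn = gzopen(strGZ, "rb");
    if (!fIn) return 1;                        // error opening compressed input
    FILE* fOut = fopen(strOut, "wb");
    if (!fOut) { gzclose(fIn); return 1; }     // error opening output
    unsigned char buf[1024];
    int nRead;
    while ((nRead = gzread(fIn, buf, sizeof(buf))) > 0) {
        if ((int) fwrite(buf, 1, nRead, fOut) != nRead) {  // short write
            gzclose(fIn); fclose(fOut); return 1;
        }
    }
    gzclose(fIn);
    fclose(fOut);
    return (nRead < 0) ? 1 : 0;                // negative gzread result = gzip error
}

At the volumes quoted above, that loop would run about 200,000 / 86,400 ≈ 2.3 times per second on average, versus once every 7-14 minutes for CPDN's 100-200 results per day.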
Joined: 24 Mar 22 · Posts: 17 · Credit: 137,338,000 · RAC: 0
Scale back and redeploy to other projects starting with those most contributing to the backlog?
Joined: 30 Oct 16 · Posts: 183 · Credit: 18,395,933 · RAC: 0
"Scale back and redeploy to other projects starting with those most contributing to the backlog?"
What?!
Joined: 18 Jul 17 · Posts: 138 · Credit: 1,379,173,617 · RAC: 0
This is a server question for a non-U@H project. I know "several" people here also participate in the Einstein@Home project for their GPUs and over here for the CPU tasks. Is anyone else getting a Proxy 502 from the E@H message boards website?
Tom M
A proud member of the OFA (Old Farts Assoc.)
Joined: 30 Oct 16 · Posts: 183 · Credit: 18,395,933 · RAC: 0
"This is a server question for a non-U@H project."
Yes. Subscribe to this thread; lots of stuff is posted there when there are problems with any project. https://boinc.berkeley.edu/forum_thread.php?id=10279
Joined: 23 Apr 22 · Posts: 167 · Credit: 69,772,000 · RAC: 0
We seem to be back. Uploads are going through, slowly, after a few retries. Scheduler requests are taking a while to get a response, but work is being allocated - though it can take a few requests to get a response that isn't an error or just a timeout. However, actually being able to download allocated tasks is near impossible without lots of "Retry pending transfers" action. So what happened???
Grant
Darwin NT
Joined: 18 Jul 17 · Posts: 138 · Credit: 1,379,173,617 · RAC: 0
I propose we had a "Labor (Day)" strike and now we are being hit by a "work slowdown" :) While I don't know what happened, it seems like we are currently seeing the classic symptoms of a server overloaded by uploads.
Tom M
A proud member of the OFA (Old Farts Assoc.)
Joined: 18 Jul 17 · Posts: 138 · Credit: 1,379,173,617 · RAC: 0
Uploads finished for me. Downloads started.
A proud member of the OFA (Old Farts Assoc.)
Joined: 4 Feb 15 · Posts: 847 · Credit: 144,180,465 · RAC: 0
There was a power cut in the server room last night. Everything should be OK now.
Krzysztof 'krzyszp' Piszczek
Member of Radioactive@Home team
My Patreon profile
Universe@Home on YT