Message boards :
News :
New ULX's
Author | Message |
---|---|
Send message Joined: 24 May 15 Posts: 1 Credit: 93,494,957 RAC: 0 |
Please stop this crap! Thx Ritschie |
Send message Joined: 28 Feb 15 Posts: 23 Credit: 42,229,680 RAC: 0 |
Better to stop ULX 0.12 and send out only a small number tomorrow for testing.

<core_client_version>7.14.2</core_client_version>
<![CDATA[
<stderr_txt>
20:46:12 (4040): called boinc_finish(0)
</stderr_txt>
<message>
upload failure: <file_xfer_error>
  <file_name>universe_ulx_502_12608_20000_1-999999_650000_2_r524705463_1</file_name>
  <error_code>-131 (file size too big)</error_code>
</file_xfer_error>
</message>
]]>

WU was downloaded 19 Mar 2020, 17:24:15 UTC for Win 10 and Ryzen 3600. |
Send message Joined: 25 Jan 20 Posts: 4 Credit: 1,308,667 RAC: 0 |
Thu 19 Mar 2020 08:54:57 PM CET | Universe@Home | Output file universe_ulx_502_101025_20000_1-999999_260000_0_r1757261875_1 for task universe_ulx_502_101025_20000_1-999999_260000_0 exceeds size limit.
Thu 19 Mar 2020 08:54:57 PM CET | Universe@Home | File size: 1391041251.000000 bytes. Limit: 500000000.000000 bytes
Thu 19 Mar 2020 08:55:11 PM CET | Universe@Home | Output file universe_ulx_502_101023_20000_1-999999_260000_1_r165725895_1 for task universe_ulx_502_101023_20000_1-999999_260000_1 exceeds size limit.
Thu 19 Mar 2020 08:55:11 PM CET | Universe@Home | File size: 1391041251.000000 bytes. Limit: 500000000.000000 bytes |
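The logs above show results being rejected because they passed the server's 500,000,000-byte upload limit only after the work was finished. A minimal sketch of checking a result file against that limit yourself (the file name here is a stand-in created for the demo; point it at a real result file in your BOINC data directory instead):

```shell
#!/bin/sh
# Compare a result file's size against the server upload limit seen in the log
# (500,000,000 bytes). demo_result.dat is a hypothetical stand-in file.
LIMIT=500000000
FILE=demo_result.dat
head -c 1000 /dev/zero > "$FILE"   # create a 1000-byte stand-in result file
SIZE=$(stat -c %s "$FILE")
if [ "$SIZE" -gt "$LIMIT" ]; then
    echo "$FILE: $SIZE bytes exceeds limit of $LIMIT"
else
    echo "$FILE: $SIZE bytes is within limit"
fi
rm -f "$FILE"
```

This only tells you in advance what the server will decide anyway; the real fix has to come from smaller batches on the project side.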
Send message Joined: 2 Mar 20 Posts: 4 Credit: 50,400,833 RAC: 0 |
What is going on with uploads? All my completed tasks since this afternoon are stuck in the uploading state, and those that are actively uploading show 0.00 kbps. |
Send message Joined: 4 Feb 15 Posts: 847 Credit: 144,180,465 RAC: 0 |
Too many simultaneous uploads of huge ULX result files. I will generate smaller batches for this app, as its results are quite large... Also, even though we need more ULX's done, I will also generate BHspins to make uploading/downloading tasks easier for everybody. Krzysztof 'krzyszp' Piszczek Member of Radioactive@Home team My Patreon profile Universe@Home on YT |
Send message Joined: 18 Jul 17 Posts: 2 Credit: 457,375,786 RAC: 0 |
"Too many simultaneous uploads of huge ULX result files."

Thank you, Krzysztof. Just so you know, I am still seeing some errors with the new batch of ULX's, though at a much lower percentage now. Like this one:

Application: Universe ULX 0.12
Name: universe_ulx_502_9297_20000_1-999999_950000
State: Computation error
Received: Thu 19 Mar 2020 10:40:07 AM MDT
Report deadline: Thu 02 Apr 2020 10:40:06 AM MDT
Estimated computation size: 807 GFLOPs
CPU time: 01:36:19
Elapsed time: 01:37:41
Executable: universe-ULX_12_x86_64-pc-linux-gnu

It's still on the client, caught behind some other tasks that won't be uploaded for a while. |
Send message Joined: 22 Mar 18 Posts: 29 Credit: 24,402,488 RAC: 0 |
"Some of these ULX WUs are taking 9 hours. The scoring should be proportional to the CPU time and not just a fixed number."

I fully agree. Fast-running tasks should earn few credits and long-running ones more; that is one way of calculating it. Now, if some WU needs 9 hours on your monster computers, perhaps something is wrong! On my very little i7-2600K it takes about 4 hours. But I have bought a reserve of Belgian waffles to feed my hosts (only a very few people here will understand that). |
Send message Joined: 18 Jul 17 Posts: 2 Credit: 457,375,786 RAC: 0 |
Perhaps there are some minimum requirements for running these ULX tasks? I do run relatively small 240-250GB SSD's, generally with about 160GB free or so. This seems like it should be plenty, though some of the hosts have high core count. |
Send message Joined: 2 Jun 16 Posts: 169 Credit: 317,253,046 RAC: 0 |
"Too many simultaneous uploads of huge ULX result files."

BHspins tasks actually work too. |
Send message Joined: 30 Jun 16 Posts: 42 Credit: 309,815,029 RAC: 0 |
I switched over to the BHspin wu's. 71 invalid and 117 errors :( Sorry but what a waste of electricity. |
Send message Joined: 30 Jun 16 Posts: 42 Credit: 309,815,029 RAC: 0 |
Wow...... 5.5hrs work for 333 credits. :/ |
Send message Joined: 25 Jan 20 Posts: 4 Credit: 1,308,667 RAC: 0 |
Fri 20 Mar 2020 12:28:46 PM CET | Universe@Home | Aborting task universe_ulx_504_6827_20000_1-999999_310000_2: exceeded disk limit: 1510.36MB > 1430.51MB

This again. I'm going to abort every ULX task I receive from now on. There's no point in crunching a task only to get an "exceeded disk limit" error at the end. |
Send message Joined: 20 Feb 15 Posts: 32 Credit: 3,502,459 RAC: 0 |
I received one task from the "new batch" after you cancelled all the "bad" ones on the server, but it also failed and has been stuck trying to upload for about 6 hours now. Might as well cancel all the ULX tasks until you can reduce the output size. |
Send message Joined: 2 Jun 16 Posts: 169 Credit: 317,253,046 RAC: 0 |
"Fri 20 Mar 2020 12:28:46 PM CET | Universe@Home | Aborting task universe_ulx_504_6827_20000_1-999999_310000_2: exceeded disk limit: 1510.36MB > 1430.51MB"

Just unselect ULX in your project preferences instead of receiving the tasks only to abort them. The server is having enough trouble with all the connections. |
Send message Joined: 14 Jul 19 Posts: 4 Credit: 4,758,533 RAC: 0 |
I've just installed CentOS 7.7.1908 (x64). The WUs fail (BHspin2_19_x86_64-pc-linux-gnu: error while loading shared libraries: libmvec.so.1: cannot open shared object file: No such file or directory). Is the failure linked to your post (the application being compiled against a 4.9 kernel on Debian 9.1)?

ldd BHspin2_19_x86_64-pc-linux-gnu
./BHspin2_19_x86_64-pc-linux-gnu: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by ./BHspin2_19_x86_64-pc-linux-gnu)
./BHspin2_19_x86_64-pc-linux-gnu: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by ./BHspin2_19_x86_64-pc-linux-gnu)
./BHspin2_19_x86_64-pc-linux-gnu: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by ./BHspin2_19_x86_64-pc-linux-gnu)
    linux-vdso.so.1 => (0x00007fff895f4000)
    libm.so.6 => /lib64/libm.so.6 (0x00007f3461a6c000)
    libmvec.so.1 => not found
    libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f3461765000)
    libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f346154f000)
    libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f3461333000)
    libc.so.6 => /lib64/libc.so.6 (0x00007f3460f65000)
    /lib64/ld-linux-x86-64.so.2 (0x00007f346204a000)

Thanks |
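The `ldd` dump above is the standard way to spot this class of failure: any dependency printed as "not found" (here `libmvec.so.1`, which CentOS 7's older glibc does not ship) will make the loader refuse to start the binary. A minimal sketch of automating that check; `/bin/sh` stands in for the app binary here, so run it against `BHspin2_19_x86_64-pc-linux-gnu` in your BOINC projects directory instead:

```shell
#!/bin/sh
# Count unresolved shared-library dependencies of a binary using ldd.
# BIN defaults to /bin/sh purely so the sketch is self-contained.
BIN=${1:-/bin/sh}
MISSING=$(ldd "$BIN" | grep -c 'not found')
echo "$BIN: $MISSING unresolved shared libraries"
```

A nonzero count means the binary cannot run on that host until the missing libraries are provided.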
Send message Joined: 22 Mar 18 Posts: 29 Credit: 24,402,488 RAC: 0 |
Hello, if I remember correctly, your problem has already been reported. To solve it, you need to download the missing libraries manually; judging by your message, the project is not correctly initialized on your system. Go to the project's downloads and get the libraries. Not very convenient, I agree. Good luck and take care. Each time I see the daily report from your country, my eyes grow troubled. Our teams are rivals here, but all together we need to support your country. Belgium has a very large Italian community; Italians helped my country in the past by working here, and if you didn't know, our previous prime minister came from Italy. I repeat: take care. Best regards. |
Send message Joined: 15 Oct 17 Posts: 11 Credit: 4,735,011 RAC: 0 |
I've aborted 6 ULX tasks which had failed with the Disk usage limit exceeded error on other computers. One of the tasks on that list was cancelled by the server earlier today after the workunit had its 7th error task (with 4 of them being disk limit exceeded). "The ultimate test of a moral society is the kind of world that it leaves to its children." - Dietrich Bonhoeffer |
Send message Joined: 5 Mar 20 Posts: 6 Credit: 4,846,167 RAC: 0 |
Hello, a ULX task from 6 April auto-aborted with "exceeded disk limit" (after 6 hours of CPU time, ouch):

10-Apr-2020 15:00:24 [Universe@Home] Aborting task universe_ulx_510_4959_20000_1-999999_130000_1: exceeded disk limit: 1808.91MB > 1716.61MB

I'm on Linux and, strangely, a wingmate running Windows seems to have had no problem with the same task. |
Send message Joined: 14 Jul 20 Posts: 1 Credit: 163,833 RAC: 0 |
I set the disk limit to 5 GB, because task sizes vary from project to project, and I haven't had any tasks fail since. I don't think I need the full 5 GB, but some tasks in the past somehow needed more space, so I allocate a couple of extra GB just in case. |
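Assuming the post above refers to the BOINC client's overall disk allowance, that setting can also be raised locally without the web preferences by placing a `global_prefs_override.xml` in the BOINC data directory; a minimal sketch (the 5 GB figure mirrors the post, not a project recommendation):

```xml
<!-- global_prefs_override.xml, in the BOINC data directory -->
<global_preferences>
   <disk_max_used_gb>5.0</disk_max_used_gb>
</global_preferences>
```

The client picks the file up after a restart, or via the BOINC Manager's "Read local prefs file" option.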