More information on ATLAS@Home can be found on the LHC@Home pages.
ATLAS task progress (for tasks running on LHC@Home) is still maintained here until it can be moved to LHC@Home.
Here is the list of tasks, with task name (number of jobs done/total number of jobs):
- Task mc16_13TeV AZNLOCTEQ6L1_Ztautau.simul (12341011): 254/20000
- Task mc16_13TeV 3000_3500.simul (12236680): 241/16000
- Task mc16_13TeV 3000_3500.simul (12236667): 243/4000
- Task mc16_13TeV 2500_3000.simul (12236655): 238/16000
- Task mc16_13TeV 2500_3000.simul (12236643): 502/4000
- Task mc16_13TeV 2000_2500.simul (12236631): 1159/16000
- Task mc16_13TeV 2000_2500.simul (12236619): 3780/4000
- Task mc16_13TeV 1000_1500.simul (12236607): 14618/16000
- Task mc16_13TeV ZZvvqq_mqq20.simul (12236561): 156478/199300
No more ATLAS tasks, credit will be transferred to LHC soon
There are no more tasks left to run on ATLAS@Home, so please disconnect from this project and join LHC@Home. The credit will be moved to LHC soon; since various people are on holiday around Easter, it will probably happen 1-2 weeks from now.
Thanks again for all your contributions to ATLAS@Home and please continue on LHC@Home! 7 Apr 2017, 13:59:30 UTC
All remaining ATLAS tasks will be cancelled on Friday 7 April
We are happy to see that many of you have moved over to LHC@Home. There are still a few remaining tasks left over here on ATLAS@Home, and in order to complete the move to LHC we plan to cancel all these tasks on Friday 7 April. If you are still running some ATLAS tasks please try to complete them by then so that you can get the credit.
After we have cancelled all the tasks we can proceed to move the credit over to LHC. 4 Apr 2017, 11:43:45 UTC
ATLAS/LHC consolidation is ready
After several months of extensive testing, we are now ready to complete the move to LHC@Home! The ATLAS app is now out of beta so we encourage you to start crunching ATLAS tasks on LHC. We do not plan to submit any more tasks to ATLAS@Home.
As stated earlier, credit will not be lost! We will move all the credit accumulated here to LHC@Home once all the remaining tasks are finished. To make this easier, it would be very helpful if the email address you registered with is the same on both projects, since this is the only way to match the accounts on each project.
In the LHC project preferences you can select which apps to run, so if you want to run only ATLAS, check the ATLAS box only.
There is an updated version of Yeti's checklist available, so please check this if you experience problems. There is also a dedicated message board for ATLAS.
Thank you for your crunching on ATLAS@Home over the years and we look forward to continued crunching on LHC! 21 Mar 2017, 9:00:17 UTC
As you may know we are working on putting all LHC-related projects under the combined LHC@Home project. The idea is to provide a single entry point for everyone who wants to participate in helping CERN research.
We have now set up a beta app for ATLAS on LHC@Home, so we invite you to try it and provide feedback on the ATLAS message board. You need to enable beta apps in your project preferences to get these WUs.
Currently the WUs there are just for tests and we don't use the results, but soon we will proceed with real WUs. Once everything is working well there, we will ask everyone to migrate over to LHC@Home and stop sending work to the ATLAS@Home project.
Credit will not be lost! We will move all the credit accumulated here to LHC@Home once all the remaining tasks are finished. To make this easier, it would be very helpful if the email address you registered with is the same on both projects, since this is the only way to match the accounts on each project.
Happy continued crunching,
The ATLAS@Home team 23 Jan 2017, 11:55:08 UTC
Thank you all for your contributions to ATLAS@Home this year! It is due to you that ATLAS@Home is one of the most important resources for ATLAS simulation. The project team wishes you a happy holiday and successful 2017!
Most of us will be away for the next two weeks, so response to any problems will be slow. However, we have loaded up the system with WUs, so hopefully there is enough to crunch on until the new year.
The project team 23 Dec 2016, 8:42:51 UTC