Wednesday, November 29, 2006
Status Report for 11/22/06 to 11/28/06
1. OGCE portlets
I’m still updating the source code of the MEME and Jobqueue portlets, adding comments and removing debugging messages. The source code and war files will be exported to the SourceForge repository. The following portlets will be added:
a. memejob-portlet : A portlet for MEME job execution
b. moab-dashboard : A portlet for job queue status query
c. serialjob-portlet : A general portlet to execute a series of jobs
2. Open Science Grid
While waiting for a user certificate from the OSG CA, I’m working on setting up the OSG client on my local machine. I think the installation was successful, but $VDT_LOCATION/post-install/README reports a few warnings I should fix. I will clear those up as soon as possible and then start looking at OGCE on OSG and at CEMon.
Wednesday, November 22, 2006
Status Report for 11/15/06 to 11/21/06
After coming back from Supercomputing '06 in Tampa, I'm looking into the Open Science Grid (OSG) to verify whether the OGCE portal can also work in OSG environments. For this test, I’ve installed the OSG client by following the client installation instructions and have been trying to set up the correct configuration, including the certificates for the GSI settings.
2. OGCE portlets
I’m preparing to export the source code of the MEME and job queue portlets to the CVS repository, adding comments to the code and removing verbose debugging messages.
Wednesday, November 08, 2006
Status Report for 11/1/06 to 11/7/06
During the last week, I was working on the Big Red portal, customizing it for Supercomputing '06 and updating key portlets such as the MEME job submission portlet and the job queue status portlet.
1. MEME job submission portlet
a. File downloading: In portlets, file downloading generally does not work well, because a portlet cannot set HTTP response headers such as "Content-Type" or "Content-Disposition"; the portlet container controls those headers. Using a servlet rather than a portlet is therefore a better fit for this task. The good news is that a servlet and a portlet can share data through the session, so I created a file-download servlet that obtains the proxy information it needs for authorization from the shared session (a rough sketch of such a servlet follows at the end of this list).
b. Output retrieval: The output of a job run by a job manager can be retrieved later through the job handle, which is a URL string starting with "http" or "https". Under the hood, a GASS server is in charge of saving the data and handing it over to the client on request.
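The following is a minimal sketch of the download servlet idea from item (a). The session attribute name ("userProxy"), the request parameter ("file"), and the openRemoteFile() helper are hypothetical placeholders standing in for the real portlet/servlet code; only the header handling and session sharing are the point here.

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    // Download servlet sketch: sets the response headers a portlet cannot set
    // and reads the user's proxy credential from the session shared with the portlet.
    public class FileDownloadServlet extends HttpServlet {

        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            // The portlet is assumed to have put the proxy and the file name
            // into the shared session under these (hypothetical) keys.
            HttpSession session = request.getSession(false);
            if (session == null || session.getAttribute("userProxy") == null) {
                response.sendError(HttpServletResponse.SC_FORBIDDEN, "no credential in session");
                return;
            }
            Object proxy = session.getAttribute("userProxy");
            String fileName = request.getParameter("file");

            // Headers that the portlet container would otherwise control.
            response.setContentType("application/octet-stream");
            response.setHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");

            // Placeholder: fetch the remote file with the user's proxy (e.g. via GridFTP or GASS).
            InputStream in = openRemoteFile(proxy, fileName);
            OutputStream out = response.getOutputStream();
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            in.close();
            out.flush();
        }

        private InputStream openRemoteFile(Object proxy, String fileName) throws IOException {
            throw new UnsupportedOperationException("placeholder for the real transfer code");
        }
    }

Note that the portlet side has to store these attributes in application scope (PortletSession.APPLICATION_SCOPE) so the servlet can see them in the HttpSession of the same web application.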
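As a rough illustration of item (b), the sketch below simply reads whatever the output URL returns over plain HTTP. In the real portal the handle is usually an https URL protected by GSI, so the connection would be opened through the Java CoG Kit rather than a bare HttpURLConnection; the class name and the command-line usage are my own illustration.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Illustration only: fetch job output from the http(s) URL handed back by the service.
    public class OutputFetcher {

        public static String fetchOutput(String outputUrl) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(outputUrl).openConnection();
            conn.setRequestMethod("GET");

            StringBuilder body = new StringBuilder();
            BufferedReader reader =
                    new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line).append('\n');
            }
            reader.close();
            conn.disconnect();
            return body.toString();
        }

        public static void main(String[] args) throws Exception {
            // args[0]: the output URL obtained from the job submission step.
            System.out.println(fetchOutput(args[0]));
        }
    }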
2. Job queue status portlet
a. Job query for the user's own jobs: Since each machine on the TeraGrid has a different user map file for the user certificate, the username (id) can differ from system to system even when the same certificate is used. We can simply submit a GRAM job that runs the “whoami” command before starting the query, to get back the correct username (see the sketch below). Later, I’m planning to maintain an information service that maps usernames across the different machines by using RDF and
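A minimal sketch of the “whoami” probe is below, assuming the Java CoG Kit's GramJob API and a GASS server with write access already running at gassUrl (as in the output-retrieval note above). The class name, the output path, and the wait logic are my own illustration, not the portlet's actual code.

    import org.globus.gram.GramJob;
    import org.globus.gram.GramJobListener;
    import org.ietf.jgss.GSSCredential;

    // Sketch: discover which local account the grid-mapfile on a given machine
    // assigns to our certificate by running "whoami" through GRAM and reading
    // its stdout back from a GASS URL.
    public class WhoAmIProbe implements GramJobListener {

        private final Object done = new Object();

        public String probe(GSSCredential proxy, String contact, String gassUrl) throws Exception {
            // Send the remote stdout to a (hypothetical) path on our GASS server.
            String rsl = "&(executable=/usr/bin/whoami)"
                       + "(stdout=" + gassUrl + "/whoami-output)";

            GramJob job = new GramJob(proxy, rsl);
            job.addListener(this);
            job.request(contact);          // e.g. the machine's gatekeeper contact string

            synchronized (done) {
                done.wait(60000);          // wait for the DONE/FAILED callback
            }
            // The mapped username is whatever "whoami" wrote to the GASS output file;
            // OutputFetcher is the retrieval sketch shown earlier.
            return OutputFetcher.fetchOutput(gassUrl + "/whoami-output").trim();
        }

        public void statusChanged(GramJob job) {
            int status = job.getStatus();
            if (status == GramJob.STATUS_DONE || status == GramJob.STATUS_FAILED) {
                synchronized (done) {
                    done.notifyAll();
                }
            }
        }
    }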