Los Alamos National Laboratory
APEX: Alliance for Application Performance at Extreme Scale
Collaborating to design, acquire, and deploy advanced technology high performance computing systems

APEX 2020 Request for Proposal (RFP) No. 387935

All proposals are due on or before 10:00 a.m. Mountain Time on Thursday, November 17, 2016.

Interested parties are invited to submit a proposal for two (2) subcontracts for delivery of the Crossroads and NERSC-9 Computer Systems, plus two (2) non-recurring engineering (NRE) subcontracts. Subcontracts that may result from this RFP are in support of the Alliance for Application Performance at Extreme Scale (APEX) composed of the following High Performance Computing organizations:

  • The New Mexico Alliance for Computing at Extreme Scale (ACES):
    • Los Alamos National Laboratory (LANL)
    • Sandia National Laboratories (SNL)
  • The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory (LBNL)

Interested parties are advised to monitor this website for potential APEX 2020 RFP amendments and other APEX 2020 RFP information updates. The Contract Administrator may notify interested parties of updated APEX 2020 RFP information via e-mail; however, there is no obligation to do so.

It is the responsibility of all interested parties to monitor this website for current APEX 2020 RFP information.

Interested parties must submit in writing all communication regarding the APEX RFP (questions, comments, etc.) to the Contract Administrator.

APEX 2020 RFP Questions and Answers

Q1: Can one vendor be selected to deliver the Crossroads system and can a different vendor be selected to deliver the NERSC-9 system?
A1: Irrespective of any language in the RFP that might be ambiguous on this point, Offerors may rely on the statement in the RFP Invitation Letter (2nd page) that LANS and UC each reserve the right to make awards to separate Offerors.


The following questions are regarding Subsection 3.5.1 of the RFP Technical Requirements document:

Q2: Should an Offeror submit the SSI spreadsheet with its other submission material? Should all configurations for which benchmark data is being provided be included in one single SSI spreadsheet with multiple tabs, or should we include a separate SSI spreadsheet for each configuration?
A2: The SSI spreadsheet should be submitted with the Offeror's other submission material. Each additional configuration should be submitted as a separate spreadsheet file using the formatting provided in the download; the Offeror may submit multiple spreadsheets where different machine configurations are being proposed. The name of each individual spreadsheet file should clearly relate to the machine variant in the main proposal response. The file(s) should be provided with the other submission material but logically separate (in a different directory). Submissions may be provided on read-only media (CD or DVD) or via the LANS file transfer service as a single file archive (e.g., a ZIP file).

Q3: Should the benchmark modifications or new benchmark variants be provided together with the other material, or should they be provided on a separate CD or USB drive?
A3: The Offeror's benchmark modifications or new variants should be provided with the other submission material but logically separate (in a different directory). The directory name should reflect the modified benchmark and/or variants. Submissions may be provided on read-only media (CD or DVD) or via the LANS file transfer service as a single file archive (e.g., a ZIP file).
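For illustration only, a minimal sketch in Python of how submission material might be packaged as a single ZIP archive, with the SSI spreadsheet(s) and modified benchmarks kept in logically separate directories as described in A2 and A3. All directory and file names here are hypothetical and are not prescribed by the RFP.

# Illustrative only: hypothetical directory and file names.
# Packages an Offeror's submission so the SSI spreadsheet(s) and any
# modified benchmarks sit in logically separate directories inside a
# single ZIP archive.
import zipfile
from pathlib import Path

SUBMISSION_ROOT = Path("apex2020_submission")   # hypothetical layout
ARCHIVE_NAME = "apex2020_submission.zip"

def build_submission_archive(root: Path, archive_name: str) -> None:
    """Walk the submission tree and write every file into one ZIP archive."""
    with zipfile.ZipFile(archive_name, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(root.rglob("*")):
            if path.is_file():
                # Store paths relative to the root so the directory
                # separation (proposal/, ssi/, benchmarks/) is preserved.
                zf.write(path, arcname=str(path.relative_to(root)))

if __name__ == "__main__":
    # Expected (hypothetical) layout before archiving:
    #   apex2020_submission/proposal/technical_proposal.pdf
    #   apex2020_submission/ssi/ssi_config_a.xlsx
    #   apex2020_submission/ssi/ssi_config_b.xlsx
    #   apex2020_submission/benchmarks/modified_benchmark_x/...
    build_submission_archive(SUBMISSION_ROOT, ARCHIVE_NAME)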

Q4: Should an Offeror prepare a separate document describing predicted or extrapolated results, or should this information be included in the 200 page technical proposal? If a separate document can be provided, is there a page limit or format required for this separate document?
A4: The Offeror's methodology should be documented in the 200-page proposal. Summarized results/predictions relevant to the SSI should also be included.


The following questions are regarding RFP Form A Schedules of Proposed Quantities and Prices:

Q5: Regarding the Schedule entitled "Sum of Proposed Pricing All Pricing Schedules," the last line reads, "Sum of All Proposed Pricing". Will the amount entered on this last line be used as an Offeror's total evaluated price, or will an Offeror's evaluated price be a subset of the above 3 Schedules? If it will be a subset, which Schedules will be included to form an Offeror's evaluated price?
A5: Offerors should transfer the totals from each of Schedules 1.1, 1.2, 1.3, 2.1, 2.2, and 2.3 to the table at the top of RFP Form A (Sum of Proposed Pricing All Pricing Schedules). An Offeror's total evaluated price will be based on the Sum of All Proposed Pricing; however, an Offeror's pricing may be adjusted as appropriate by the Source Evaluation Committee in accordance with Section 4.c.2. of the RFP Instructions to Offerors.

Q6: Regarding Schedules 1.3 and 2.3, Maintenance and Support, the schedule asks for both 7x24 and 5x9 pricing, and the last line indicates "Total.... Options Pricing (this price shall be the sum of all amounts above)". Wouldn't that amount overinflate an Offeror's evaluated bid amount, since one would only choose one or the other maintenance package?
A6: The evaluation of pricing on Options should be clear to Offerors based on Section 4.c.2. of the RFP Instructions to Offerors.

Q7: If an option requires some portion of NRE funding, how and where should that option cost be reflected?
A7: How NRE will affect pricing of any aspects of the system build or of options is not known until the scope of work for the NRE subcontract is proposed and the NRE work is complete. Therefore, pricing proposed for the builds and for options should not include NRE funding.

Q8: 'RFP-FORM-B-SCHEDULE-OF-PROPOSED-MILESTONES-AND-PAYMENTS' shows an anticipated funding stream for each system and NRE program, on which Offerors are asked to deliver a table of milestones and associated payments for each system subcontract and for each NRE subcontract. To guide us in creating this pricing information, can you please clarify what the NRE portion of the anticipated funding stream shown in this document is?
A8: The RFP Technical Requirements Document, Section 4 Non-Recurring Engineering, provides that “It is anticipated that the NRE subcontracts will be approximately 10%-15% of the combined Crossroads and NERSC-9 system budgets.”

Q9: The Crossroads system shows a funding stream beginning in FY2016. Can you clarify that this refers to a Government FY, and also, since Government FY 16 is now past, can you indicate when we should target the start date of the NRE programs?
A9: Yes, the fiscal years indicated are Government Fiscal Years. Offerors should consider the FY16 funding to be available as funds that will roll over to FY17. NRE subcontracts are anticipated to start in the 3rd-4th quarters of FY17.

Q10: Please provide information on the requirement in Section 3.4.7 regarding "serialized namespaces".
A10: Serializing and restoring a namespace provides a way to manage the data associated with a workflow. The feature could be used to efficiently move a workflow's data, whether between tiers or within a tier. For example, it may be necessary to move a set of files associated with a workflow to or from fast on-platform storage. An inefficient mechanism for moving those files would be to move each file independently via FTP. Another possible mechanism would be to serialize the entire namespace so that all of the underlying metadata and storage blocks are moved in a small number of transfers.
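For illustration only, a minimal Python sketch contrasting the two mechanisms described above: moving each file independently versus serializing the whole namespace into a single archive that can be moved in one transfer. The paths and the use of a tar archive as the serialized form are assumptions for the example; an actual implementation would be specific to the Offeror's storage system.

# Illustrative only: contrasts per-file movement with moving a serialized
# namespace in a single transfer. Paths and the tar format are hypothetical.
import tarfile
from pathlib import Path

WORKFLOW_DIR = Path("/scratch/workflow_123")        # hypothetical source tier
ARCHIVE_PATH = Path("/fast_tier/workflow_123.tar")  # hypothetical target tier

def move_files_individually(src: Path, dst_dir: Path) -> int:
    """Inefficient mechanism: one transfer (copy) per file."""
    count = 0
    for f in src.rglob("*"):
        if f.is_file():
            target = dst_dir / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            target.write_bytes(f.read_bytes())
            count += 1          # each file is a separate transfer
    return count

def serialize_namespace(src: Path, archive: Path) -> None:
    """Serialize the whole namespace (directory structure plus file data)
    so it can be moved between tiers in a small number of transfers."""
    with tarfile.open(str(archive), "w") as tf:
        tf.add(str(src), arcname=src.name)

if __name__ == "__main__":
    serialize_namespace(WORKFLOW_DIR, ARCHIVE_PATH)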

Q11: Section 5.3, Test Systems, of the RFP Technical Requirements Document requests a number of computational nodes for test systems (Application Regression and Software Development). Since vendors have different architectural node designs, shouldn't the test systems be sized relative to the node count of the compute partition of the main system instead of by a specific node count?
A11: The node counts selected by the APEX team (200 for the Application Regression test system and 50 for the Software Development test system) are not based on the scale/performance of the main system. The size of the Application Regression testbed is based on the anticipated number of concurrent application developers and automated software regression suites. The Software Development testbed is expected to provide an environment for patch testing, software environment testing, and potentially joint collaboration with the selected vendor. The size of this testbed is based on a representative number of nodes sufficient to potentially replicate any issues at small scale.

Q12: Regarding the Militarily Critical Technical Data Agreement found under the SPARC Benchmark Sandia section at http://www.lanl.gov/projects/apex/: What level of export control classification do the code and data fall into? If they are ITAR controlled, what US Munitions List (USML) category classification do they fall into?
A12: SPARC is ITAR Cat. XVI. The data that comes out of it will most likely take the same category, and this in no way changes the vendor's responsibility to protect it at the appropriate level. The USML category is ITAR Cat. XVI. The USML is Part 121 of the ITAR and consists of 21 categories of protected items, information, and technology.

Q13: Is hard-drive retention required during the term of the subcontract?
A13 (ACES): The Offeror shall agree that ACES will not return any failed storage media installed under the subcontract as part of the Return Material Authorization (RMA) process. Due to security requirements, ACES personnel will destroy this storage media upon its removal from operation. Failed disks and similar media storage devices cannot be returned to the Successful Offeror (Subcontractor). In its description of support services, an Offeror shall describe the process it will require to replace destroyed media and any expectations it has of ACES for documenting failed hardware.
A13 (NERSC): All media that have contained sensitive data (such as passwords) will be accounted for and destroyed, with documentation provided that confirms the drives’ destruction. Individual drives of a RAID array may not be subject to the hard-drive retention policy if the data on the drive is not deemed sensitive.

Q14: Some of the applications that can be obtained from the ACES laboratories are ITAR or export controlled. How should a vendor include any diffs or code changes from these applications in its proposal?
A14: If Offerors wish to return any code modifications or source code from ITAR-labeled acceptance codes from the ACES laboratories in their responses, these should be provided in a separate bundle/DVD that is clearly marked as modified acceptance application source code. The Offeror should not include application source code from the acceptance codes in its printed/written response to the APEX 2020 RFP.
