diff --git "a/database/tawos/deep/DM_deep-se.csv" "b/database/tawos/deep/DM_deep-se.csv" new file mode 100644--- /dev/null +++ "b/database/tawos/deep/DM_deep-se.csv" @@ -0,0 +1,5382 @@ +"issuekey","created","title","description","storypoint" +"DM-9","03/05/2014 09:49:35","Open up LSST software mailing lists","We all benefit from making LSST software development as open as possible and conducive to outside volunteer contributions (*). One way to increase community involvement is to open up our development mailing lists to the public, analogous to the way other open source projects do. For example, we could have: * software-devel@lsstcorp.org: the development mailing list, equivalent to current lsst-data * software-users@lsstcorp.org: the users mailing list, equivalent to current lsst-dm-stack-users mailing list (but it could possibly be replaced by StackOverflow/Confluence Questions) * lsst-dm@lsstcorp.org: internal, DM-staff only mailing list, for the *rare* discussions/notices that should go out to staff only. (*) Though we don't rely on them for meeting the project specs (legally required disclaimer :) ).",1 +"DM-52","03/07/2014 21:53:05","Qserv configuration - detailed design","Detailed design covering how all Qserv components will be configured for runtime. ",3 +"DM-68","03/07/2014 22:31:39","Qserv should fail when table not registered in CSS","""Table does not exist"" message from CSS is currently ignored and query proceeds. Example, a query ""select count(*) from t"" where t does not exist will successfully return 0 rows.",0 +"DM-71","03/07/2014 22:37:23","Data Distribution Design v1","Need to come up with detailed design covering how we will deal with data distribution: managing multiple replicas, recovering from faults, adding new nodes to the cluster, registering new data from L2 ingest and user data (L3).",8 +"DM-78","03/11/2014 10:48:10","Save a git-branch when a forced push is detected","Create a gitolite hook that will save a branch when a forced push is detected. E.g., if we have a ticket: 'tickets/DM-AAAA' and someone rebases it and pushes with '--force' before applying the update --- then the hook will branch off the old state into (say): backups/tickets/DM-AAAA/NNNN where NNNN is a monotonically increasing number (per branch).",1 +"DM-98","03/13/2014 12:59:23","clean up isr utility code","There is some commented code in isr.py. This should be removed or updated so that it works.",2 +"DM-148","03/13/2014 15:29:35","Improve naming of getters in AmpInfoTable","The names of the methods to get values from a record on AmpInfoCatalog are potentially confusing. This is because the convention is to call the getters get[attributename]. We could change the method names in the AmpInfoCatalog, or add methods in the SWIG wrapper.",1 +"DM-199","03/14/2014 10:23:59","Develop new master-worker result system ","Reimplement how results are returned from worker to the czar. Currently it relies on mysqldump, which is fairly inefficient. This is related to DMTF-1650-045",20 +"DM-202","03/14/2014 10:30:34","Qserv: unit testing (query execution) ","Design and build toy prototype of a test framework for testing query execution module. This might require a mock framework, as we want to be able to test things in isolation, without testing everything around the query execution module at the same time. 
This is related to DMTF-16570-21.",8
+"DM-213","03/14/2014 14:36:50","Setup multi-node testbed","It'd be useful to test Qserv using Winter2014 or Summer2014 data set on a multi-node cluster, just to exercise all pieces of the software and double check we are not missing anything.",5
+"DM-224","03/14/2014 14:57:43","Switch to MariaDB","We should switch Qserv to the MariaDB Foundation based MySQL.",3
+"DM-228","03/14/2014 20:30:31","Setup dev test environment","Setup whole Qserv environment, including installing data set, and validate it by running some simple queries. Suggest changes/improvements as appropriate.",8
+"DM-242","03/16/2014 07:11:32","switch from '.' to '_' in afw::table fields","We've been mapping '.' to '_' in afw::table I/O, which unnecessarily complicates lots of things. We'd like to switch to using '_' in the field names themselves, which requires ending this mapping in I/O, but we need to be backwards compatible. So we'll add a version to the FITS headers, and continue the mapping if the version is not present or is less than some value. Until we do this, the new field names being used in meas_base won't round-trip.",3
+"DM-271","03/19/2014 15:49:27","Setup the new Buildbot CI system","Setup the Buildbot 0.8.8 testbed for the DM environment. This includes: (1) setting up slaves on the set of common OS on which the DM stack runs; (2) creating a new continuous integration slave using the new eupsPkg-based build and distribution support, Definition of done: * Every git change of master should trigger a build of master * If a build failed, an e-mail will be sent to lsst-data (if the build succeeds, nothing happens) * Failures due to internal buildbot issues (e.g., config problems, transient system availability issues, timeouts, etc.) should go to the buildbot owner. * Allow user-triggered builds via web page (with specified refs to be built), with a common user (until we get LSSTC LDAP directory hooked up). It's understood that locking may not be fully implemented/fleshed out in this story. * It should be possible for a user on lsst-dev to easily setup the stack for either a failed or a successful build. ",100
+"DM-272","03/19/2014 16:03:50","Move TCT-relevant twiki documentation to Confluence","Congregate all the trac TCT-relevant documents (standards, policies, guidelines, meeting history) onto Confluence.",2
+"DM-273","03/19/2014 16:23:28","Develop and then create the organizational structure for DM Confluence space","Before we start populating the DM Confluence space with active pages, we should define an overall organizational structure/taxonomy.",1
+"DM-278","03/19/2014 22:16:00","Improve handling errors occuring in AsyncQueryManager","AsyncQueryManager is initialized based on configuration file, if the configuration is invalid, an exception should be thrown (eg in _readConfig()) and gracefully handled upstream.",1
+"DM-280","03/20/2014 10:18:58","clean up multiple aperture photometry code","I've been doing some minor work on the HSC-side ApertureFlux algorithm, and I wanted to record some concerns here (from both me and RHL) that should be addressed in the new meas_base version: - We should consider merging ApertureFlux and EllipticalApertureFlux into the same algorithm (with a config field to choose whether to use elliptical apertures). We could still register it twice, with a different default config value, and this should eliminate a lot of code duplication. 
We could also consider having them inherit from a common base class (instead of having EllipticalApertureFlux inherit from ApertureFlux, as is done now). - We should test that the threshold at which we switch from Sinc to naive apertures is obeyed exactly. - We should create a flag for the failure mode in which an aperture cannot be measured because we go off the edge of the image, and test that it appears at the right point. If possible, we should set this flag and measure what area we can within that aperture, instead of just bailing out. - (Somewhat off-topic) We should consider having utility functions on SourceSlotConfig to set all slots to None for use in unit tests.",5
+"DM-290","03/21/2014 17:40:19","Eliminate dependence of query analysis on parser and antlr","I would like to write and compile query analyzer code completely independently of the parser and ANTLR (transitively). This doesn't seem to work right now. This is not currently possible. This might take any where from a day to a week. (I'm not sure if we can finish anything in half a day, if you include the testing, review, feedback, and revision process, but perhaps unit testing will make that faster). Updates to follow after the scope is estimated. Dependencies to be broken: query --> parser, antlr (due to predicate depending on antlr nodes) qana --> parser, antlr ",8
+"DM-296","03/22/2014 16:51:48","fix namespaces in all Qserv core modules","This was suggested by the code review for ticket 1945 (https://dev.lsstcorp.org/trac/wiki/SAT/CodeReviews/1945), pasted below: common/src/*: While it's not required by the coding standards, I'm a big proponent of using namespace scopes in .cc files, which usually save you from needing namespace aliases and will certainly save you from having prefix every declaration with qserv::. At some point I'd recommend changing the header file extension from .hh to .h to match the rest of the LSST DM code, unless it's a big backwards compatibility issue. (transferred from trac ticket 2528)",5
+"DM-298","03/22/2014 17:03:03","restarting mysqld breaks qserv","Restaring mysqld results in unusable qserv (even if the restart happens when qserv is completely idle). The error message is: ERROR 2013 (HY000): Lost connection to MySQL server during query This happens most likely because qserv caches the connection, which becomes invalid when server is restarted. I am guessing the same will happen when there is a long period of inactivity (the connection times out). (transferred from trac 2853)",1
+"DM-309","03/24/2014 11:54:05","Jira for Qserv","Jira setup for Qserv, includes things like adding new tasks, transferring tasks from trac, epic/story/task division, assigning story points, setting scrum board, just learning things and more...",8
+"DM-313","03/24/2014 13:34:00","cleanup includes in Qserv core modules","Includes need cleanup: group into standard lib, boots and local, sort as appropriate etc. Also, unify forward declarations.",2
+"DM-321","03/25/2014 14:07:02","Re-think thread.cc and dispatcher.cc python interface","The mess of thread.cc and dispatcher.cc need to be re-thought and re-designed so that the interface is smaller and more obvious. ",2
+"DM-322","03/25/2014 15:12:41","Trim python importing by czar in app.py","Clean up the way modules are imported in qserv master, use relative import when appropriate instead of lsst.qserv.master. 
(migrated from Trac #2369)",2
+"DM-326","03/26/2014 12:42:16","Libraries being built in lib64 on OpenSUSE, when EUPS tables assume lib","A report from Darko Jevremovic : {quote} Hi Mario, I managed to build stack v8 on OpenSuse13.1 There were standard problems with lib/lib64 - namely system builds libraries in $PREFIX/lib64 and some programs are hard wired for $PREFIX/lib if you could change the last line of mysqlclient-5.1.65+3/ups/eupspkg.cfg.sh from (cd $PREFIX/lib && ln -s mysql/* . ) to ( cd $PREFIX && if [ ! -f ""lib"" ] ; then ln -sf lib64 lib; fi &&cd $PREFIX/lib && ln -s mysql/* . ) or something along that line (am not sure whether the syntax would work). Also if you could add in the same manner to ups/eupspkg.cfg.sh ( cd $PREFIX && if [ ! -f ""lib"" ] ; then ln -sf lib64 lib; fi) for the following packages: minuit2 gsl cfitsio wcslib {quote} ",1
+"DM-334","03/28/2014 12:16:46","Cut Qserv release","It'd be very useful to have fully functioning Qserv release with the latest set of changes (build, packaging, CSS, Daniel's fixes etc) during the Hackathon week.",2
+"DM-335","03/28/2014 14:04:02","Migrate std::lists to std::vectors","Suggested by Andy when reviewing DM-296, discussed at Qserv mtg 3/27. std::list --> std::vector * why? Default now is vector, iterating over vector is much more efficient than over list * revisit on case by case bases, do not blindly replace * preferred solution: typedef, and name it in a way that conveys the intent (e.g., might call it a ""container""), underneath use vector",8
+"DM-337","03/28/2014 14:06:55","removed dead code in stringUtil.h","Remove obsolete strToDoubleFunc (and more) in util/stringUtil.h.",1
+"DM-354","03/31/2014 13:45:06","Add cameraGeom overview to Doxygen documentation","The CameraGeom package needs an overview page (part of afw's main.dox) as part of the Doxygen documentation. I think it's up to Simon or me to add this.",2
+"DM-355","03/31/2014 15:00:36","Install and tag multiple Qserv versions on the same distserver","Done in DM-366",0.5
+"DM-365","04/01/2014 16:02:27","Integration tests dataset should be packaged in eupspkg","A qserv-integration-tests package should be created : - it would allow to manage easily, in ups/qserv.table, tests version for a given Qserv version. - it would allow to install Qserv dependencies related to testing, like partition (and other data ingest code which may arrive.",3
+"DM-366","04/01/2014 16:05:42","Refactor install/distribution procedures using lsst-sw"," Here's Andy Salnikov remark, during #3100 review : https://dev.lsstcorp.org/trac/ticket/3100#comment:18",5
+"DM-370","04/01/2014 23:16:44","improved how default values for CSS are handled","Need to improve how defaults are handled in qserv_admin. There seems to be some desire to warn when values are not set--how about setting defaults and just printing what configuration is being used? If this is something human-created, we should have reasonable defaults and not bother the user, unless no default is viable. I think we should only be strict on machine-generated input, where we would like to catch bugs as soon as possible. 
(This came up in the review of DM-56, the review comments are captured in DM-225)",5
+"DM-372","04/02/2014 09:55:07","fix testQueryAnalysis","5 tests fail in the testCppParser.",2
+"DM-380","04/03/2014 13:47:41","loadLSST bug(s) for csh, ksh","A flaw in the v8.0 loadLSST scripts (and/or in eups/bin/setups) causes the following errors: 1) When using ksh: {code} $INSTALL_DIR/loadLSST.ksh ksh: /path/to/INSTALL_DIR/eups/bin/setups.ksh: cannot open [No such file or directory] {code} And indeed, there is no eups/bin/setups.ksh file. 2) When attempting to run the installation demo (v7.2.0.0): {code} $> printenv SHELL /bin/tcsh {code} [The same issue appears with csh, unsurprisingly.] {code} $> source /path/to/install_dir/loadLSST.csh $> cd /path/to/demo $> setup obs_sdss $> ./bin/demo.sh ./bin/demo.sh: line 7: /volumes/d0/lsst/stack80/eups/*default*/bin/setups.sh: No such file or directory ./bin/demo.sh: line 12: setup: command not found {code} After hand-editing the demo.sh script to omit the ""/default"" string from the offending line, the demo runs normally to completion. Note that everything works fine for bash with v8.0, which is what I tested awhile back. ",1
+"DM-384","04/03/2014 18:42:33","Add Versioning to SourceTable in lsst::afw::table","Add version to afw::table::SourceTable. Persist that version number to fits file when the table is saved, and restore when the table is restored. Tables created and saved to disk prior to this modification will have the version number 0, by default. Tables created with the S14 version will have the version number 1. This change is to enable a new version of slots and field naming conventions as needed by the Measurement Framework overhaul, at the same time allowing current clients of SourceTable to continue to function. The work to define and persist the slots depending on the version will be on a separate issue. Should not appear as an alterable member of the metadata, but should be saved with the metadata and reloaded when the file is reloaded. getVersion and setVersion methods will be used to allow clients to alter this number.",1
+"DM-405","04/08/2014 02:43:27","Write Linux Standard Base - compliant init.d scripts","Qserv services init.d scripts have to rely on LSB, in order to work on multiple systems. Remark : xrootd has to be launched as a background process (i.e. with a & at the end). But this always send of return code equal to 0, even if xrootd fails to start, a shell function wait_for_pid will be implemented in xrootd init.d scritps to solve that (inspired from mysqld init.d script).",5
+"DM-429","04/08/2014 13:00:59","Make NoiseReplacer outputs reproduceable","We need a way to get back the noise-replaced Exposure as it was when a particular source was measurement, after the measurement has been run, without having to run noise-replacement on all the previous objects again. There is already code in afw::math::Random to output its state as a string; I think we should probably just save this string in the output catalog. This will require some API changes to allow the NoiseReplacer to modify the schema and set a field in the output records.",3
+"DM-435","04/08/2014 14:05:36","add aperture-correction measurement code to the end of calibrate","At the end of CalibrateTask, we'll want to compute the PSF and aperture fluxes of the PSF stars, and send those to the PSF model to be stored and interpolated (using the featured added via DM-434). 
We'll also need to run any other flux measurement algorithms that need to be tied to the PSF fluxes on these same stars; because these can be somewhat slow, we probably want to limit these measurements to only the PSF stars, rather than requiring all these algorithms to be run as part of calibrate.measurement. The relationships between these fluxes and the PSF fluxes will be additional fields to be added to and interpolated by the PSF. The HSC implementation of this work (as well as that of DM-436) was done on issue HSC-191: https://hsc-jira.astro.princeton.edu/jira/browse/HSC-191 There were changes to many packages, but the relevant ones for LSST are: https://github.com/HyperSuprime-Cam/afw/commit/057fb3c0581c512d5664f1883a72da950c9eae9d https://github.com/HyperSuprime-Cam/meas_algorithms/compare/HSC-3.0.0...u/jbosch/DM-191 https://github.com/HyperSuprime-Cam/pipe_tasks/compare/4c3a53e7238cbe9...u/jbosch/DM-191",8
+"DM-446","04/09/2014 18:33:25","Setup PeakLikelihoodFlux with new Algorithm Framework","Move PeakLikehoodFlux to meas_base framework ",1
+"DM-468","04/11/2014 10:30:47","Alias measurement.plugins to measurement.algorithms","The config item in the old measurement task, measurement.algorithms was changed to measurement.plugins in meas_base. The creates a backward compatibility issue for code which refers to this class member. Jim's suggested fix is to alias plugins with algorithms in the new measurement task.",1
+"DM-470","04/11/2014 14:21:27","Rework exceptions in css (python side)","Rework exceptions in css/KwInterface.py: split into key-value related exceptions, possibly moving the rest that deals with db/tables into client. This came up in the CSS review, see DM-225: ""CssException feels a bit out of place....""",1
+"DM-481","04/14/2014 03:48:47","Unit tests install directory","Hello, Tests target seems to be correctly defined in core/modules/SConstruct (cf. getTests() function), but tests doesn't seems to be nor built or runned. In DM-58 branch i've introduced a few line of code which now build the tests. For now, tests binaries are located in build/moduleName/ directory, do you think we should let the tests here or install it : - in the Qserv install directory ? - in a tests/ subdirectory of Qserv install directory ? - other solution ? - do we set an option to scons procedure in order to build, and run, these unit tests ? Thanks, Fabrice",0
+"DM-488","04/15/2014 13:24:07","Make JIRA notification e-mail more useful","From HipChat/Data Management: [12:09] Mario Juric: @jbosch @KTL @KSK Could you double-check if any of you got an e-mail from Jira on Saturday (Apr 12th) re issue DM-78 (I made you reviewers, but it looks like you weren't notified)? [12:10] K-T Lim: I don't recall and can't determine now; it would have been deleted (irrevocably). [12:10] Simon Krughoff: I did get an email. [12:10] Jim Bosch: @mjuric, ah, it appears that I actually did. The fact that I was a reviewer was just buried, and I didn't notice it. [12:10] Simon Krughoff: I must have missed that I was a reviewer. [12:10] Mario Juric: OK, thanks! That gives me not one, but two useful data points (#1 -- emails work, #2 -- they're useless :) ). [12:12] Simon Krughoff: I'm not sure why they are useless. The emails from trac were a very important part of my workflow as far as being notified of review responsibility goes. Maybe it's just the volume from Jira. [12:14] Jim Bosch: Yeah, same here. Though the volume from JIRA hasn't been so bad, so I don't think that's it. 
Maybe my brain just has to get used to the new email format. [12:14] K-T Lim: (In my case, I'm mostly paying attention to the RSS feed although the mailbox serves as a backup.) [12:22] Robert Lupton: One of the things that made gnats a good bug tracker was that the emails contained the right amount of information (I did have source code...), and trac was pretty good too when we tuned it; bugzilla always used to be awful. I bet we can fiddle with Jira to make its mail more useful; I don't just mean filtering what it sends, but making sure that each email is self contained, but not too long",1
+"DM-505","04/16/2014 13:16:34","improve initialization of kvMap in testQueryAnalysis","Build the kvMap at build-time and embed it into the executable. (this was brought up in DM-225)",1
+"DM-506","04/16/2014 13:19:40","improve generating kvMap in testFacade.cc","Generating the kvmap file, and pasting it into a string inside the test program. (this was brought up in DM-225)",1
+"DM-508","04/16/2014 13:23:37","shorten internal names in zookeeper","rename DATABASE_PARTITIONING to PARTITIONING rename DATABASES to DBS",2
+"DM-509","04/16/2014 13:25:28","rename ""dbGroup"" to ""storageClass"" in CSS metadata","It is meant to be used to indicate L1, L2, L3... At Qserv design week we decided to rename it (original plan was to remove it all together) ",1
+"DM-510","04/16/2014 13:39:14","Tweak metadata structure for driving table and secondary index","There seem to be confusion about driving table and secondary index. At the moment in zookeeper structure we have {code} /DATABASES//objIdIndex /DATABASES//TABLES//partitioning/secIndexColName /DATABASES//TABLES//partitioning/drivingTable /DATABASES//TABLES//partitioning/latColName /DATABASES//TABLES//partitioning/lonColName /DATABASES//TABLES//partitioning/keyColName {code} Issues to think about: * we can't call it objIdIndex, it is too lsst-specific. * drivingTable and keyColName - perhaps these should be at database level, which means we would only allow one drivingTable and one secondary index per database? * or, maybe instead of database level, it is a partitioning parameter? Note that two databases might use different name for secondary index or driving table, yet they might be joinable. That argues for introducing a new group, something like /DATABASE/partitioning in addition to /DATABASE_PARTITIONING. * consider renaming drivingTable to keyTable * do we really need secIndexColName and keyColName? Can't we get rid of one, and rename to keyColName? ",2
+"DM-513","04/16/2014 14:29:19","fix threading issues in CSS watcher","Fix problems with threads in watcher.py brought up in DM-225 by Serge: * A thread per database doesn't scale * There is a thread leak when a database is deleted * There is another design problem, in that each database thread looks like it is holding on to the same lsst.db.Db instance under the hood. I don't remember any consideration for thread safety from the lsst.db code when I reviewed it. Note for one that it is not safe to use a MySQL connection simultaneously from multiple threads (and I seem to recall that you are caching a connection inside Db instances). In practice, even the Python GIL may not save you, since calls into C code (i.e. the mysql client library) may very well release it.",8
+"DM-518","04/16/2014 16:52:18","Rework exceptions in qserv client","There is a bunch of (I think) unnecessary translation from KvException to QservAdmException. 
Can't you just handle printing KvException in CommandParser.receiveCommands(), and get rid of the CSSERR error code? (This is in /admin/bin/qserv-admin.py) (this came up in DM-225)",1
+"DM-520","04/16/2014 17:21:32","Remove old partitioner/ loader and duplicator","Once Fabrice has migrated the integrated tests towards using the new partitioner and duplicator, we should delete the old partitioner/duplicator (in {{client/examples}}).",1
+"DM-521","04/16/2014 17:49:23","Confusing error message (non-existing column referenced)","A query that references non existing column for non-partitioned table results in a confusing message: ""read failed for chunk(s): 1234567890"". To repeat, run something like {code} SELECT whatever FROM ; {code} Similar error occurs when we try to reference non-existing table, try something like: {code} SELECT sce.filterName FROM StrangeTable AS s, Science_Ccd_Exposure AS sce WHERE (s.scienceCcdExposureId = sce.scienceCcdExposureId); {code} ",5
+"DM-527","04/17/2014 12:48:33","make Image construction robust against integer overflow","I just fixed a bug on the HSC side (DM-523) in which integer overflow in the multiplication of width and height in image construction caused problems. We should backport this fix to LSST.",1
+"DM-530","04/17/2014 21:31:18","Table column names in new parser","Running tests (qserv-testdata.sh) on pre-loaded data I have observed that many test fail for the only reason that the column names in the dumped query results are different between mysql and qserv. Here is an example of query reqult returned from mysql: {code} mysql> SELECT sce.filterName, sce.field, sce.camcol, sce.run FROM Science_Ccd_Exposure AS sce WHERE sce.filterName = 'g' AND sce.field = 670 AND sce.camcol = 2 AND sce.run = 7202; +------------+-------+--------+------+ | filterName | field | camcol | run | +------------+-------+--------+------+ | g | 670 | 2 | 7202 | +------------+-------+--------+------+ {code} and this is the same query processed by qserv: {code} mysql> SELECT sce.filterName, sce.field, sce.camcol, sce.run FROM Science_Ccd_Exposure AS sce WHERE sce.filterName = 'g' AND sce.field = 670 AND sce.camcol = 2 AND sce.run = 7202; +----------+----------+----------+----------+ | QS1_PASS | QS2_PASS | QS3_PASS | QS4_PASS | +----------+----------+----------+----------+ | g | 670 | 2 | 7202 | +----------+----------+----------+----------+ {code} We discussed this already with Daniel yesterday and at qserv meeting today, here I just want to collect what we know so far so that we can return to this again later. As Daniel explained to me this is the result of the new parser assigning aliases to the columns which do not define aliases for themselves. This helps with tracking query proceeding through the processing pipeline. Daniel's observation is that different database engines may assign different names to result columns (or some may not even assign any names), there is no standard in that respect so there is no point in trying to follow what one particular implementation does. Additionally there are issues with conflicting column names and names which are complex expressions. Difference in column names breaks our tests which dump complete results including table header. The tests could be fixed easily, we could just ignore table headers when dumping the data. More interesting issue is that there may be use cases for better compatibility between mysql and qserv including result column naming. 
In particular standard Python mysql interface allows one to use column names to retrieve values from queiry result. If qserv assigns arbitrary aliases to the columns it may confuse this kind of clients. This issue depends very much on what kind of API qserv is going to provide to clients. If mysql (wire-level) protocol is going to be the main API (which would allow all kinds of mysql clients to talk to qserv directly) then we should probably think more about compatibility with mysql. OTOH if we decide to provide our own API then this may not be an issue at all (but we still need to fix current test setup which is based on mysql). We probably should discuss API question at our dev meeting.",5
+"DM-546","04/22/2014 13:20:30","scons rebuilds targets without changes","I'm seeing something strange when I run scons from current master - running 'scons install' after 'scons build' re-compiles several C++ files even though nothing has changed between these two runs: {code:bash} $ scons build scons: Reading SConscript files ... ... scons: Building targets ... scons: `build' is up to date. scons: done building targets. $ scons install scons: Reading SConscript files ... ... scons: Building targets ... swig -o build/czar/masterLib_wrap.cc -Ibuild -I/usr/include/python2.6 -python -c++ -Iinclude build/czar/masterLib.i g++ -o build/czar/masterLib_wrap.os -c -g -fPIC -D_FILE_OFFSET_BITS=64 -fPIC -I/u2/salnikov/STACK/stack/Linux64/protobuf/2.4.1/include -I/u2/salnikov/STACK/stack/Linux64/xrootd/qs5/include/xrootd -I/u2/salnikov/STACK/stack/Linux64/mysql/5.1.65/include -Ibuild -I/usr/include/python2.6 build/czar/masterLib_wrap.cc g++ -o build/control/AsyncQueryManager.os -c -g -fPIC -D_FILE_OFFSET_BITS=64 -fPIC -I/u2/salnikov/STACK/stack/Linux64/protobuf/2.4.1/include -I/u2/salnikov/STACK/stack/Linux64/xrootd/qs5/include/xrootd -I/u2/salnikov/STACK/stack/Linux64/mysql/5.1.65/include -Ibuild -I/usr/include/python2.6 build/control/AsyncQueryManager.cc g++ -o build/control/dispatcher.os -c -g -fPIC -D_FILE_OFFSET_BITS=64 -fPIC -I/u2/salnikov/STACK/stack/Linux64/protobuf/2.4.1/include -I/u2/salnikov/STACK/stack/Linux64/xrootd/qs5/include/xrootd -I/u2/salnikov/STACK/stack/Linux64/mysql/5.1.65/include -Ibuild -I/usr/include/python2.6 build/control/dispatcher.cc g++ -o build/control/thread.os -c -g -fPIC -D_FILE_OFFSET_BITS=64 -fPIC -I/u2/salnikov/STACK/stack/Linux64/protobuf/2.4.1/include -I/u2/salnikov/STACK/stack/Linux64/xrootd/qs5/include/xrootd -I/u2/salnikov/STACK/stack/Linux64/mysql/5.1.65/include -Ibuild -I/usr/include/python2.6 build/control/thread.cc g++ -o build/merger/TableMerger.os -c -g -fPIC -D_FILE_OFFSET_BITS=64 -fPIC -I/u2/salnikov/STACK/stack/Linux64/protobuf/2.4.1/include -I/u2/salnikov/STACK/stack/Linux64/xrootd/qs5/include/xrootd -I/u2/salnikov/STACK/stack/Linux64/mysql/5.1.65/include -Ibuild -I/usr/include/python2.6 build/merger/TableMerger.cc scons: `install' is up to date. scons: done building targets. {code} This is kind of unexpected, or at least I can't understand now why it happens. Trying to run with --debug=explain shows that some dependencies have disappeared and in some dependencies order is different. No clue yet what that means and how it could happen. 
Need to study our scons scripts to understand what is going on.",3
+"DM-559","04/24/2014 12:44:30","clean up include <> --> """" for third party includes","According to our coding standard 4.15: https://dev.lsstcorp.org/trac/wiki/C%2B%2BStandard/Files#a4-15.OnlysystemincludefilepathsSHALLbedelimitedwith we should be using """" for boost, but in quite a few places we do not: {code} grep 'include SELECT objectId FROM Object WHERE qserv_areaspec_box(0, 0, 3, 10) ORDER BY objectId; +-----------------+ | objectId | +-----------------+ | 417857368235490 | | 417861663199589 | | 417865958163688 | | 420949744686724 | | 420954039650823 | | 424042121137958 | | 424046416102057 | | 427134497589192 | | 430226874040426 | +-----------------+ 9 rows in set (1.27 sec) mysql> SELECT objectId FROM Object WHERE qserv_areaspec_box(0, 0, 3, 10) ORDER BY objectId; +-----------------+ | objectId | +-----------------+ | 424042121137958 | | 424046416102057 | | 427134497589192 | | 430226874040426 | | 417857368235490 | | 417861663199589 | | 417865958163688 | | 420949744686724 | | 420954039650823 | +-----------------+ 9 rows in set (1.24 sec) {code} This was done with testdata/case01 data, let me know if you need to load that data. Also, using case03 data, e.g. {code} SELECT distinct run, field FROM Science_Ccd_Exposure WHERE run = 94 AND field = 535; {code} returns 6 rows in qserv (vs 1 in mysql). The full list of DISTINCT failures is (all with testdata/case03): - 0002_fetchRunAndFieldById.txt - 0021_selectScienceCCDExposure.txt - 0030_selectScienceCCDExposureByRunField.txt ",1
+"DM-630","05/02/2014 16:11:25","Non-partitioned table query returns duplicated rows","Running automated test I noticed that a query on non-partitioned table returns multiple copies of the same row, one copy per chunk. Here is example: {code} mysql> SELECT offset, mjdRef, drift FROM LeapSeconds where offset = 10; +--------+--------+-------+ | offset | mjdRef | drift | +--------+--------+-------+ | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | | 10 | 41317 | 0 | +--------+--------+-------+ 13 rows in set (5.62 sec) {code} Czar log file shows that it correctly finds that table is non-chunked but sends query to each chunk anyway: {code} 20140502 16:22:08.745081 0x3172430 INF *** KvInterfaceImplZoo::exist(), key: /DATABASES/LSST/TABLES/LeapSeconds/partitioning 20140502 16:22:08.745735 0x3172430 INF *** LSST.LeapSeconds is NOT chunked. 20140502 16:22:08.745762 0x3172430 INF *** KvInterfaceImplZoo::get2(), key: /DATABASES/LSST/TABLES/LeapSeconds/partitioning/subChunks 20140502 16:22:08.746393 0x3172430 INF *** LSST.LeapSeconds is NOT subchunked. 20140502 16:22:08.746409 0x3172430 INF getChunkLevel returns 0 ..... 20140502 16:22:08.757832 0x3172430 INF Using 85 stripes and 12 substripes. 20140502 16:22:08.775586 0x3172430 INF Using /usr/local/home/salnikov/dm-613/build/dist/etc/emptyChunks.txt as default empty chunks file. 20140502 16:22:08.791559 0x3172430 INF empty_LSST.txt not found while loading empty chunks file. 20140502 16:22:08.791592 0x3172430 ERR Couldn't find empty_LSST.txt, using /usr/local/home/salnikov/dm-613/build/dist/etc/emptyChunks.txt. 
20140502 16:22:08.891239 0x7fbb28003660 INF QuerySession::_buildChunkQueries() : Non-subchunked 20140502 16:22:08.891498 0x7fbb28003660 INF Msg cid=6630 with size=153 20140502 16:22:08.891682 0x7fbb28003660 INF Added query id=6630 url=xroot://qsmaster@127.0.0.1:1094//q/LSST/6630 with save /dev/shm/qserv-salnikov-b93a8b7cca1128f50fa5531feb93f8f24a185f162d36c10ee76b8dca/1_6630_0 20140502 16:22:08.891694 0x7fbb28003660 INF Opening xroot://qsmaster@127.0.0.1:1094//q/LSST/6630 20140502 16:22:08.891705 0x7fbb28003660 INF QuerySession::_buildChunkQueries() : Non-subchunked 20140502 16:22:08.891882 0x7fbb28003660 INF Msg cid=6631 with size=153 20140502 16:22:08.892077 0x7fbb28003660 INF Added query id=6631 url=xroot://qsmaster@127.0.0.1:1094//q/LSST/6631 with save /dev/shm/qserv-salnikov-b93a8b7cca1128f50fa5531feb93f8f24a185f162d36c10ee76b8dca/1_6631_0 20140502 16:22:08.892087 0x7fbb28003660 INF Opening xroot://qsmaster@127.0.0.1:1094//q/LSST/6631 20140502 16:22:08.892097 0x7fbb28003660 INF QuerySession::_buildChunkQueries() : Non-subchunked 20140502 16:22:08.892275 0x7fbb28003660 INF Msg cid=6800 with size=153 20140502 16:22:08.892462 0x7fbb28003660 INF Added query id=6800 url=xroot://qsmaster@127.0.0.1:1094//q/LSST/6800 with save /dev/shm/qserv-salnikov-b93a8b7cca1128f50fa5531feb93f8f24a185f162d36c10ee76b8dca/1_6800_0 ... {code} Looking at the code together with Daniel we found that at the Python level (czar/app.py) the code that dispatches query does not check for chunkLevel, this is likely why this happens. The code to look at is in {{InbandQueryAction._applyConstraints()}} method.",5
+"DM-633","05/03/2014 18:03:39","Query sessions are never destroyed","Please see DM-625, when I run say 10 ""select count(*) from LSST.Object"" queries, for each query a new AsyncQueryManager is created in dispatcher, but the sessions are never destroyed.",3
+"DM-637","05/06/2014 01:36:08","complexity of eups dependencies relationships for db package","Hello, I'm currently trying to use the very last version of db package (the one which relies on sconsUtils), but, in order to make it works with Qserv, I had to introduce next update : {code:bash} fjammes@clrlsstwn02-vm:~/src/qserv-packager/dist/dependencies/db (master) $ git diff HEAD~1 diff --git a/ups/db.cfg b/ups/db.cfg index e1ae31b..a469061 100644 --- a/ups/db.cfg +++ b/ups/db.cfg @@ -3,7 +3,7 @@ import lsst.sconsUtils dependencies = { - ""required"": [""mysqlclient"", ], + ""required"": [""mysql"", ], } config = lsst.sconsUtils.Configuration( diff --git a/ups/db.table b/ups/db.table index 8c8d831..9e770a3 100644 --- a/ups/db.table +++ b/ups/db.table @@ -1,5 +1,5 @@ setupRequired(python) -setupRequired(mysqlclient) +setupRequired(mysql) setupRequired(mysqlpython) setupRequired(sconsUtils) {code} Is there a solution to describe in eups that mysqlclient is included in mysql ? Thanks, Fabrice",1
+"DM-646","05/08/2014 14:21:01","Implement DISTINCT aggregate in qserv","It looks like DISTINCT aggregate is not supported yet in qserv. Daniel told me that this should be relatively straightforward to add. Adding this ticket so that we do not forget it.",2
+"DM-648","05/08/2014 15:38:36","Add support for running unit tests in scons","Add code in scons that runs unit tests for Qserv.",5
+"DM-661","05/09/2014 23:13:12","Parser has inverted order for ""limit"" and ""order by""","{code} SELECT run FROM LSST.Science_Ccd_Exposure order by field limit 2 {code} Works in MySQL, fails in Qserv (ERROR 4120 (Proxy): Error executing query using qserv.) 
{code} SELECT run FROM LSST.Science_Ccd_Exposure limit 2 order by field {code} Works in Qserv, fails in MySQL (limit should be after order by) ",0.5
+"DM-664","05/12/2014 08:55:44","""out of range value"" message when running qserv-testdata (loader.py)","Fabrice I am getting ""out of range value"" when I run the qserv-testdata: Are you seeing that too? 2014-05-09 18:11:55,975 {/usr/local/home/becla/qserv/1/qserv/build/dist/lib/python/lsst/qserv/admin/commons.py:134} INFO stderr : /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'calib_detected' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'calib_psf_candidate' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'calib_psf_used' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_negative' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_badcentroid' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'centroid_sdss_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_edge' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_interpolated_any' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_interpolated_center' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_saturated_any' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_saturated_center' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_cr_any' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_cr_center' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'centroid_gaussian_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'centroid_naive_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'shape_sdss_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'shape_sdss_centroid_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'shape_sdss_flags_unweightedbad' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'shape_sdss_flags_unweighted' 
at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'shape_sdss_flags_shift' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'shape_sdss_flags_maxiter' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_psf_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_psf_flags_psffactor' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_psf_flags_badcorr' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_naive_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_gaussian_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_gaussian_flags_psffactor' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_gaussian_flags_badcorr' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flux_sinc_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_psf_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_psf_flags_maxiter' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_psf_flags_tinystep' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_psf_flags_constraint_r' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_psf_flags_constraint_q' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flux_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_psffactor' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_badcorr' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_maxiter' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_tinystep' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_constraint_r' at row 1 
self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_constraint_q' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_dev_flags_largearea' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flux_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_psffactor' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_badcorr' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_maxiter' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_tinystep' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_constraint_r' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_constraint_q' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_exp_flags_largearea' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_combo_flux_flags' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_combo_flags_psffactor' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'multishapelet_combo_flags_badcorr' at row 1 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'calib_detected' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'calib_psf_candidate' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'calib_psf_used' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_negative' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_badcentroid' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'centroid_sdss_flags' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_edge' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_interpolated_any' at row 2 
self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_interpolated_center' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_saturated_any' at row 2 self.cursor.execute(stmt) /usr/local/home/becla/qserv/1/qserv/build/dist/bin/loader.py:99: Warning: Out of range value for column 'flags_pixel_saturated_center' at row 2 self.cursor.execute(stmt) ",2
+"DM-666","05/12/2014 09:59:30","partition package has to detect eups-related boost","partition package doesn't detect eups-related boost. This has to be fixed by using sconsUtils, or hand-made procedure. {code:bash} [fjammes@lsst-dev lsstsw]$ export LD_LIBRARY_PATH=""$LSSTSW/anaconda/lib:$LD_LIBRARY_PATH"" [fjammes@lsst-dev lsstsw]$ setup boost 1.55.0.1+1 [fjammes@lsst-dev lsstsw]$ rebuild partition partition: ok (0.5 sec). boost: ok (0.3 sec). python: ok (0.3 sec). scons: ok (0.4 sec). # BUILD ID: b49 python: master-gcbf93ab65b (already installed). scons: 2.1.0+8 (already installed). boost: 1.55.0.1+1 (already installed). partition: master-gf2ef2cf2dc ERROR (1 sec). *** error building product partition. *** exit code = 1 *** log is in /lsst/home/fjammes/src/lsstsw/build/partition/_build.log *** last few lines: ::::: scons: Reading SConscript files ... ::::: Checking for C++ library boost_system-mt... no ::::: Checking for C++ library boost_system... no ::::: Checking for C++ library boost_thread-mt... no ::::: Checking for C++ library boost_thread... no ::::: Checking for C++ library boost_filesystem-mt... no ::::: Checking for C++ library boost_filesystem... no ::::: Checking for C++ library boost_program_options-mt... no ::::: Checking for C++ library boost_program_options... no ::::: Missing required boost library! # BUILD b49 completed. {code}",3
+"DM-674","05/13/2014 18:28:02","fix handling of nested control objects","Work on the HSC side has revealed some problems with nested control objects being wrapped into config objects. This is a pull request for those changes (along with writing a unit test for some of them). Some (but not all of these changes) are part of Trac ticket #3163 (https://dev.lsstcorp.org/trac/ticket/3163), which I'll now close as a duplicate.",2
+"DM-676","05/14/2014 08:40:56","Implement HTCondor dynamic classad solution for Slot based values","The HTCondor team will be updating their HOWTO for managing Slot based classads/dynamic classads set by a cron startd process. We currently have a technique for dynamic slot based values that is iinefficient from a negotiation perspective, and we will want to update to a more optimal approach that the HTCondor team plans to provide.",2
+"DM-689","05/16/2014 09:13:25","During scons configure : check if mysql isn't runing","Mysqld can't be configured is its running before configuration step.",1
+"DM-703","05/19/2014 15:17:24","Use of HipChat for Buildbot CI failure notifications should be explored","K-T recommended the use of HipChat rather than email when notifying users of a buildbot build failure. The purpose was twofold: get immediate attention from the developers and help change the culture towards using HipChat more. This Issue is to explore the feasibility of using HipChat for the notifications.",1
+"DM-704","05/19/2014 16:23:16","Better review notification e-mails","Russell writes: {quote} I think our system for getting code reviewed using JIRA needs some improvements. 
It seems that people don't always know that they have been assigned to review a ticket. Also, even if I know I have been assigned to review a ticket, I find it hard to find on JIRA. More concretely, I would like to see these improvements: - Much clearer notification that one has been assigned as a reviewer. Presently the email is quite generic and easy to miss. In fact I find that most JIRA notifications are rather hard to read -- it's not always easy to see what has changed and thus why I should care. The signal to noise ratio is poor. - By default a user should see which issues they have been assigned as reviewer when they log into JIRA. (If there is a way to reconfigure the dashboard for this, I'd like to know about it, but it really should be the default). One way to fix this, of course, is to reassig the ticket when putting it into review, but we have good reasons to avoid that. -- Russell {quote} and I added: {quote} In fact, you don't know that the ticket has passed into review unless you scroll all the way to the bottom of the comment. If the comment associated with the change in status is long and you don't scroll all the way down, then you may not know that you were assigned to review. With Trac, the important information was at the top of the e-mail. {quote}",2
+"DM-706","05/19/2014 21:56:26","cleanup extra file names in docstring","Reported by Serge in email: When using doxygen to document C++ source, you can mark a comment block with just: {code} /** @file * Blah blah */ {code} in which case doxygen assumes you want the comment block tied to the file it appears in. We seem to have lots of ""@file ” statements all over the place, which is an extra thing we have to remember to change when renaming files. Is there some reason to do it that way that I’m missing?",1
+"DM-707","05/19/2014 21:59:41","cleanup exception code in CSS","Reported by Serge: In CssException.h you’ve got: {code} class CssRunTimeException: public std::runtime_error { … }; class CssException_XXXX : public CssRunTimeException { … }; {code} This is inconsistent (shouldn’t it be CssRunTimeException_XXX, or maybe even CssRunTimeError?), lengthy, violates the LSST C++ naming conventions, and doesn’t match the KvInterface docs, which all still talk about a CssException class that does not exist. Can we consider changing this to something more like: {code} class CssError : public std::runtime_error class KeyError : public CssError class NoSuchTable : public KeyError class NoSuchDb : public KeyError class AuthError : public CssError class ConnError : public CssError {code} ? Then we can succinctly throw and catch css::NoSuchTable, css::AuthError etc…",2
+"DM-710","05/20/2014 09:59:00","Reduce and comment client configuration file","Client configuration file '~/.lsst/qserv.conf) is used by integration test procedure. Next improvments are required : 1. use templates in it 2. client config file should retrieve templated values from mother config file""",1
+"DM-720","05/21/2014 12:30:01","Upgrade various external packages","I note that the LSST and HSC versions of external packages are slightly out of sync. I propose uprevving the LSST packages to match as HSC has tested these versions. 
{quote} cfitsio 3.360 HSC cfitsio 3310+2 sims Winter2014 current b4 b5 b6 b3 doxygen 1.8.2+2 sims b4 Winter2014 current b5 b6 b3 doxygen 1.8.5 HSC eigen 3.1.1+2 Winter2014 current b5 b6 b3 b4 eigen 3.2 HSC fftw 3.3.2+2 Winter2014 current b5 b6 b3 b4 fftw 3.3.3 HSC gsl 1.15+2 Winter2014 current b5 b6 b3 b4 gsl 1.16 HSC minuit2 5.22.00+2 Winter2014 current b5 b6 b3 b4 minuit2 5.28.00 HSC mysqlclient 5.1.65+3 Winter2014 current b5 b6 b3 b4 mysqlclient 5.1.73 HSC pyfits 3.1.2+2 sims b4 Winter2014 current b5 b6 b3 pyfits 3.2 HSC scons 2.1.0+7 sims b4 Winter2014 current b5 b6 b3 scons 2.3.0 HSC sqlite 3.7.14+2 Winter2014 current b5 b6 b3 b4 sqlite 3.8.2 HSC wcslib 4.14 wcslib 4.14+3 b4 Winter2014 current b5 b6 b3 xpa 2.1.14+2 Winter2014 current b5 b6 b3 b4 xpa 2.1.15 HSC {quote} ",20
+"DM-737","05/21/2014 16:49:28","Rendering an IR node tree should produce properly parenthesized output","It appears that rendering a tree of IR nodes doesn't always result in correct generation of parentheses. Consider the following tree: {panel} * OrTerm ** BoolFactor *** NullPredicate **** ValueExpr ***** ValueFactor: ColumnRef(""refObjectId"") ** BoolFactor *** CompPredicate **** ValueExpr ***** ValueFactor: ColumnRef(""flags"") **** Token(""<>"") **** ValueExpr ***** ValueFactor: Const(""2"") {panel} which corresponds to the SQL for: ""refObjectId IS NULL OR flags<>2"". If one prepends this (via {{WhereClause.prependAndTerm()}}) to the {{WhereClause}} obtained by parsing ""... WHERE foo!=bar AND baz<3.14159;"" and renders the result using {{QueryTemplate}}, one obtains: {{... WHERE refObjectId IS NULL OR flags<>2 AND foo!=bar AND baz<3.14159}} This is equivalent to {{... WHERE refObjectId IS NULL OR (flags<>2 AND foo!=bar AND baz<3.14159)}} which doesn't match the parse tree - one should obtain: {{... WHERE (refObjectId IS NULL OR flags<>2) AND foo!=bar AND baz<3.14159}} This issue involves surveying all IR node classes and making sure that they render parentheses properly. {color:gray}(One way we might test for this is to parse queries containing parenthesized expressions where removal of the parentheses changes the meaning of the query. This would give us some IR that we can render to a string and reparse back into IR. If the rendering logic is correct, one should obtain identical IR trees).{color} Other possibilities that might explain the behavior above is that the input tree is somehow invalid or that {{WhereClause.prependAndTerm}} creates invalid IR.",8
+"DM-742","05/23/2014 07:10:18","Use geom eups package for installing geometry","Use geom eups package instead of downloading geometry.py during Qserv configuration step.",3
+"DM-751","05/26/2014 10:02:12","Replacing boost system lib with eups libs breaks scons build","While detecting boost, Qserv build system checks for both system lib and then eups lib. 
This procedure uses the following code: {code:python}
class BoostChecker:
    def __init__(self, env):
        self.env = env
        self.suffix = None
        self.suffixes = [""-gcc41-mt"", ""-gcc34-mt"", ""-mt"", """"]
        self.cache = {}
        pass

    def getLibName(self, libName):
        if libName in self.cache:
            return self.cache[libName]
        r = self._getLibName(libName)
        self.cache[libName] = r
        return r

    def _getLibName(self, libName):
        state.log.debug(""BoostChecker._getLibName() LIBPATH : %s, CPPPATH : %s""
                        % (self.env[""LIBPATH""], self.env[""CPPPATH""]))
        if self.suffix == None:
            conf = self.env.Configure()
            def checkSuffix(sfx):
                return conf.CheckLib(libName + sfx, language=""C++"", autoadd=0)
{code} and this last line runs the following gcc commands: {code:bash}
g++ -o .sconf_temp/conftest_10.o -c -g -pedantic -Wall -Wno-long-long -D_FILE_OFFSET_BITS=64 -fPIC -I/data/fjammes/stack/Linux64/protobuf/master-g832d498170/include -I/data/fjammes/stack/Linux64/boost/1.55.0.1/include -I/data/fjammes/stack/Linux64/zookeeper/master-gc48457902f/c-binding/include -I/data/fjammes/stack/Linux64/mysql/master-g5d79af2a50/include -I/data/fjammes/stack/Linux64/antlr/master-gc05368a54f/include -I/data/fjammes/stack/Linux64/xrootd/master-gfc9bfb2059/include/xrootd -Ibuild -I/data/fjammes/stack/Linux64/anaconda/1.8.0/include/python2.7 .sconf_temp/conftest_10.cpp
g++ -o .sconf_temp/conftest_10 .sconf_temp/conftest_10.o -L/data/fjammes/stack/Linux64/xrootd/master-gfc9bfb2059/lib -L/data/fjammes/stack/Linux64/protobuf/master-g832d498170/lib -L/data/fjammes/stack/Linux64/antlr/master-gc05368a54f/lib -L/data/fjammes/stack/Linux64/zookeeper/master-gc48457902f/c-binding/lib -L/data/fjammes/stack/Linux64/mysql/master-g5d79af2a50/lib -L/data/fjammes/stack/Linux64/boost/1.55.0.1/lib -lboost_regex-mt
scons: Configure: yes
{code} As the ""-mt"" suffix is searched before the empty suffix, the previous command succeeds. In my example boost_regex-mt is a system lib. When launching ""scons build"", CheckLib then only looks for boost in /data/fjammes/stack/Linux64/boost/1.55.0.1/lib, not in /usr/lib/. This behaviour is eups-correct, but prevents finding boost_regex-mt. In this example, a trivial solution is to reverse self.suffixes in the python code, but a better solution would be to prevent g++ from using default search paths (e.g. /usr/lib and /usr/include) in the second command. Is it possible to do it with scons? Mario, did you meet the same problem with sconsUtils? Thanks Fabrice",3 +"DM-764","05/27/2014 16:49:37","Exception naming convention","The naming convention for exceptions in pex_exceptions is quite redundant. This issue will make the convention more compact and update all packages that make use of pex_exceptions.",5 +"DM-766","05/27/2014 17:00:29","Improve afw::CameraGeom::utils code","Some of the utility code in CameraGeom was not completely ported in W13 and the documentation is in need of updating.",3 +"DM-767","05/27/2014 17:13:18","Determine scope of XY0 convention update","It's unclear exactly how much effort will be involved in making a change to how the XY0 is used. If the parent/child argument is removed completely, this change could be quite invasive and wide-reaching.",2 +"DM-772","05/28/2014 10:18:07","Package log4cxx","Fabrice, can you package log4cxx? I should have asked you earlier, sorry I waited so long, now it is becoming urgent! Bill is almost done with his logging prototype and will be turning it into a real package, and we need to have log4cxx packaged. Many thanks. log4cxx version 0.10.0, which was released on 4/3/2008 but is still undergoing ""incubation"" at Apache.
",2 +"DM-775","05/28/2014 11:03:44","XLDB-2015 report","Writing the report, most work done by Daniel, with input from Jacek and K-T.",8 +"DM-778","05/28/2014 22:22:14","Restructure and package logging prototype","Restructure and package log4cxx-based prototype (currently in branch u/bchick/protolog). It should go into package called ""log""",8 +"DM-780","05/28/2014 22:37:50","Access patterns for data store that supports data distribution ","Data distribution related data store includes things like. chunk --> node mapping, locations of chunk replicas, runtime information about nodes (and maybe also node configuration?). Need to understand access patterns - who needs to access, how frequently etc. ",5 +"DM-781","05/28/2014 22:41:33","research mysql cluster ndb","Checkout mysql cluster ndb from the perspective of data distribution - could it be potentially useful to store data related to data distribution?",2 +"DM-783","05/29/2014 10:40:20","Disable failing test cases in automated tests","There are currently 4 test cases failing in out automated tests. Until we have a fix we want to disable them.",1 +"DM-786","05/29/2014 12:48:39","JOIN queries are broken","Running a simple query that does a join: {code} SELECT s.ra, s.decl, o.raRange, o.declRange FROM Object o JOIN Source s USING (objectId) WHERE o.objectId = 390034570102582 AND o.latestObsTime = s.taiMidPoint; {code} results in czar crashing with: {code} 2terminate called after throwing an instance of 'std::logic_error' what(): Attempted subchunk spec list without subchunks. {code} This query has been taken from integration tests (case01, 0003_selectMetadataForOneGalaxy.sql) ",3 +"DM-794","05/29/2014 19:02:42","SQL injection in czar/proxy.py","Running automated tests for some queries I observe python exceptions in czar log which look like this: {code} 20140529 19:47:19.364371 0x7faacc003550 INF Query dispatch (7) toUnhandled exception in thread started by Traceback (most recent call last): File ""/usr/local/home/salnikov/qserv-master/build/dist/lib/python/lsst/qserv/czar/proxy.py"", line 78, in waitAndUnlock lock.unlock() File ""/usr/local/home/salnikov/qserv-master/build/dist/lib/python/lsst/qserv/czar/proxy.py"", line 65, in unlock self._saveQueryMessages() File ""/usr/local/home/salnikov/qserv-master/build/dist/lib/python/lsst/qserv/czar/proxy.py"", line 87, in _saveQueryMessages self.db.applySql(Lock.writeTmpl % (self._tableName, chunkId, code, msg, timestamp)) File ""/usr/local/home/salnikov/qserv-master/build/dist/lib/python/lsst/qserv/czar/db.py"", line 95, in applySql c.execute(sql) File ""/u2/salnikov/STACK/Linux64/mysqlpython/1.2.3+8/lib/python/MySQL_python-1.2.3-py2.7-linux-x86_64.egg/MySQLdb/cursors.py"", line 174, in execute self.errorhandler(self, exc, value) File ""/u2/salnikov/STACK/Linux64/mysqlpython/1.2.3+8/lib/python/MySQL_python-1.2.3-py2.7-linux-x86_64.egg/MySQLdb/connections.py"", line 36, in defaulterrorhandler raise errorclass, errorvalue _mysql_exceptions.ProgrammingError: (1064, ""You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'r' AND sce.tract=0 AND sce.patch='159,3';', 1401410839.000000)' at line 1"") ok 0.000532 seconds {code} I believe this is due to how query string is being constructed in czar/proxy.py: {code:py} class Lock: writeTmpl = ""INSERT INTO %s VALUES (%d, %d, '%s', %f);"" # ................... 
    self.db.applySql(Lock.writeTmpl % (self._tableName, chunkId, code, msg, timestamp))
{code} If {{msg}} happens to contain quotes, then the resulting query is broken. One should not use Python formatting to construct query strings; instead, the parameters should be passed directly to the {{cursor.execute()}} method. ",2 +"DM-800","05/31/2014 00:53:54","Zookeeper times out","I noticed that running some queries, leaving the system up, and then returning a few hours later and running more queries can result in: {code}
ZOO_ERROR@handle_socket_error_msg@1723: Socket [127.0.0.1:12181] zk retcode=-4, errno=112(Host is down): failed while receiving a server response
{code} It needs to be investigated (if we can reproduce). ",3 +"DM-814","06/03/2014 08:29:59","Cleanup in core/examples and core/doc","- core/examples and core/doc seem to be out of date. Some cleanup here would be welcome.",1 +"DM-817","06/03/2014 17:35:51","qserv have to use boost from stack","To quote Jacek and KT: {code}
Andy, re dm-751, KT says never use the system version. J.
{code} So we need to switch qserv to eups-boost. This should be easy once DM-751 is done: just add boost to qserv.table. Then one can remove the conditional part of {{BoostChecker}} which works with system-installed boost. ",1 +"DM-827","06/09/2014 10:05:17","Reimplement C++/Python Exception Translation","I'd like to reimplement our Swig bindings for C++ exceptions to replace the ""LsstCppException"" class with a more user-friendly mechanism. We'd have a Python exception hierarchy that mirrors the C++ hierarchy (generated automatically with the help of a few Swig macros). These wrapped exceptions could be thrown in Python as if they were pure-Python exceptions, and could be caught in Python in the same way regardless of the language in which they were thrown. We're doing this as part of a ""Measurement"" sprint because we'd like to define custom exceptions for different kinds of common measurement errors, and we want to be able to raise those exceptions in either language.",8 +"DM-829","06/09/2014 10:19:41","Algorithm API without (or with optional) Result objects","In this design prototype, I'll see how much simpler things could be made by making the main algorithm interface one that sets record values directly, instead of going through an intermediate Result object. Ideally the Result objects would still be an option, but they may not be standardized or reusable.",3 +"DM-832","06/09/2014 12:07:51","add persistable class for aperture corrections","We need to create a persistable, map-like container class to hold aperture corrections, with each element of the container being an instance of the class to be added in DM-740. A prototype has been developed on DM-797 on the HSC side: https://hsc-jira.astro.princeton.edu/jira/browse/HSC-797 and the corresponding code can be found on these changesets: https://github.com/HyperSuprime-Cam/afw/compare/32d7a8e7b75da6f5327fee65515ee59a5b09f6c7...tickets/DM-797",2 +"DM-833","06/09/2014 12:09:16","implement coaddition for aperture corrections","We need to be able to coadd aperture corrections in much the same way we coadd PSFs.
See the HSC-side HSC-798 and HSC-897 implementations for a prototype: https://hsc-jira.astro.princeton.edu/jira/browse/HSC-798 https://hsc-jira.astro.princeton.edu/jira/browse/HSC-897 with code here: https://github.com/HyperSuprime-Cam/meas_algorithms/compare/d2782da175c...u/jbosch/DM-798 https://github.com/HyperSuprime-Cam/meas_algorithms/compare/c4fcab3251...u/price/HSC-897a https://github.com/HyperSuprime-Cam/pipe_tasks/compare/6eb48e90be12d...u/price/HSC-897a",3 +"DM-837","06/09/2014 13:00:49","Rewrite multiple-aperture photometry class","We've never figured out how to handle wrapping multiple-aperture photometry algorithms. They can't use the existing Result objects - at least not out of the box. We should try to write a new multiple-aperture photometry algorithm from the ground up, using the old ones on the HSC branch as a guide, but not trying to transfer the old code over. The new one should:
- Have the option of using elliptical apertures (as defined by the shape slot) or circular apertures.
- Have a transition radius at which we switch from the sinc photometry algorithm to the naive algorithm (for performance reasons).",2 +"DM-840","06/09/2014 16:27:00","Change code so ImageOrigin must be specified (temporary)","Image-like classes have a getBBox method and various constructors that use an ImageOrigin argument which in most or all cases defaults to LOCAL. As the first stage in cleaning this up, try to break code that uses the default as follows:
* Remove the default from getBBox(ImageOrigin) so an origin must be specified.
* Change the default origin of constructors to a temporary new value UNDEFINED
* Modify code that uses image origin to fail if origin is needed (it is ignored if bbox is empty) and is UNDEFINED.
Note: this is less safe than changing constructors to not have a default value for origin, because the error will be caught at runtime rather than compile time. However, that is messy because then the bounding box will also have to be always specified, and possibly an HDU, so it would be a much more intrusive change.",2 +"DM-841","06/09/2014 16:29:21","Change data butler I/O of image-like objects to require imageOrigin if bbox specified (temporary)","As part of making PARENT the default for image origin, change the data butler to require that imageOrigin be specified if bbox is specified when reading or writing image-like objects. Note: this ticket turns out to be unnecessary, as the few necessary changes are done as part of DM-840.",2 +"DM-843","06/09/2014 16:43:52","Restore names of methods that return pixel iterators and locators","Restore the names of methods that return pixel iterators and pixel locators on image-like classes. (This is part of the final stage of eliminating LOCAL pixel indexing).",2 +"DM-845","06/09/2014 16:46:45","Eliminate image origin argument from butler for (un)persisting image-like objects","Eliminate the image origin argument for butler get and put when dealing with image-like objects.",2 +"DM-854","06/10/2014 00:47:31","duplicate column name when running near neighbor query","Running a simplified version of the near neighbor query on test data from case01: {code}
SELECT DISTINCT o1.objectId, o2.objectId
FROM Object o1, Object o2
WHERE scisql_angSep(o1.ra_PS, o1.decl_PS, o2.ra_PS, o2.decl_PS) < 1
  AND o1.objectId <> o2.objectId
{code} results in an error on the worker: {code}
Foreman:Broken!
,q_38f9QueryExec---Duplicate column name 'objectId' Unable to execute query: CREATE TABLE r_13237cd4cfc9e0fa01497bcf\
67a91add2_6630_0 SELECT o1.objectId,o2.objectId FROM Subchunks_LSST_6630.Object_6630_0 AS o1,Subchunks_LSST_6630.Object_6630_0 AS o2\
WHERE scisql_angSep(o1.ra_PS,o1.decl_PS,o2.ra_PS,o2.decl_PS)<1 AND o1.objectId<>o2.objectId;
{code} It is fairly obvious what is going on. ""SELECT t1.x, t2.x"" is perfectly valid, but if we add ""INSERT INTO SELECT t1.x, t2.x"", we need to add names, e.g. something like ""INSERT INTO SELECT t1.x as x1, t2.x as x2""",8 +"DM-863","06/11/2014 17:51:37","near neighbor does not return results","A query from qserv_testdata (case01/queries/1051_nn.sql) runs through Qserv, but it returns no results, while the same query run on mysql does return results. The exact query for qserv is: {code}
SELECT o1.objectId AS objId
FROM Object o1, Object o2
WHERE qserv_areaspec_box(0, 0, 0.2, 1)
  AND scisql_angSep(o1.ra_PS, o1.decl_PS, o2.ra_PS, o2.decl_PS) < 1
  AND o1.objectId <> o2.objectId;
{code}",1 +"DM-869","06/12/2014 14:20:20","disable extraneous warnings from boost (gcc 4.8)","Compiling qserv on ubuntu 14.04 (comes with gcc 4.8.2) results in a huge number of warnings coming from boost. We should use the flag ""-Wno-unused-local-typedefs"".",0.5 +"DM-873","06/13/2014 19:18:55","XLDB - strategic positioning","Discussions with strategic partners. Improving the website and adding new content (community, speakers). Deliverable: a 1-pager document.",3 +"DM-874","06/16/2014 07:46:13","W'14 newinstall.sh picks up wrong python?","newinstall.sh fails with: {code}
Installing the basic environment ...
Traceback (most recent call last):
  File ""/tmp/test_lsst/eups/bin/eups_impl.py"", line 11, in ?
    import eups.cmd
  File ""/tmp/test_lsst/eups/python/eups/__init__.py"", line 5, in ?
    from cmd import commandCallbacks
  File ""/tmp/test_lsst/eups/python/eups/cmd.py"", line 38, in ?
    import distrib
  File ""/tmp/test_lsst/eups/python/eups/distrib/__init__.py"", line 30, in ?
    from Repositories import Repositories
  File ""/tmp/test_lsst/eups/python/eups/distrib/Repositories.py"", line 8, in ?
    import server
  File ""/tmp/test_lsst/eups/python/eups/distrib/server.py"", line 1498
    mapping = self._noReinstall if outVersion and outVersion.lower() == ""noreinstall"" else self._mapping
    ^
SyntaxError: invalid syntax
{code} Perhaps from running the wrong version of python. Full script/log is attached. ",1 +"DM-875","06/16/2014 10:11:49","lsst_dm_stack_demo","lsst_dm_stack_demo has obsolete benchmark files (circa Release 7.0) which fail to serve the purpose of validating, for the user, the correct functioning of a freshly built Release v8.0 stack. At the very least, the benchmark files should be regenerated for each official Release. Tasks: (1) Build the benchmark files for Release v8.0. (2) Debate (a) recommending the use of 'numdiff' to check if the output is within realistic bounds, or (b) developing another procedure to better show how the current algorithms compare to the algorithms used at the benchmarked Release. (3) Depending on the result of the debate on #2: for (a), provide the appropriate 'numdiff' command invocation in the manual; for (b), implement the new procedure.",40 +"DM-903","06/25/2014 19:14:47","SourceDetectionTask should only add flags.negative if config.thresholdParity == ""both""","The SourceDetectionTask always adds ""flags.negative"" to the schema (if provided) but it is only used if config.thresholdParity == ""both"".
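A minimal sketch of the requested guard (hypothetical; the real constructor takes more arguments): {code}
# hypothetical sketch: only touch the schema when the flag will actually be set
if schema is not None and self.config.thresholdParity == ""both"":
    self.negativeFlagKey = schema.addField(""flags.negative"", type=""Flag"",
                                           doc=""set if source was detected as significantly negative"")
else:
    self.negativeFlagKey = None
{code}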
As adding a field to a schema requires that the table passed to the run method have that field, this is a significant nuisance when reusing the task. Please change the code to only modify the schema if it's going to set it. ",1 +"DM-911","06/27/2014 20:20:25","Provide Task documentation for DipoleMeasurementTask","See Summary. ",2 +"DM-913","06/27/2014 20:26:35","Provide Task documentation for ImagePsfMatchTask","See summary",2 +"DM-914","06/27/2014 20:27:19","Provide Task documentation for SnapPsfMatchTask","See summary",2 +"DM-933","06/30/2014 12:52:38","Photometric calibration uses a column ""flux"" not the specified filter unless a colour term is active","The photometric calibration code uses a field ""flux"" in the reference catalog to impose a magnitude limit. If a colour term is specified, it uses the primary and secondary filters to calculate the reference magnitude, but if there is no colour term it uses the column labelled ""flux"" and ignores the filterName. Please change the code so that ""flux"" is ignored, and the flux associated with filterName is used.",1 +"DM-951","07/08/2014 13:28:14","Add Doxygen documentation on rebuilds","Master-branch doxygen documentation should be rebuilt on every full master build.",20 +"DM-957","07/11/2014 11:00:01","Use aliases to clean up table version transition","The addition of schema aliases on DM-417 should allow us to clean up some of the transitional code added on DM-545, as we can now alias new versions of fields to the old ones and vice versa.",2 +"DM-964","07/15/2014 07:21:49","Include aliases in Schema introspection","Schema stringification and iteration should include aliases somehow. Likewise the extract() Python methods.",1 +"DM-966","07/15/2014 12:25:35","fix int/long conversion on 32-bit systems and selected 64-bit systems","tests/wrap.py fails in pex_config on 32-bit systems and some 64-bit systems (including Ubuntu 14.04) with the following: {code:no-linenum}
tests/wrap.py ...EE.E.
======================================================================
ERROR: testDefaults (__main__.NestedWrapTest)
Test that C++ Control object defaults are correctly used as defaults for Config objects.
----------------------------------------------------------------------
Traceback (most recent call last):
  File ""tests/wrap.py"", line 89, in testDefaults
    self.assert_(testLib.checkNestedControl(control, config.a.p, config.a.q, config.b))
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/tests/testLib.py"", line 987, in checkNestedControl
    return _testLib.checkNestedControl(*args)
TypeError: in method 'checkNestedControl', argument 2 of type 'double'
======================================================================
ERROR: testInt64 (__main__.NestedWrapTest)
Test that we can wrap C++ Control objects with int64 members.
----------------------------------------------------------------------
Traceback (most recent call last):
  File ""tests/wrap.py"", line 95, in testInt64
    self.assert_(testLib.checkNestedControl(control, config.a.p, config.a.q, config.b))
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/tests/testLib.py"", line 987, in checkNestedControl
    return _testLib.checkNestedControl(*args)
TypeError: in method 'checkNestedControl', argument 2 of type 'double'
======================================================================
ERROR: testReadControl (__main__.NestedWrapTest)
Test reading the values from a C++ Control object into a Config object.
----------------------------------------------------------------------
Traceback (most recent call last):
  File ""tests/wrap.py"", line 82, in testReadControl
    config.readControl(control)
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/python/lsst/pex/config/wrap.py"", line 212, in readControl
    __at=__at, __label=__label, __reset=__reset)
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/python/lsst/pex/config/wrap.py"", line 217, in readControl
    self.update(__at=__at, __label=__label, **values)
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/python/lsst/pex/config/config.py"", line 515, in update
    field.__set__(self, value, at=at, label=label)
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/python/lsst/pex/config/config.py"", line 310, in __set__
    raise FieldValidationError(self, instance, e.message)
FieldValidationError: Field 'a.q' failed validation: Value 4 is of incorrect type long. Expected type int
For more information read the Field definition at:
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/python/lsst/pex/config/wrap.py"", line 184, in makeConfigClass
    fields[k] = FieldCls(doc=doc, dtype=dtype, optional=True)
And the Config definition at:
  File ""/home/boutigny/CFHT/stack_5/build/pex_config/python/lsst/pex/config/wrap.py"", line 131, in makeConfigClass
    cls = type(name, (base,), {""__doc__"":doc})
----------------------------------------------------------------------
Ran 8 tests in 0.017s
FAILED (errors=3)
{code} There is a partial fix on u/jbosch/intwrappers; this seems to work for Ubuntu 14.04, but not on 32-bit systems.",2 +"DM-967","07/16/2014 10:56:37","qserv-configure.py is broken in master","It looks like there was a bug introduced either during the merge of DM-622 with master or right before that. Running {{qserv-configure.py}} from master fails now: {code}
$ qserv-configure.py
  File ""/usr/local/home/salnikov/qserv-master/build/dist/bin/qserv-configure.py"", line 229
    (""Do you want to update user configuration file (currently pointing
    ^
SyntaxError: EOL while scanning string literal
{code} I assign this to myself; Fabrice is on vacation now and we need to fix this quickly.",1 +"DM-976","07/18/2014 09:51:02","Detailed documentation for meas_base tasks","We should follow RHL's example for detailed task documentation and document all meas_base tasks.",2 +"DM-977","07/18/2014 09:52:29","Documentation audit and cleanup for meas_base plugins","Many meas_base Plugins and Algorithms have poor documentation, including several whose documentation is a copy/paste relic from some other algorithm. These need to be fixed.",2 +"DM-978","07/18/2014 09:56:09","add base class for measurement tasks","We should consider adding a base class for measurement tasks (SingleFrameMeasurementTask, ForcedMeasuremedTask) that includes the callMeasure methods. I'm hoping this will help clean up callMeasure and improve code reuse.",1 +"DM-980","07/18/2014 14:40:06","convert measurement algorithms in ip_diffim","ip_diffim includes a few measurement algorithms which need to be converted to the new framework.",5 +"DM-981","07/18/2014 14:49:33","convert measurement algorithms in meas_extensions_shapeHSM","This is a low-priority ticket to replace the old-style plugins in meas_extensions_shapeHSM with new ones compatible with meas_base.
As this isn't a part of the main-line stack, we should delay it until the rest of the meas_base conversion is nearly (or perhaps fully) complete.",3 +"DM-982","07/18/2014 14:50:28","convert meas_extensions_photometryKron to new measurement framework","This is a low-priority ticket to replace the old-style plugins in meas_extensions_photometryKron with new ones compatible with meas_base. As this isn't a part of the main-line stack, we should delay it until the rest of the meas_base conversion is nearly (or perhaps fully) complete.",3 +"DM-984","07/21/2014 14:48:27","allow partial measurement results to be set when error flag is set","We need to be able to return values at the same time that an error flag is set. The easiest way to do this is to have Algorithms take a Result object as an output argument rather than return it. We'll revisit this design later. ",2 +"DM-989","07/24/2014 12:05:05",".my.cnf in user HOME directory breaks setup script","The presence of a {{.my.cnf}} file in the user HOME directory crashes the {{qserv-configure.py}} script if parameters in {{.my.cnf}} conflict with parameters in {{qserv.conf}}. How to reproduce: * create a .my.cnf file in the home directory: {code}
[client]
user = anything
# host/port and/or socket
host = 127.0.0.1
port = 3306
socket = /tmp/mysql.sock
{code} * try to run {{qserv-configure}}; it fails with an error: {code}
/usr/local/home/salnikov/qserv-run/u.salnikov.DM-595/tmp/configure/mysql.sh: connect: Connection refused
/usr/local/home/salnikov/qserv-run/u.salnikov.DM-595/tmp/configure/mysql.sh: line 13: /dev/tcp/127.0.0.1/23306: Connection refused
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
{code} It looks like {{~/.my.cnf}} may be a left-over from some earlier qserv installation. If I remove it and re-run {{qserv-configure.py}}, it's not created anymore. Maybe worth adding some kind of protection to {{qserv-configure.py}} in case other users have this file in their home directory.",2 +"DM-991","07/24/2014 19:51:44","add query involving a blob to the integration tests","We need to add a query (or more?) to the qserv_testdata that involve blobs. Blobs are interesting because they might break some parts of qserv if we failed to escape things properly etc. ",2 +"DM-993","07/24/2014 23:01:32","improve message from qserv_testdata","Currently, when I try to run qserv-benchmark but qserv_testdata was not set up, I am getting: {code}
CRITICAL Unable to find tests datasets.
--
FOR EUPS USERS :
Please run :
eups distrib install qserv_testdata
setup qserv_testdata
FOR NON-EUPS USERS :
Please fill 'testdata_dir' value in ~/.lsst/qserv.conf with the path of the directory containing tests datasets or use --testdata-dir option.
{code} It is important to note in the section for eups users that this has to be called BEFORE qserv is set up, otherwise it has no effect. ",1 +"DM-999","07/28/2014 11:23:18","rename config file(s) in Qserv","Rename local.qserv.cnf to qserv-czar.cnf. It is quite likely there are some other config files that would make sense to rename. If you see some candidates, let's discuss on qserv-l and do the renames.",1 +"DM-1001","07/28/2014 12:29:08","Modify assertAlmostEqual in ip_diffim subtractExposures.py unit test","In the unit test, the comparison self.assertAlmostEqual(skp1[nk][np], skp2[nk][np], 4) fails. However, if changed to self.assertTrue(abs(skp1[nk][np]-skp2[nk][np]) < 10**-4), which is the desired test, this succeeds.
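The two checks are not equivalent near the boundary: assertAlmostEqual(a, b, 4) asserts round(a-b, 4) == 0, i.e. roughly abs(a-b) <= 0.5e-4, which is stricter than the intended abs(a-b) < 10**-4. A small illustration (the value is made up): {code}
delta = 8e-5                 # a plausible difference between the two kernel parameters
print(round(delta, 4) == 0)  # False: assertAlmostEqual(..., 4) fails for this delta
print(abs(delta) < 10**-4)   # True: the desired comparison passes
{code}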
This ticket will remove all assertAlmostEqual calls from subtractExposures.py and replace them with the fundamental comparison on the absolute value of the differences.",1 +"DM-1004","07/29/2014 14:22:10","Provide Task documentation for ModelPsfMatchTask","See Description (it's currently called PsfMatch) ",2 +"DM-1010","08/05/2014 13:39:40","fix names of meas_base plugins to match new naming standards","Some meas_base plugins still have old-style algorithm names.",1 +"DM-1012","08/05/2014 13:44:44","remove temporary workaround in new SkyCoord algorithm","SingleFrameSkyCoordPlugin is using the Footprint Peak, not the centroid slot. According to comments in the code, this is a workaround for some problem with centroids. This needs to be fixed.",1 +"DM-1013","08/05/2014 13:51:07","Classification should set flags upon failure","The classification algorithm claims it can never fail. It can, and should report this.",2 +"DM-1015","08/05/2014 14:29:56","convert GaussianFlux to use shape, centroid slots","We should clean up and simplify the GaussianFlux algorithm to simply use the shape and centroid slot values instead of either computing its own or having configurable field names for where to look these up.",1 +"DM-1017","08/05/2014 15:51:57","fix testForced.py","testForced.py is currently passing even though it probably should be failing: it's trying to get centroid values from a source which has neither a valid centroid slot nor a Footprint with Peaks (I suspect because transforming a footprint might remove the peaks). Prior to DM-976, that would have caused a segfault; on DM-976, I've turned it into an exception, which is then turned into a warning by the measurement framework.",2 +"DM-1018","08/06/2014 11:40:26","Fix incorrect eupspkg config for astrometry_net","The clang patch from the 8.0.0 version was (correctly) deleted. However, the patch identity was still left in the eupspkg config's protocol. This ticket will delete the last vestige of the formerly necessary clang patch.",2 +"DM-1022","08/08/2014 16:10:29","fix warnings related to libraries pulled through dependent package","This came up during the migration of qserv to the new logging system, and it can be reproduced by taking log4cxx, see DM-983, essentially: {code}
eups distrib install -c log4cxx 0.10.0.lsst1 -r http://lsst-web.ncsa.illinois.edu/~becla/distrib -r http://sw.lsstcorp.org/eupspkg
{code} cloning the log package (contrib/log.git), building it and installing it in your stack, and finally taking the branch u/jbecla/DM-207 of qserv and building it. The warnings look like: {code}
/usr/bin/ld: warning: libutils.so, needed by /usr/local/home/becla/qservDev/Linux64/log/1.0.0/lib/liblog.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libpex_exceptions.so, needed by /usr/local/home/becla/qservDev/Linux64/log/1.0.0/lib/liblog.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libbase.so, needed by /usr/local/home/becla/qservDev/Linux64/log/1.0.0/lib/liblog.so, not found (try using -rpath or -rpath-link)
{code} and they show up when I build the qserv package, triggered by liblog. I suspect sconsUtils deals with this sort of issue, but since we have our own scons system for qserv it is not handled. Fabrice, can you try to find a reasonable solution for that?
Thanks!",0.5 +"DM-1028","08/11/2014 14:46:59","qserv-version.sh produces incorrect version number","I have just installed qserv on a clean machine (this is in a new virtual machine running Ubuntu12.04) which got me version 2014_07.0 installed: {code} $ eups list qserv 2014_07.0 current b76 $ setup qserv $ eups list qserv 2014_07.0 current b76 setup $ echo $QSERV_DIR /opt/salnikov/STACK/Linux64/qserv/2014_07.0 {code} but the {{qserv-version.sh}} script still thinks that I'm running older version: {code} $ qserv-version.sh 2014_05.0 {code}",2 +"DM-1029","08/11/2014 15:12:47","""source"" command is not in standard shell","{{qserv-start.sh}} script fails when installed on Ubuntu12.04: {code} $ ~/qserv-run/2014_05.0/bin/qserv-start.sh /home/salnikov/qserv-run/2014_05.0/bin/qserv-start.sh: 4: /home/salnikov/qserv-run/2014_05.0/bin/qserv-start.sh: source: not found /home/salnikov/qserv-run/2014_05.0/bin/qserv-start.sh: 6: /home/salnikov/qserv-run/2014_05.0/bin/qserv-start.sh: check_qserv_run_dir: not found {code} It complains about {{source}} command. {{source}} is not standard POSIX shell command, it is an extension which exists in many shells. Apparently in older Ubuntu version {{/bin/sh}} is stricter about non-standard features. To fix the script one either has to use standard . (dot) command or change shebang to {{#!/bin/bash}}. This of course applies to all our executable scripts.",2 +"DM-1038","08/11/2014 22:29:12","S15 Implement Query Mgmt in Qserv","Initial version of system for managing queries run through qserv. This includes capturing information about queries running in Qserv. Note, we are not dealing with query cost estimate here, (it will be covered through DM-1490).",40 +"DM-1041","08/12/2014 14:13:17","eliminate confusing config side-effects in CalibrateTask","CalibrateTask does some unexpected things differently if you configure it certain ways, because it perceives certain processing as only being necessary to feed other steps. In particular, if you disable astrometry and photometric calibration, it only runs measurement once, because it assumes the only purpose of the post-PSF measurement is to feed those algorithms. This (as well as poor test coverage) made it easy to break CalibrateTask in the case where those options are disabled a few branches back. After conferring with Simon and Andy, we think the best solution is to remove this sort of conditional processing from CalibrateTask, which should also make it much easier to read. Instead, we'll always do both the initial and final phase of measurement, even if one of those phases is not explicitly being used within CalibrateTask itself.",1 +"DM-1045","08/13/2014 11:27:12","Create a permanent and accessible mapping of the BB# and the bNNN. ","Create a permanent and accessible mapping of the BB# and the bNNN. The users are interested in the BB# since is is used to point to the STDIO file form the entire stack build. The bNNN is needed because the daily life of the developer revolves around the stack tagged alternately by the bNNN tags and/or the DM Release tags. ",2 +"DM-1054","08/14/2014 11:14:23","init.d/qserv-czar needs LD_LIBRARY path","With the addition of log we now need to find some shared libraries from stack. Current version of qserv-czar init.d script does not capture LD_LIBRARY_PATH, so we should add it there. 
",0.5 +"DM-1055","08/14/2014 13:44:22","Remove unnecessary pieces from qserv czar config","The config file for the qserv czar has some items that are no longer relevant, and in this issue, we focus on the ones that are clearly the responsibility of our qserv css. This ticket includes: -- removing these items from the installation/configuration templates -- removing these items from sample configuration files -- removing these items from the code that reads in the configuration file and sets defaults for these items -- fixing things that seem to break as a result of this cleanup. danielw volunteers to assist on the last item, as needed. ",2 +"DM-1058","08/18/2014 13:48:39","fix SubSchema handling of ""."" and ""_""","SubSchema didn't get included in the rest of the switch from ""."" to ""_"" as a field name separator. As part of fixing this, we should also be able to simplify the code in the slot definers in SourceTable.",1 +"DM-1059","08/18/2014 14:00:16","track down difference in SdssShape implementation","The meas_base version of SdssShape produces slightly different outputs from the original version in meas_algorithms, but these should be identical. We should understand this difference rather than assume its benign just because it's small.",2 +"DM-1067","08/19/2014 13:34:06","move algorithm implementations out of separate subdirectory","We should move the code in the algorithms subdirectory (and namespace) into the .cc files that correspond to individual algorithms. They should generally go into anonymous namespaces there. After doing so, we should do one more test to compare the meas_base and meas_algorithms implementations.",1 +"DM-1068","08/19/2014 14:11:39","audit and clean up algorithm flag and config usage","Check that meas_base plugins and algorithms have appropriate config options and flags (mainly, check that there are no unused config options or flags due to copy/paste relics).",1 +"DM-1070","08/19/2014 14:14:23","switch default table version to 1","Now that all tasks that use catalogs explicitly set the table version, it should be relatively straightforward to set the default version to 1 in afw. Code that cannot handle version > 0 tables should continue to explicitly set version=0.",2 +"DM-1071","08/19/2014 14:15:40","Switch default measurement tasks to meas_base","We should set the default measurement task in ProcessImageTask to SingleFrameMeasurementTask, and note that SourceMeasurementTask and the old forced photometry drivers are deprecated.",2 +"DM-1072","08/19/2014 14:17:02","create forced wrappers for algorithms","We have multiple algorithms in meas_base which could be used in forced mode but have no forced plugin. We should go through the algorithms we have implemented and create forced plugin wrappers for these.",1 +"DM-1073","08/19/2014 14:18:09","remove old forced photometry tasks","After meas_base has been fully integrated, remove the old forced photometry tasks from pipe_tasks",1 +"DM-1076","08/19/2014 14:52:21","convert afw::table unit tests to version 1","Most afw::table unit tests explicitly set version 0. We should change these to test the new behaviors, not the deprecated ones.",2 +"DM-1077","08/19/2014 15:01:35","Audit TCT recommendations to ensure that all standards updates were installed into Standards documents.","Audit TCT recommendations to ensure that all standards updates were installed into Standards documents. 
It was found that the meeting recorded in: [https://dev.lsstcorp.org/trac/wiki/Winter2012/CodingStandardsChanges] failed to include two recommendations:
* recommended: 3-30: I find the Error suffix to be usually more appropriate than Exception.
** current: 3-30. Exception classes SHOULD be suffixed with Exception.
* recommended but not specifically included: Namespaces in source files: we should use namespace blocks in source files, and prefer unqualified (or less-qualified) names within those blocks over global-namespace aliases.
** Rule 3-6 is an amalgam of namespace rules which doesn't quite have the particulars desired. FYI: The actual vote was to: ""Allow namespace blocks in source code (cc) files.""
To simplify the future audit, all other recommendations in that specific meeting were verified as installed into the standards.",2 +"DM-1083","08/20/2014 12:50:52","Fix overload problems in SourceCatalog.append and .extend","This example fails with an exception: {code:py}
import lsst.afw.table as afwTable
schema = afwTable.SourceTable.makeMinimalSchema()
st = afwTable.SourceTable.make(schema)
cat = afwTable.SourceCatalog(st)
tmp = afwTable.SourceCatalog(cat.getTable())
cat.extend(tmp)
{code} Expected behavior is that the last line is equivalent to {{cat.extend(tmp, deep=False)}}.",1 +"DM-1088","08/21/2014 07:59:46","Investigate HTCondor config settings to control speed of ClassAd propagation","With default settings we do not have good visibility as to whether an updated ClassAd on a compute node (e.g., CacheDataList now has ccd ""S00"") will be in effect on the submit node in time for a Job to be matched to an optimal HTCondor node/slot. There are several components (negotiator, schedd, startd) and their associated activities that could impact the time that it takes for a new ClassAd on a worker node to 'propagate' back to the submit side. We investigate these configuration settings to try to determine what configuration thresholds are required to meet a given time cadence of job submissions.",2 +"DM-1113","08/22/2014 10:39:51","Make the API for ISR explicit","The run method of the IsrTask currently takes a dataRef which has getters for calibration products. This makes the task hard to re-use because one needs a butler and because the interface is opaque. This task will make the IsrTask API more transparent. JK: In PMCS this would be Krughoff S",20 +"DM-1125","08/26/2014 13:38:38","avoid usage of measurement framework in star selectors","At least one of the star selectors uses the old measurement framework system to measure the moments of a cloud of points. With the new versions of all the measurement plugins, it should be much easier (and cleaner) to just call the SdssShape algorithm directly, instead of dealing with the complexity of applying the measurement framework to something that isn't really an image.",3 +"DM-1126","08/26/2014 15:29:40","design new Footprint API","This issue is for *planning* (not implementing) some changes to Footprint's interface, including the following:
- make Footprint immutable
- create a separate SpanRegion class that holds Spans and provides geometric operators but does not hold Peaks or a ""region"" bbox (Footprint would then hold one of these).
- many operations currently implemented as free functions should be moved to methods
- we should switch the container from vector<shared_ptr<Span>> to simply vector<Span>, as Span is nonpolymorphic and at least as cheap to copy as a shared_ptr.
The output of this issue will be a set of header files that define the new interface, signed off by an SAT design review. Other issues will be responsible for implementing the new interface and fixing code broken by the change.",8 +"DM-1135","08/26/2014 16:37:24","test how large pixel region used in galaxy fitting needs to be","Using simulations built on DM-1132 and driver code from DM-1133, test different pixel region sizes and shapes, and determine at what point shear bias due to finite fit region drops below a TBD threshold.",20 +"DM-1137","08/26/2014 19:07:34","Evaluate python/c++ documentation generation and publication tools ","This epic relates to documentation that is provided as part of normal development activities. The desire is to keep this documentation in and near the codebase, as this is best practice for keeping it maintainable. At the other end, we wish to publish this documentation in a coherent and searchable way for users. A number of tools exist in this area and this item requires a preliminary evaluation to be made. This is part of curating our documentation infrastructure. [FE 75% DOC 100% starting August 20th] ",20 +"DM-1138","08/26/2014 19:23:33","Demonstrate & iterate with team on documentation toolchain ","Following from DM-1137, this epic relates to demonstrating various options for documentation tool workflows to the team, gathering input as to the preferred solution, adopting a workflow, and defining any specific implementation choices. This is part of curating our documentation infrastructure. ",5 +"DM-1143","08/26/2014 19:33:31","Investigate candidates for Verification and Integration Data Sets","The task here is to develop a data set that can be used both for continuous integration (build tests) and automatic QA (integration tests). We want to maximise the richness of the data set in terms of its usefulness, but minimise it in terms of its size. DN to co-ordinate contributions. [DN 95% FE 5%]",40 +"DM-1147","08/29/2014 08:36:27","Create a top-level qserv_distrib package","qserv_distrib will be a meta-package embedding qserv, qserv_testdata and partition.",2 +"DM-1151","08/29/2014 15:19:09","Fix example of IsrTask to be callable with data on disk","Currently the example of the IsrTask takes a fake dataRef. This is hard to use with real data. In DM-1113 we will update IsrTask to not take a dataRef. This will make it easy to update the example script to work with real data. This ticket will also include removing from the unit tests any fake dataRefs that have become unnecessary as a result of DM-1299. ",2 +"DM-1152","08/29/2014 18:32:13","Css C++ client needs to auto-reconnect","The zookeeper client in C++ that the czar uses doesn't auto-reconnect. This capability is provided by the kazoo library that qserv's python layer uses, but isn't provided in the C++ client. The zookeeper client disconnects pretty easily: if you step through your code in gdb, the zk client will probably disconnect because its threads expect to keep running. zk sessions may expire too. Our layer should reconnect unless there is really no way to recover without assistance from the calling code (e.g. configuration is wrong, etc.). This ticket includes only basic reconnection attempts, throwing an exception only when some ""reconnection-is-impossible"" condition is met.",2 +"DM-1160","09/01/2014 14:29:41","SUI catalog and image interactive visualization with LSST data","Using the current software components developed in IPAC to put together a prototype of visualization capabilities.
The purpose is to exercise the data access APIs developed by SLAC and get feedback from DM people and potential users of the tool. 20% Goldina, Zhang 10% Roby, Ly, Wu, Ciardi ",20 +"DM-1161","09/02/2014 12:12:47","Cleanup SdssShape","We should do a comprehensive cleanup of the SdssShapeAlgorithm class. This includes removing the SdssShapeImpl interface (never supposed to have been public, but it became public) from other code that uses it, and integrating this code directly into the algorithm class. We should also ensure that the source from which the algorithm is derived is clearly cited -- that's Bernstein and Jarvis (2002, http://adsabs.harvard.edu/abs/2002AJ....123..583B); see also DM-2304.",8 +"DM-1188","09/17/2014 13:25:49","rewrite low-level shapelet evaluation code","While trying to track down some bugs on DM-641, I've grown frustrated with the difficulty of testing the deeply-buried (i.e. the interfaces I want to test are private) shapelet evaluation code there. That sort of code really belongs in the shapelet package (not meas_multifit) anyway, where I have a lot of similar code, so on this issue I'm going to move it there and refactor the existing code so it all fits together better.",2 +"DM-1192","09/18/2014 10:11:14","Write a transition plan to move gitolite and Stash repositories to GitHub","As recommended by the SAT meeting on 2014-09-16, we need this document to promote the use of GitHub by other subsystems within the project and to understand the impacts on DM. The plan should include, but is not limited to:
* Whether and how the repositories should be reorganized.
* How existing commit attributions will be translated.
* Moving comments in Stash to GitHub",20 +"DM-1195","09/18/2014 18:09:24","There is a bug in the prescan bbox for megacam.","The bounding box of the prescan region in the megacam camera should have zero y extent (I think). Instead it goes from y=-1 to y=2. This is either a bug in the generation of the ampInfoTables or in the way the bounding boxes are interpreted.",1 +"DM-1196","09/18/2014 18:32:01","exampleUtils in ip_isr is wrong about read corner","https://dev.lsstcorp.org/cgit/LSST/DMS/ip_isr.git/tree/examples/exampleUtils.py#n95 says that the read corner is in assembled coordinates. This is not true; it is in the coordinates of the raw amp. That is, if the raw amp is in electronic coordinates (like the lsstSim images) it is always LL, but if it is pre-assembled, it may be some other corner. This should probably use the methods in cameraGeom.utils to do the image generation.",1 +"DM-1197","09/18/2014 19:49:31","Support some mixed-type operations for Point and Extent","The current lack of automatic conversions in python is pretty irritating, and I think it's a big enough issue for people writing scripts that we should fix it. In particular, allow {code}
Point2D + Extent2I
Point2D - Extent2I
Point2D - Point2I
Extent2D + Extent2I
Extent2D - Extent2I
{code} (and the respective operations in the opposite order where well defined) It would also be good to allow all functions expecting PointD to accept PointI, but I'm not sure if swig makes this possible. It's probably not worth providing C++ overloads for all of these functions (and to be consistent we should probably do all or none).
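For example, in Python one would like code such as the following to just work (illustrative; these operations currently fail for want of matching overloads): {code}
import lsst.afw.geom as afwGeom
p = afwGeom.Point2D(1.5, 2.5) + afwGeom.Extent2I(1, 1)  # desired result: Point2D(2.5, 3.5)
e = afwGeom.Point2D(1.5, 2.5) - afwGeom.Point2I(1, 2)   # desired result: Extent2D(0.5, 0.5)
{code}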
I realize that you invented these types to avoid bare 2-tuples, but I'm not convinced that we shouldn't also provide overloads to transparently convert tuples to afwGeom objects.",2 +"DM-1211","09/24/2014 17:30:16","anaconda is too outdated to work with pip","The version of anaconda distributed with the stack is too outdated to be used with pip (and probably other things). The issue is an unsafe version of ssl. A workaround is to issue this command while anaconda is set up: {code}
conda update conda
{code} Warning: it is unwise to try to update anaconda itself (with ""conda update anaconda"") because that will revert some of the changes and may result in an unusable anaconda. I think what is required is an obvious change to ups/eupspkg.cfg.sh. The current version of anaconda is 2.0.1, based on http://repo.continuum.io/archive/ Note: there is no component for anaconda. I will submit another ticket.",2 +"DM-1213","09/24/2014 17:41:06","cleanup order/grouping of header files","We want:
* header for the class
* then system
* then third party
* then lsst
* then qserv
We currently don't have the ""lsst"" group (with a few exceptions), and we call the last one ""local"" in most places.",1 +"DM-1217","09/25/2014 13:52:17","Refactor meas_base Python wrappers and plugin registration","meas_base currently has a single Swig library (like most packages), defined within a single .i file (like some packages). It also registers all of its plugins in a single python module, plugins.py. Instead, it should:
- Have two Swig libraries: one for the interfaces and helper classes, and one for plugin algorithms. Most downstream packages will only want to %import (and hence #include) the interface, and having them build against everything slows the build down unnecessarily. The package __init__.py should import all symbols from both libraries, so the change would be transparent to the user.
- Have separate .i files for each algorithm or small group of algorithms. Each of these could %import the interface library file and the pure-Python registry code, and then register the plugins wrapped there within a %pythoncode block. That'd make the implementation of the algorithms a bit less scattered throughout the package, making them easier to maintain and better examples for new plugins.",3 +"DM-1218","09/25/2014 16:08:29","Support multiple-aperture fluxes in slots","We should be able to use multiple-aperture flux results in slots. While this is technically possible already by setting specific aliases, it doesn't work through the usual mechanisms for setting up slots (the define methods in SourceTable and the SourceSlotConfig in meas_base).
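As a sketch, the manual workaround available today looks something like this (the alias and field names are illustrative, not an established convention): {code}
# point the ApFlux slot at one aperture of the multi-aperture result via an alias
aliases = catalog.getSchema().getAliasMap()
aliases.set(""slot_ApFlux"", ""base_CircularApertureFlux_3_0"")
{code}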
After addressing this, we should remove the old SincFlux and NaiveFlux algorithms, as the new CircularApertureFlux algorithm will be able to do everything they can do.",2 +"DM-1231","09/30/2014 03:25:03","LSE-69: Bring to Phase 3","Reflects work needed in Summer 2015",0 +"DM-1232","09/30/2014 03:27:04","LSE-72: Bring Summer 2014 work to CCB approval","Remaining work is to proofread the SysML-ization by Brian Selvy of the LSE-72 draft, do any required cleanup in conjunction with the OCS team, and advocate for LCR-202 (already exists) at the CCB.",3 +"DM-1233","09/30/2014 03:43:46","Refine requirements and use cases for Level 3 facilities","Refine the requirements and use cases for the three branches of Level 3 capabilities exposed to users:
* Level 3 programming toolkit (user reconfiguration / extension of DM pipelines and stack)
* Level 3 compute cycle delivery (user access to 10% of compute base)
* Level 3 data product storage
Deliverables:
* Refinement, if necessary, to Level 3 requirements in DMSR
* Flowed-down requirements as a separate document. Sufficient detail to allow a breakdown of the deliverables in the three areas of Level 3 by annual release cycle through the construction period.",20 +"DM-1237","09/30/2014 06:58:01","LSE-75: Refine WCS and PSF requirements","Clarify the data format and precision requirements of the TCS (or other Telescope and Site components) on the reporting of WCS and PSF information by DM on a per-image basis. Depends on the ability of the T&S group to engage with this subject during the Winter 2015 period. Can be deferred to Summer 2015 without major impacts. Current PMCS deadline for Phase 3 readiness of LSE-75 is 29-Sep-2015.",8 +"DM-1242","09/30/2014 07:32:12","Risk Register refresh 1/2015","Periodic review of DM risk register contents. Covers preparation for a review expected at the end of January 2015, the only one during Winter 2015.",3 +"DM-1245","09/30/2014 07:37:06","Install scisql plugin (shared library) outside of eups stack.","The sciSQL plugin is currently deployed into the eups stack (i.e. $MYSQL_DIR/lib/plugin) during the configuration step. However, the eups stack should be immutable during configuration. The MySQL plugin-dir option may allow deploying the sciSQL plugin outside of the eups stack (for example in QSERV_RUN_DIR).",3 +"DM-1251","09/30/2014 15:34:14","CSS design for query metadata v1","The goal of this ticket (and DM-1250) is to try to understand what kind of per-query metadata is necessary to provide client-transparent query processing in the case when the czar/proxy dies or is restarted. Some relevant info is in the Trac: https://dev.lsstcorp.org/trac/wiki/db/Qserv/CSS/RunTimeState",3 +"DM-1254","09/30/2014 15:41:35","Metadata Store - design v1","Research potential off-the-shelf candidates. Propose an initial version of the metadata design. ",5 +"DM-1281","10/01/2014 11:01:05","add Schema method to join strings using the appropriate delimiter","Delimiters in Schema field names are version-dependent. One can currently use {{schema[""a""][""b""].getPrefix()}} to join fields using the appropriate delimiter, but this is confusing to read.",1 +"DM-1282","10/01/2014 11:18:26","multi-level replacement in Schema aliases","Schema aliases should support more than one level (i.e.
an alias may resolve to another alias).",2 +"DM-1285","10/01/2014 14:13:21","Improve Startup of HTCondor Jobs","Adjust configuration parameters of HTCondor config and/or submission files to improve the speed at which HTCondor jobs start in both the replicator pool and worker pool.",2 +"DM-1287","10/03/2014 02:40:29","Propose and document a recipe to build Qserv in eups","In-place build is available and documented.",2 +"DM-1293","10/03/2014 15:13:03","Implement designed tests for processCcd","Implement the designed tests with the installed data.",8 +"DM-1305","10/05/2014 19:05:22","Tests fail in shapelet when building on OS X 10.9","When building the master on pugsley.ncsa.illinois.edu, shapelet builds successfully, but two tests fail: {code}
pugsley:lsstsw mjuric$ cat build/shapelet/tests/.tests/*.failed
tests/testMatrixBuilder.py .F.....
======================================================================
FAIL: testConvolvedCompoundMatrixBuilder (__main__.MatrixBuilderTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File ""tests/testMatrixBuilder.py"", line 310, in testConvolvedCompoundMatrixBuilder
    self.assertClose(numpy.dot(matrix1D, coefficients), checkVector, rtol=1E-14)
  File ""/Users/mjuric/test/lsstsw/stack/DarwinX86/utils/9.2+8/python/lsst/utils/tests.py"", line 328, in assertClose
    testCase.assertFalse(failed, msg=""\n"".join(msg))
AssertionError: 1/50 elements differ with rtol=1e-14, atol=2.22044604925e-16
0.175869366369 != 0.175869366369 (diff=1.99840144433e-15/0.175869366369=1.13629876856e-14)
----------------------------------------------------------------------
Ran 7 tests in 0.323s
FAILED (failures=1)
tests/testMultiShapelet.py ...F...
======================================================================
FAIL: testConvolveGaussians (__main__.MultiShapeletTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File ""tests/testMultiShapelet.py"", line 88, in testConvolveGaussians
    self.compareMultiShapeletFunctions(msf3a, msf3b)
  File ""/Users/mjuric/test/lsstsw/build/shapelet/python/lsst/shapelet/tests.py"", line 107, in compareMultiShapeletFunctions
    self.compareShapeletFunctions(sa, sb, rtolEllipse=rtolEllipse, rtolCoeff=rtolCoeff)
  File ""/Users/mjuric/test/lsstsw/build/shapelet/python/lsst/shapelet/tests.py"", line 86, in compareShapeletFunctions
    rtol=rtolEllipse)
  File ""/Users/mjuric/test/lsstsw/stack/DarwinX86/utils/9.2+8/python/lsst/utils/tests.py"", line 328, in assertClose
    testCase.assertFalse(failed, msg=""\n"".join(msg))
AssertionError: 1/5 elements differ with rtol=1e-14, atol=2.22044604925e-16
2.44929359829e-16 != 1.13310777953e-15 (diff=8.881784197e-16/1.13310777953e-15=0.783842839795)
----------------------------------------------------------------------
Ran 7 tests in 0.131s
FAILED (failures=1)
{code}
===============
More info on pugsley.ncsa.illinois.edu:
pugsley:lsstsw mjuric$ sw_vers
ProductName: Mac OS X
ProductVersion: 10.9.5
BuildVersion: 13F34
pugsley:lsstsw mjuric$ clang -v
Apple LLVM version 5.1 (clang-503.0.40) (based on LLVM 3.4svn)
Target: x86_64-apple-darwin13.4.0
Thread model: posix
============
The files are in {{/Users/mjuric/test/lsstsw/build/shapelet/}}.",1 +"DM-1309","10/06/2014 04:32:58","Edit agreed-upon changes into Word version of LSE-69","A meeting around 9/26/2014 agreed on a set of revisions to LSE-69, with some language still needed from [~gpdf].
This action is to edit the tracked-changes Word version of LSE-69 containing the notes from that meeting into a final copy that can be reviewed by the Camera team and used as input to editing the SysML version of the ICD.",3 +"DM-1310","10/06/2014 04:34:55","Create change request for LSE-69","Create a change request to bring LSE-69 up to date and capture the Summer 2014 work.",1 +"DM-1311","10/06/2014 04:36:53","Enter LSE-69 update into EA as SysML","Covers entering the contents of the LSE-69 update into EA as SysML, with associated updating of diagrams, and the creation of a docgen'ed version for CCB action.",1 +"DM-1312","10/06/2014 04:40:50","Proofread docgen'ed version of LSE-72","Brian Selvy is producing a SysML version of the LSE-72 update edited by [~gpdf]. The action here is to proofread the docgen of that version once it is ready.",2 +"DM-1313","10/06/2014 05:01:34","Identify Conditions information in LSE-130 that is required for Alert Production","LSE-69 declares that there are two categories of Conditions data (telemetry) required by DM from the Camera: those items that are needed for Alert Production (for which the AP components at the Base will need a whitelist, and for which the Camera has a tighter latency requirement), and those that are not (but are then presumably needed in DRP or other deferred productions). It states that the subset needed for AP should be enumerated in LSE-130. The action here is to create an initial version of that list.",2 +"DM-1314","10/06/2014 07:51:25","Publish Qserv S14 version on lsst distribution server","In order to publish this version please tag Qserv master tip with ""2014_09.0"" and then run: {code:bash} ssh lsstsw@lsst-dev # command below can't be run in buildbot, as it doesn't support qserv_distrib build rebuild -t 2014_09.0 qserv_distrib # bXXX is provided by previous command publish -t qserv -b bXXX qserv_distrib publish -t 2014_09 -b bXXX qserv_distrib {code} ",0.5 +"DM-1316","10/06/2014 13:34:23","Deploy LSST stack within OpenStack instances on ISL testbed","Deploy the LSST Stack within OpenStack instances on the ISL testbed -- this could be for multiple flavors (CentOS, Ubuntu, etc.), and this could be done by pulling Docker Images to the instances. There will also likely be some initial debugging of starting instances within the ISL platform as a new installation has been stood up Sept 2014. ",1 +"DM-1317","10/06/2014 13:40:59","Create Docker Image / Dockerfile for LSST Stack for ubuntu","Create an installation of the LSST Stack v9_2 within a Docker Image for ubuntu, easing the import of LSST software into an OpenStack instance. We create the image using a Dockerfile to make the creation of such images systematic. ",2 +"DM-1318","10/06/2014 14:31:59","update expected results file in SDSS demo test","Update the expected outputs in the SDSS DM stack demo repo to match what we expect from the new meas_base framework.",1 +"DM-1331","10/06/2014 15:13:52","squash edge errors in SdssCentroid","SdssCentroid doesn't trap exceptions that are thrown due to being too close to the edge, resulting in noisy warnings in the logs. Instead, it should catch the low-level exception and re-throw as MeasurementError, after defining a flag field for this specific failure mode.",1 +"DM-1332","10/06/2014 15:17:49","address no-shape warnings in GaussianFlux","GaussianFlux relies on the shape slot, and puts noisy warnings in the logs when the shape slot fails.
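One option, raised again further below, is a schema alias; a hedged sketch of that idea (assuming afw.table's AliasMap API, with a hypothetical flag name) could look like:
{code}
import lsst.afw.table as afwTable

# Hypothetical: expose the shape slot's failure flag under a
# GaussianFlux-specific name instead of storing a redundant flag field.
schema = afwTable.SourceTable.makeMinimalSchema()
schema.getAliasMap().set('base_GaussianFlux_flag_badShape', 'slot_Shape_flag')
{code}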
However, we probably don't want to add a new flag for GaussianFlux to indicate this failure mode, because it'd be entirely redundant with the shape slot flag. We should figure out some other way to squash this warning - how we do that may depend on whether this is addressed before or after the C++ redesign. We should also consider having GaussianFlux add an alias to the schema to point back at the shape slot flag, creating what looks like a specific flag for this failure while actually just being a link back to the shape slot flag. That's probably not worth doing within the current C++ interface, however, as it'd require some unpleasant mucking around with ResultMappers.",2 +"DM-1333","10/06/2014 15:28:05","resolve factor of two difference in GaussianFlux","After changing the implementation of GaussianFlux to use the shape slot rather than estimate the shape itself by re-running the SdssShape code, Perry saw a 5-15% difference in the fluxes (I'm not sure of the sign). The new behavior (using the shape) is consistent with what we'd have gotten with the old code when the little-used ""fixed"" config option was enabled (not surprising, as that just looked up the SdssShape measurement by name, instead of via slots). I suspect the difference is coming in because of the factor of two between SdssShape's ""raw"" measurements - the actual Gaussian-weighted moments - and the factor of 2 it applies to make its measurements equivalent to ideal unweighted moments. The correct weight function to use for GaussianFlux includes this factor of 2 (i.e. it's larger than the ""raw"" moments), and it's likely either the old code wasn't including this or the new code isn't. We need to determine which one, and if necessary, fix the new code.",2 +"DM-1334","10/06/2014 22:56:10","Test the creation of basic OpenStack instances on the new ISL testbed [IceHouse]","A new version & implementation of the ISL OpenStack testbed is up and running. The new cloud is using IceHouse, the ninth OpenStack release. We get started on this platform by verifying that basic instance creation is working. We target the creation of an instance through the (Horizon) GUI interface, and via the nova CLI. ",1 +"DM-1335","10/06/2014 23:18:19","Create instance with a Floating IP Associated through the nova CLI","We see that in working with the Horizon GUI, it is fairly straightforward to give an instance a public IP address by associating a Floating IP with the current local IP. However, we will want to be able to accomplish this task both remotely and programmatically within workflow. As a step towards this, we target the solution of this via the nova CLI.",2 +"DM-1340","10/07/2014 14:04:32","Read through log4cxx documentation and log.git code","Read through the log4cxx documentation and become familiar with how the log.git package is set up.",2 +"DM-1343","10/07/2014 14:11:29","Write Unit test for new DM message appender class","Write unit tests for DM message appender class. This might also require some tests for a configurator class, if that class is created.",2 +"DM-1347","10/07/2014 14:36:33","Refine Event base class to allow ActiveMQ filterable settings","The current Event.cc base class needs to be refined to remove any old-style data release terms that aren't used anymore.
Plus, it needs to be easily extensible to allow other types of dictionaries of terms that will be used in the message headers to make them filterable on the server side.",2 +"DM-1348","10/07/2014 14:37:59","Update tests to use unit test framework","The tests for this package predate the unit test framework that other packages use. Update the tests to use the unit test framework and get rid of any duplicate or obsolete tests.",5 +"DM-1352","10/07/2014 15:31:29","ORIGINATORID value can churn too quickly.","The ORIGINATORID is a 64-bit word consisting of an IPv4 host address, 16-bit process id, and 16-bit local value. In addition to the 16-bit process id not being standard across platforms (Mac OS >Leopard goes to 99999), the churn rate for the local value should be much higher than just 16 bits. This could be fixed by changing ORIGINATORID to a 32-bit process id and a separate value for the local value, which would be specified together in the DM event selector. I have to look into this more to see if this is a viable solution. This might need to go to three separate values to future proof it (i.e., ipv6).",8 +"DM-1354","10/08/2014 14:46:38","Install docker 1.1.2 in an ISL OpenStack CentOS instance, perform basic checks","Install docker 1.1.2 in an ISL OpenStack CentOS 6.5 instance, and perform basic checks such as stopping and starting the docker daemon, changing default settings such as the size limit of containers/images, pulling standard images from docker hub, starting containers from these images, etc. ",2 +"DM-1355","10/08/2014 16:43:59","Re-arrange how Qserv directories are installed","We already touched on the question of the installation directory structure at a meeting; maybe we can improve things by re-arranging how things are installed. We are currently installing stuff into four directories: cfg, bin, lib, proxy. To make it look more standard and to avoid a clash with qserv source directories we could move cfg and proxy to a different location (something like share/qserv, to make it more root-install-friendly in case we ever want to install under /usr). This change (if you want to do it) deserves a separate ticket; do not do it in this ticket. ",3 +"DM-1358","10/09/2014 10:11:29","End-to-end demo fails to exit with the correct status when Warning would be correct.","The output of demo2012 results in an output file which is compared against a benchmarked file. Currently the comparison allows a deviation from the benchmark based ""on the number of digits in the significands used in multiple precision arithmetic""; that number is currently set to 11. An example using that setting is: @ Absolute error = 5.9973406400e-1, Relative error = 9.1865836000e-4 ##2566 #:25 <== 29.7751550737 ##2566 #:25 ==> 29.7478269835 Additionally, the current use returns: a 'Fatal' error if the code itself fails to execute correctly; a 'Warning' error if any of the benchmarked quantities do not meet the comparison criteria; 'Success' if the comparison meets all criteria. It is noted that the buildbot 'Warning' color indicator is not currently being displayed when the comparison fails. That is a coding error. This Issue will: * ensure that BUILDBOT_WARNING(S) causes the correct display color when appropriate. This Ticket has been split into two parts. This is part 1. Part 2 is DM-1379 ",2 +"DM-1360","10/09/2014 16:42:13","Fix minor loose ends from new result plumbing","Some of the more minor issues raised in comments from DM-199 were left undone prior to merging. This ticket addresses them.
For more information, please see: https://github.com/LSST/qserv/pull/2 . ",1 +"DM-1362","10/09/2014 23:27:06","Edit pull interface and other Summer 2014 work into LSE-68 in Word","Deliverable: circulate a Word-based draft of LSE-68 in which the ""push"" interface is removed, and the ""pull"" interface is refined to include the guider and other Summer 2014 work. Note that the use of ""pull"" for the guider applies whether or not the proposed guider redesign is accepted.",2 +"DM-1363","10/10/2014 01:06:28","Avoid use of ~/.my.cnf (used by css watcher)","See https://jira.lsstcorp.org/browse/DM-1258?focusedCommentId=29230&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-29230. {{~/.my.cnf}} is used by css watcher, an optional tool used for monitoring css. It is a symlink to $QSERV_RUN_DIR/etc/my-client.cnf. The css watcher could use MySQL credentials located in ~/.lsst/qserv.conf (used by integration tests which are a Qserv/MySQL client)",2 +"DM-1364","10/10/2014 14:04:07","replace ""bad data"" flag in SdssCentroid","SdssCentroid has a ""bad data"" flag that doesn't actually convey any information about what went wrong. This should be replaced with one or more flags that provide more information.",3 +"DM-1372","10/13/2014 17:17:19","Errors in testHealpixSkyMap.py","There is a failing unit test when healpy is supplied. The problem is that the method boundary is not defined in the version of healpy we supply for the sims; however, boundaries does exist. If I replace boundary with boundaries, the test passes.",0.5 +"DM-1375","10/13/2014 19:13:49","Create Docker Image / Dockerfile for LSST Stack for CentOS6.5","Make a Dockerfile for systematic generation of docker images using a Centos6.5 base image containing the LSST Stack (v9_2 at the moment) and library dependencies.",2 +"DM-1376","10/13/2014 19:25:38","Ensure that the partition package is C++11 clean and compiles on OSX 10.9","The LSST buildbot infrastructure recently changed to building everything with --std=c++0x, which broke the partition package, and hence automated Qserv builds. While debugging this, I discovered that the partition package does not build on OSX 10.9, and considering how minimal its dependencies are, it really should. The OSX issue can be fixed by avoiding {{using boost::make_shared}}. The partition package should be cleaned up to avoid all use of {{using}}. If we decide to use C++11 in Qserv, then the codebase should also be modernized (in particular, there are use-cases for static_assert, nullptr, etc... ). ",2 +"DM-1386","10/15/2014 16:30:54","Merge Footprints from different bands/epochs","The current concept of the deblender assumes that the inputs are - A merged set of Footprints that define which pixels are part of the blend - A merged set of Peaks within that merged Footprint Please generate these merged Footprints (which will be defined in (x, y) coordinates in the tract/patch coordinate system). ",5 +"DM-1387","10/15/2014 16:32:03","Generate a master list of Objects given detections in multiple bands/epochs","Once we've detected Sources in multiple bands we need to merge the positions to generate Objects. This is a little complicated (or at least messy): - The positions have errors - If the seeing is different in different visits, objects may be blended in some but not all exposures - If we use more than one detection pass (e.g.
smoothing when looking for faint objects, not smoothing for bright) this has similar-but-different consequences (but we should probably deal with it in the per-band processing) - Objects move, so even if the positions are within the errors the motion may still be detectable ",5 +"DM-1388","10/15/2014 18:48:03","Submit LCR for LSE-68","Create an LCR, including a summary of changes, for LSE-68.",2 +"DM-1394","10/16/2014 16:45:22","Eups 1.5.4 requires each new shell to source the eups setups.sh","eups v 1.5.4 requires each new shell to source ...eups/../bin/setups.sh. This requires that the buildbot scripts runManifestDemo.sh and create_xlinkdocs.sh be updated to individually do that task. Add demo2012: bin/demo.sh . ",2 +"DM-1396","10/17/2014 00:26:11","Design CSS schema to support database deletion","Need to implement deleting databases. Deliverable: a design of the system that will be capable of deleting a distributed database including all copies of that database on all workers, all replicas of all chunks are deleted. It should be possible to ""create database x"" at any time later.",2 +"DM-1404","10/21/2014 12:58:03","Create suite of Dockerfiles / docker images for LSST Stack for ubuntu, CentOS","Building on issues DM-1317 and DM-1375 where initial images and Dockerfiles were constructed, we can now use these Dockerfiles as prototypes to extend the set of Dockerfiles & images. We observe that by making simple (scriptable) edits to the initial Dockerfile, we can run 'docker build' to make docker images for several combinations of OS and base compiler gcc version.",2 +"DM-1407","10/23/2014 10:24:18","Add return code for integration tests","Integration tests have to return non-zero when failing. This will ease use of CI or debugging tools ({{git bisect}}, buildbot).",0.5 +"DM-1424","10/24/2014 10:39:48","Create persistent volume of Cinder block storage and attach to instance","We create a persistent volume of Cinder block storage and attach it to a working instance. When it was created, the instance does have a specified amount of ephemeral disk, but this disk will be destroyed with the instance. We want to test that we can create a persistent volume of block storage, attach it to an instance, format the storage with a file system, and mount the volume for use with processing, where data/output can be retained after the instance is destroyed. ",1 +"DM-1431","10/27/2014 12:48:35","Process sample sdss data with LSST Stack in a Docker container in OpenStack","Process sample sdss data, starting with the lsst_dm_stack_demo, with the LSST Stack in a Docker container in an OpenStack instance. ",2 +"DM-1434","10/28/2014 13:16:55","State diagram for jobs in progress","Build a state diagram showing job progress throughout a run.",13 +"DM-1439","10/28/2014 15:14:37","afw tests use the same name for a dummy file: test.fits","The afw package uses a file named: test.fits in multiple testers. If the user sets up the build to use multiple CPUs (-j #), then there is the risk that the shared filename will be affected by more than one tester at a time. In the case which provoked this Issue, the tester: testSimpleTable.py, reported that the file was missing. A simple rerun managed to get past the error.
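One way to avoid the collision — sketched below as hedged, hypothetical test code (tempfile is an assumption here, not what the testers currently do) — is to generate a unique file name per test:
{code}
import os
import tempfile
import unittest

class TableIoTestCase(unittest.TestCase):  # hypothetical tester
    def testRoundTrip(self):
        # Unique per-test file name, so parallel (-j N) builds cannot collide.
        fd, fileName = tempfile.mkstemp(suffix='.fits')
        os.close(fd)
        try:
            pass  # write the table to fileName, read it back, and compare
        finally:
            os.remove(fileName)
{code}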
I recommend that the different testers use uniquely named demo files.",1 +"DM-1440","10/28/2014 22:50:03","Github Transition: Naming conventions for repositories","The Simulations team has requested that repos in general and the numerous DM repos in particular are prefixed in a way that would make filtering them out of searches and display easy (for example, their repos are prefixed sims_*). This would be an evident usability aid to DM developers and outside contributors too. Obtain a decision on how to allow users to quickly isolate repositories they are interested in. ",3 +"DM-1443","10/28/2014 22:59:01","Github Transition: pull request discussion, retention - proof of concept","Store review comments / PR discussion into relevant git repositories. Code & process to do this on a continuous basis post-transition will be a different issue. ",1 +"DM-1444","10/29/2014 09:57:16","Test absence of individual components","Test absence of individual components of the AP simulator. Bring each down, run the system, and restart just those components to see if the system still operates as expected",2 +"DM-1448","10/29/2014 17:44:31","Move code for mock images into afw so it is reusable.","There is some code in the exampleUtils in $IP_ISR_DIR/examples that could be of wider use. Specifically there is code to generate mock darks, flats, and raw data from a mock camera. There is also code to generate a mock dataRef. It could be used more widely if moved someplace else. Russell suggested afw.cameraGeom.utils.",1 +"DM-1461","10/31/2014 18:35:40","C++ Redesign -- Result definition for custom algorithms","Additions to Jim's redesign to make it easier to define custom results.",3 +"DM-1462","10/31/2014 18:38:50","Add NaN check to PixelFlags","The test of PixelFlags in measureSources.py (from measAlg) requires a check to be sure that the center inputs are not NaN.",1 +"DM-1463","10/31/2014 18:40:20","SdssShape shiftMax config item is being ignored","The code we ported from meas_algorithms sets the maxShift to 2 without regard to the config item which is supposed to set that value.",1 +"DM-1464","11/03/2014 08:09:33","Design Review prep for C++ redesign","Write up the design developed on DM-829 and push it through review.",1 +"DM-1469","11/03/2014 15:30:20","Github Transition Plan: Reverse mirror for beta-tester repositories","Test the reverse mirror for beta-testers. Straw man: 1. Break the mirror for anybody beta-testing github workflow 2. Mirror back to new gitolite area: ""mirror"") Method: https://help.github.com/articles/duplicating-a-repository/ ",1 +"DM-1475","11/04/2014 16:10:54","Fix 2014_09 documentation","Replace {{NEWINSTALL_URL=http://sw.lsstcorp.org/pkgs/}} with: {{NEWINSTALL_URL=http://sw.lsstcorp.org/eupspkg/}} and {{eups distrib install qserv_distrib 2014_10.0}} with: {{eups distrib install qserv_distrib -t 2014_10}}",0.5 +"DM-1480","11/05/2014 14:19:46","Buildbot master takes exception when exiting from mail notifier after dynamic email sent.","Buildbot master exits without posting the required statically-addressed email notification if a dynamically-addressed one was sent. This fix needs to ensure that the required (by buildbot specification) static email is sent even if it has to be directed to a dead-letter box (which it is).",3 +"DM-1481","11/05/2014 15:30:53","Discover/learn what others are doing in astronomy software","Attend the annual ADASS conference to keep up with the software development in the astronomy community.
Trey Roby, Tatiana Goldina, Xiuqin Wu plan to attend the ADASS 24.",20 +"DM-1497","11/10/2014 15:02:24","Package Qserv mono-node instance in Docker","Learn Docker basics and then package a Qserv mono-node instance.",5 +"DM-1498","11/10/2014 15:03:13","Package Qserv master and worker instance in Docker","Learn Docker basics and then package a Qserv master and worker instance.",5 +"DM-1505","11/11/2014 13:20:13","confusing error message when enabling unregistered items in RegistryField","pex_config seems to spit out this confusing error message when trying to enable (i.e. append to .names) a registry item that doesn't exist: {code:hide-linenum} File ""/home/lam3/tigress/LSST/obs_subaru/config/processCcd.py"", line 51, in root.measurement.algorithms.names |= [""jacobian"", ""focalplane""] File ""/tigress/HSC/LSST/lsstsw/anaconda/lib/python2.7/_abcoll.py"", line 330, in __ior__ self.add(value) File ""/tigress/HSC/LSST/lsstsw/stack/Linux64/pex_config/9.0+26/python/lsst/pex/config/configChoiceField.py"", line 72, in add r = self.__getitem__(value, at=at) AttributeError: 'SelectionSet' object has no attribute '__getitem__' {code}",1 +"DM-1506","11/11/2014 13:38:48","Support new version of newinstall.sh","newinstall.sh now creates loadLSST.bash instead of loadLSST.sh. This has to be taken into account in the Qserv automated install script qserv-install.sh and in the Qserv documentation.",0.5 +"DM-1514","11/13/2014 12:40:33","calling extend with a SchemaMapper should support positional arguments","Calling {{catalog.extend(other, mapper)}} isn't equivalent to {{catalog.extend(other, mapper=mapper)}} because the second argument is the boolean {{deep}}. When a SchemaMapper is passed as the second argument, we should recognize it for what it is.",1 +"DM-1515","11/13/2014 12:57:11","sconsUtils fails to identify Ubuntu's gcc","On Ubuntu 12.04, gcc --version says: {code:hide-linenum} gcc (Ubuntu/Linaro 4.6.3-1ubuntu5) 4.6.3 Copyright (C) 2011 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. {code} This apparently isn't quite what sconsUtils expected, because it says: {code:hide-linenum} scons: Reading SConscript files ... Checking who built the CC compiler...(cached) error: no result CC is unknown version unknown {code} Happily, everything seems to work anyway, as the fall-back options for the unknown compiler work fine with this one.",1 +"DM-1524","11/17/2014 12:21:49","Switch readMetadata's return value from PropertySet to PropertyList","It'd be useful if readMetadata would return PropertyList instead of PropertySet, because in many situations order matters. For details, see discussion in DM-1517",2 +"DM-1529","11/17/2014 14:38:58","Reorganize docker image repositories and align with github","A heterogeneous collection of docker images has been accumulating within the public docker repository daues/lsstdistrib . Such a heterogeneous collection prevents the assignment of a ""latest"" tag to allow users to easily obtain the most recent image for a particular item (detailed version numbers, labels currently required.) Thus we should break out the single repository into multiple repositories where a ""latest"" tag will be effective. We also make github repositories of matching names to hold the Dockerfiles which produced the images (a common pattern for github/dockerhub usage, especially with automated builds; so we start this practice.)
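As a hedged illustration of the goal (repository names are placeholders, and the Docker SDK for Python is assumed here rather than the CLI), per-item repositories let a ""latest"" tag track the newest image:
{code}
import docker  # assumes the Docker SDK for Python

client = docker.from_env()
# Re-tag a specific, versioned image as the repository's 'latest'.
image = client.images.get('daues/lsst-centos65:v9_2')  # placeholder names
image.tag('daues/lsst-centos65', tag='latest')
{code}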
",2 +"DM-1538","11/18/2014 13:34:43","Fix qserv_testdata documentation","qserv_testdata relies on sconsUtils, and its build procedure has to be clearly documented.",0.5 +"DM-1539","11/18/2014 13:40:21","Add support for mysql JDBC driver","SUI which rely on JDBC fail because they internally issue some queries that are not yet supported by Qserv. Need to patch it (in the short term), and add proper support (in the long term). This story covers the patching only. The queries that upset Qserv are listed below. {code} SHOW VARIABLES WHERE Variable_name ='language' OR Variable_name = 'net_write_timeout' OR Variable_name = 'interactive_timeout' OR Variable_name = 'wait_timeout' OR Variable_name = 'character_set_client' OR Variable_name = 'character_set_connection' OR Variable_name = 'character_set' OR Variable_name = 'character_set_server' OR Variable_name = 'tx_isolation' OR Variable_name = 'transaction_isolation' OR Variable_name = 'character_set_results' OR Variable_name = 'timezone' OR Variable_name = 'time_zone' OR Variable_name = 'system_time_zone' OR Variable_name = 'lower_case_table_names' OR Variable_name = 'max_allowed_packet' OR Variable_name = 'net_buffer_length' OR Variable_name = 'sql_mode' OR Variable_name = 'query_cache_type' OR Variable_name = 'query_cache_size' OR Variable_name = 'license' OR Variable_name = 'init_connect' SELECT @@session.auto_increment_increment SET NAMES latin1 SET character_set_results = NULL SET autocommit=1 SET sql_mode='STRICT_TRANS_TABLES' {code}",2 +"DM-1541","11/18/2014 14:08:19","Add support for transmitting [VAR]BINARY column data","The code that pulls data out of a {{MYSQL_ROW}} and puts it into a protobuf {{RowBundle}} does not handle binary data correctly. See [_fillRows|https://github.com/LSST/qserv/blob/master/core/modules/wdb/QueryAction.cc#L217] for the relevant code. The issue is that the generated {{add_column(const char*)}} member function of {{RowBundle}} treats the input as C-style NULL-terminated strings. But in the case of {{BINARY}} column data (and {{VARBINARY}}/{{BLOB}} variants, maybe also {{BIT\(n\)}}), the contents can contain embedded NULLs. We are currently using such columns for user defined types in MySQL (e.g. image bounding polygons), so it's important to get this right. On the protobuf side the fix is as simple as calling {{add_column(const char* value, int size)}} instead. I'm not a MySQL C API expert, but the size will presumably have to be obtained/derived from the corresponding {{MYSQL_FIELD}}. ",8 +"DM-1545","11/19/2014 15:29:04","Span-based shrink operations for Footprint","Analogous to DM-1128, but shrinking rather than growing footprints.",5 +"DM-1571","11/25/2014 11:07:06","Setup Qserv for SUI tests","Setup Qserv on lsst-db2 with and load some reasonable data set (perhaps PT 1.2). One potential caveat: we need to setup access for some accounts that are ideally other than our internal qsmaster.",2 +"DM-1573","11/25/2014 14:13:38","Basic validation of LSST pipeline on HSC data","Get the pipeline running on HSC data to the point where nothing is obviously wrong.",8 +"DM-1578","11/25/2014 17:56:16","Move photocal out of meas_astrom","It is confusing that photocal is in meas_astrom. I assume that is historical. I think it could probably live in pipe_tasks.",2 +"DM-1579","11/25/2014 17:58:40","Implement replacement for A.net index files","Astrometry.net index files are hard to generate and hard to read. We need another, more flexible, more standard system for storing reference files. 
We should also be able to read FITS files and other formats, but having a standard format with the utilities to create and query them is a must. Coming up with a format that satisfies astrometry.net's solver may be too hard, because a.net requires quads, which a non-blind solver may not need. However, a format that we can convert to something suitable for a.net would probably suffice (conversion would presumably be a separate task that is run once). It will be easier to identify a suitable format once we have identified at least one solver other than a.net that we wish to implement or adapt. hscAstrom appears to use a catalog of star positions, which is nice and simple.",5 +"DM-1582","11/25/2014 20:04:18","Qserv spatial restrictor names are case sensitive","The SQL grammar treats Qserv spatial restrictor names case insensitively, but {{qana/QservRestrictorPlugin.cc}} does not, which means that one must use e.g. {{qserv_areaspec_box}} rather than {{qserv_areaSpec_box}}. We are loose with case in a lot of our wiki pages, so we really should fix this to avoid confusing users. Also, case insensitivity is consistent with MySQL behavior for UDF names.",1 +"DM-1587","11/26/2014 11:37:24","Define structure of web form for collecting metadata about existing data sets","Web alpha version of the form (using django or Fermi Java webservices code) that collects input from users about data repositories. Authentication not covered in this version.",2 +"DM-1590","11/26/2014 11:46:07","Break down & discuss DM-1074","I will take the lead on DM-1074. First step will be to sit with [~jbosch], get a feeling for what needs to be done, and sketch out a set of stories.",1 +"DM-1600","12/01/2014 13:00:16","Determine if Astrometry class is desired","The question is whether the Astrometry class is the thing that is overridden or if the AstrometryTask has component configurables that are overridden. Also, determine the location of the default implementation.",2 +"DM-1601","12/02/2014 15:04:13","Add support for c-style comments in front of queries sent to qserv","Connecting to qserv from java fails because the jdbc driver inserts comments, ""/* ... */"", in front of queries (example pasted below). The fix involves removing the comments at the proxy level. {code} SQLException: Qserv error: 'ParseException:ANTLR parse error:unexpected token: /:' Query being executed when exception was thrown: /* mysql-connector-java-5.1.34 ( Revision: jess.balint@oracle.com-20141014163213-wqbwpf1ok2kvo1om ) */SHOW VARIABLES WHERE Variable_name ='language' OR Variable_name = 'net_write_timeout' OR Variable_name = 'interactive_timeout' OR Variable_name = 'wait_timeout' OR Variable_name = 'character_set_client' OR Variable_name = 'character_set_connection' OR Variable_name = 'character_set' OR Variable_name = 'character_set_server' OR Variable_name = 'tx_isolation' OR Variable_name = 'transaction_isolation' OR Variable_name = 'character_set_results' OR Variable_name = 'timezone' OR Variable_name = 'time_zone' OR Variable_name = 'system_time_zone' OR Variable_name = 'lower_case_table_names' OR Variable_name = 'max_allowed_packet' OR Variable_name = 'net_buffer_length' OR Variable_name = 'sql_mode' OR Variable_name = 'query_cache_type' OR Variable_name = 'query_cache_size' OR Variable_name = 'license' OR Variable_name = 'init_connect' {code}",1 +"DM-1603","12/02/2014 13:09:09","Qserv scons scripts do not pick up the version of swig provided by eups","See the summary.
The consequence is that the Qserv integration tests fail on systems that provide swig 2.x+ - there seems to be some implicit dependency on SWIG 1.x. The reason may be that swig is getting confused about shared_ptr to objects that are defined in one module, but used in another (recent swig reorganization split czar and css into two separate swig modules).",2 +"DM-1607","12/02/2014 19:16:27","Add unit tests to test c-style comments in/around queries","I should have thought about it when doing DM-1601 but I didn't... it'd be good to add test queries that test comments before/after/inside a query.",1 +"DM-1608","12/03/2014 11:06:30","Move meas_algorithms unit tests to meas_base framework","The following tests in meas_algorithms need to be ported to the meas_base framework: measure.py psfSelectTest.py testPsfDetermination.py ticket2019.py testCorrectFluxes.py (though this cannot be done until the algorithm exists)",2 +"DM-1614","12/03/2014 14:53:03","Add support for mysql JDBC driver (v2)","MySQL client 4.1 and higher is stripping out comments before sending them to the server, so the fixes done in DM-1539 are not sufficient.",1 +"DM-1615","12/03/2014 15:48:52","Design and implement CSS structure for distributed Qserv setup","For management of the distributed databases/tables we need info in CSS about all workers and tables. The info will be created by the data loader and updated by the replicator, which do not exist yet. For this issue we need to provide a python API which can fill the same information in CSS so that we can build and test other pieces needed for this epic.",5 +"DM-1616","12/03/2014 15:56:25","Implement remote host access for management framework","To manage remote workers we need a way to access services on remote machines that run workers. Services may be something like mysql (which we would prefer to run without a TCP port open) and optionally xrootd. This ticket will implement a Python module which will hide the complexity of doing things like ssh/port forwarding/authentication from the client.",5 +"DM-1617","12/03/2014 16:05:22","Client library for accessing distributed database/table information from CSS","Provide a Python interface for accessing information in CSS which is relevant to distributed management, such as database/table/node data. This interface can be used also by the data loader and replicator.",8 +"DM-1621","12/03/2014 18:00:56","Add unit test to verify zookeeper versioning works","This is a follow up to DM-1453; we need to add a unit test that will prove that mismatched versions in zookeeper and software are properly handled.",1 +"DM-1626","12/03/2014 18:12:23","Build 2014_12 Qserv release","The title says it all. Please open a ticket for the next release when closing this one.",1 +"DM-1629","12/03/2014 22:55:57","Adopted/Retired RFCs are not counted as resolved","Also, marking an RFC as Adopted brings up a box with a message applicable to the Closed status.",1 +"DM-1630","12/03/2014 22:58:23","New RFCs should result in dm-devel E-mails and HipChat postings","Email notices of new RFCs filed with a DM component go to dm-devel. Email notices of new RFCs filed with a DM component go to the Data Management chat room. Email notices of all new RFCs go to the Bot: RFC room. ",1 +"DM-1632","12/04/2014 11:33:57","Build 2014_11 Qserv release","The title says it all. Please open a ticket for the next release when closing this one.
Tasks to do: - publish last buildbot build under a temporary eups-tag (""qserv-dev"") and test it; if it works fine: - create git-tags for Qserv and dependencies - publish the release with eups-tags ""qserv"" and ""YYYY_MM"" - generate and publish the doc for release ""YYYY_MM"" - update release number in Qserv code and set ""YYYY_MM+1"" as release in dev and ""YYYY_MM"" as stable release (update {{admin/bin/qserv-version.sh}}, {{doc/source/conf.py}}, {{doc/source/toplevel.rst}}) - generate and publish the doc for release ""YYYY_MM+1"" - look at the doc - commit {{admin/bin/qserv-version.sh}}, {{doc/source/conf.py}}, {{doc/source/toplevel.rst}} with current ticket number This procedure should be validated and documented.",1 +"DM-1635","12/04/2014 14:31:37","Remove redundant CORS headers from Firefly's http response","Make sure CORS related headers are not sent when the Origin header is missing. Firefox does not like it.",1 +"DM-1653","12/11/2014 12:32:38","Extend data loading script to support multi-node setup","Implement the simplest use case for data loading with one or more worker databases separate from the czar database. Simplest means minimal functionality as far as access to workers is concerned; just assume for now that we can connect to every worker directly using a regular TCP connection. It should be possible to just add a list of worker nodes as an option to the loader script and send the chunks in some random order to the workers in that list. Of course chunks for different tables should end up on the same host, so some form of chunk management is needed (use for now whatever is defined in the CSS doc on trac).",8 +"DM-1655","12/12/2014 12:49:08","Working with SLAC on definition of metadata store","Follow up metadata store schema development to ensure SUI will be able to use it. Define the fields that should go into the Data Definition table. Define the fields that must be present in the image metadata table, which SUI will be searching.",2 +"DM-1658","12/12/2014 15:51:28","Add git bisect tool for Qserv repos","{code:bash} fjammes@clrlsst-dbmaster-vm:~/src/qserv (u/fjammes/DM-627 $%) $ qserv-test-head.sh -h Usage: qserv-test-head.sh [options] Available options: -h this message -q quick: only rebuild/install new Qserv code, and perform test case #01 Rebuild from scratch, configure and run integration tests against a Qserv git repository. Pre-requisite: source loadLSST.bash setup qserv_distrib -t qserv setup -k -r ${QSERV_SRC_DIR} Can be used with 'git bisect' : cd ${QSERV_SRC_DIR} git bisect start git bisect bad git bisect good git-commit-id git bisect run /home/fjammes/src/qserv_testdata/bin/qserv-test-head.sh {code} Code is in DM-627 ticket branch.",2 +"DM-1659","12/12/2014 16:42:48","Aperture Flux back into an abstract class","The last change to ApertureFlux to make it work with the new C++ design changed ApertureFluxAlgorithm into an instantiable class. However, I have now figured out how to make this work with SWIG while still allowing measure and fail to be defined by default at the ApertureFlux level. So this issue is to put things back in order.",1 +"DM-1661","12/12/2014 18:47:03","czar log4cxx link/load bug","Under ubuntu 14.04 (at least), the czar falls over at load time with an unresolved symbol for typeinfo for log4cxx::helpers::ObjectPtrBase while loading the css python wrapper shared lib.",2 +"DM-1667","12/16/2014 16:22:02","Install PyMySQL and use it to connect to local Qserv.","Used ""pip"" to install it. ""Conda"" should work as well.
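For reference, a minimal connection sketch (mirroring the session in DM-1668; the host, user, and database names are placeholders):
{code}
import pymysql

# Port 4040 is the Qserv proxy; credentials here are illustrative only.
conn = pymysql.connect(host='127.0.0.1', port=4040, user='qsmaster', passwd='', db='LSST')
cur = conn.cursor()
cur.execute('SELECT COUNT(*) FROM DeepCoadd')
print(cur.fetchone())
{code}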
Therefore, it should be easy to make it part of the delivered system: VM, container, tar file, after the fact download, etc. It has documentation, uses the MIT license, is under active development, and is available from PyPI. DB connection is straightforward and requires little experience to get meaningful work done.",2 +"DM-1668","12/16/2014 16:23:43","Create SQL code to read Qserv into Python Pandas data frame","This works well, at least for a simple case. You can move directly from a query statement to a Pandas data frame for analysis in just a few lines of code. Here is the start of an iPython Qserv session showing how easy it is. In [6]: import pandas as pd In [7]: import pymysql as db In [8]: conn = db.connect(host='lsst-db1.ipac.caltech.edu',port=4040, user='qsmaster', passwd='', db='LSST') In [11]: df = pd.read_sql(""select deepCoaddId, tract, patch, ra, decl from DeepCoadd"", conn) In [12]: df Out[12]: deepCoaddId tract patch ra decl 0 26607706 0 406,11 0.669945 1.152218 1 26673242 0 407,11 0.449945 1.152218 2 26804242 0 409,2 0.011595 -0.734160 3 26673154 0 407,0 0.449945 -1.152108 … ",2 +"DM-1670","12/16/2014 16:26:12","Begin looking at how Python Pandas can be used for LSST data analysis.","Pandas is well integrated with the other parts of SciPy: numpy, matplotlib, etc. It’s a good candidate for data analysis, especially where time series are involved. However, there are no multidimensional columns, poor metadata support for FITS files and a need to use masks instead of NaN values. These may, or may not, be problems. There is a 400 page book about Pandas, so it will take some further time to learn its value, especially with astronomical data in different situations.",5 +"DM-1673","12/16/2014 20:45:01","Allow SWIG override for broken SWIG installations","Dependency on SWIG 2.0+ was introduced into Qserv, and this broke Qserv building on systems relying on SWIG 1.3.x. This ticket introduces basic code to override SWIG_LIB on those systems to allow use of the broken installation (some SWIG search paths are fixed during its build process otherwise).",1 +"DM-1685","12/17/2014 17:22:32","Minor bug in a test","tests/centroid.py has a bug in testMeasureCentroid: ""c"" is undefined in the following bit of code: {code} if display: ds9.dot(""x"", c.getX(), c.getY(), ctype=ds9.GREEN) {code}",1 +"DM-1695","12/17/2014 20:13:14","Implement interfaces for Data Access Services","Implement proof of concept, skeleton of the prototype. The work will continue in follow up stories in February and in S15.",8 +"DM-1705","12/18/2014 13:22:14","S15 Tune Qserv","Fix scalability and performance issues uncovered through large scale tests DM-1704",100 +"DM-1706","12/18/2014 13:24:21","S15 Analyze Qserv Performance","Final analysis of Qserv performance, measure KPIs. Based on LDM-240, we are aiming to demonstrate: * 50 simultaneous low volume queries, 18 sec/query * 5 simultaneous high-volume queries, 24 h/query * data size: 10% of DR1 level. * Continuous running for 24 h with no software failures. ",5 +"DM-1709","12/18/2014 15:07:07","Implement result sorting for integration tests","We need to be able to sort results, because we can't always rely on ORDER BY. So we need per-query formatting in the integration tests (sort results for some, don't sort for others, etc.)
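A hedged sketch of the comparison step (a hypothetical helper, not existing test code):
{code}
def normalize(result_text, sort_rows=True):
    # Sort result rows textually when the query carries no ORDER BY, so that
    # Qserv and reference-MySQL outputs compare row-order-independently.
    rows = result_text.splitlines()
    return sorted(rows) if sort_rows else rows
{code}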
The following queries have been disabled because we don't have result sorting, so once it is implemented, we will need to re-enable them prior to closing this ticket: {code} case02/queries/0003_selectMetadataForOneGalaxy_withUSING.sql case02/queries/3001_query_035.sql case02/queries/3008_selectObjectWithColorMagnitudeGreaterThan.sql case02/queries/3011_selectObjectWithMagnitudes.sql case02/queries/3011_selectObjectWithMagnitudes_noalias.sql {code}",8 +"DM-1710","12/18/2014 16:04:45","ValueError in lsst.afw.table.Catalog.extend()","{code} from lsst.afw.table import BaseCatalog, Schema s = Schema() c1 = BaseCatalog(s) c2 = BaseCatalog(s) c1.extend(c2) {code} The above fails, saying: {code} Traceback (most recent call last): File ""test.py"", line 7, in c1.extend(c2) File ""/Users/jds/Projects/Astronomy/LSST/stack/DarwinX86/afw/10.0+3/python/lsst/afw/table/tableLib.py"", line 6909, in extend _tableLib.BaseCatalog_extend(self, iterable, deep) ValueError: invalid null reference in method 'BaseCatalog_extend', argument 3 of type 'lsst::afw::table::SchemaMapper const &' {code}",1 +"DM-1713","12/18/2014 20:49:43","S15 Image & File Archive v2","System for tracking existing image data sets integrated with metadata services.",5 +"DM-1715","12/18/2014 22:23:17","Disable query killing","Apparently killing a query through Ctrl-C is confusing xrootd. Disable query killing (which seems to be only partly implemented).",1 +"DM-1720","12/19/2014 14:07:22","Make secondary index for director table only","Following discussion on qserv-l, we only need to generate the ""secondary"" index for the director table; no other table is supposed to have it. Need to modify the data loader to recognize which table is the director table and generate the index only for that table. ",2 +"DM-1721","12/19/2014 14:13:30","S15 Improve Query Coverage in Qserv","Query coverage in the qserv integration testing is very limited; we have been turning off more and more queries as we made the qserv code and the data loader more strict. This epic covers work (fixes and improvements) related to * re-enabling test queries marked as ""fixme"" (when it makes sense; some queries are for features that are not implemented yet) * adding more queries to test interfaces and features that are implemented but are not currently tested.",40 +"DM-1731","12/31/2014 12:11:44","fix table file handling of MANPATH in dependencies","As discussed on DM-1220, the table files for: - mysqlproxy - protobuf - lua - expat should have the MANPATH entry removed entirely, while: - xrootd should have "":"" added to the end of its MANPATH value, to allow the default paths to be searched as well.",1 +"DM-1733","01/05/2015 14:46:48","Build 2015_01 Qserv release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-1735","01/06/2015 10:16:17","Have newinstall.sh check itself against distrib version","We want to alert people who are just using a newinstall.sh they have lying around (old or hacked up or...) that they are not using the official server version. ",1 +"DM-1738","01/06/2015 14:59:08","deblender artifacts in noise-replaced images","We still see noise artifacts in some deblended images on the LSST side when running the M31 HSC data. They look like the result of running NoiseReplacer on HeavyFootprints in which the children can extend beyond the parents.
This was fixed on the HSC side on DM-340 (before the HSC JIRA split off), and I *think* we just need to transfer the fix to LSST.",1 +"DM-1743","01/07/2015 18:26:54","CSV reader for Qserv partitioner doesn't handle no-escape and no-quote options properly","Both the no-quote and no-escape CSV formatting command line options should not have a default value, as specifying any value turns off field escaping and quoting. Furthermore, when quoting is turned off, the reader incorrectly treats embedded NUL characters as a quote character.",1 +"DM-1744","01/08/2015 12:37:07","Fix SWIG_SWIG_LIB empty list default value","See Serge message to Qserv-l ""xrootd premature death"": {quote} However, there are bigger problems. First of all, master doesn’t build for me. I get this error: File ""/home/lsstadm/qserv/SConstruct"", line 104: env.Alias(""dist-core"", get_install_targets()) File ""/home/lsstadm/qserv/SConstruct"", line 90: exports=['env', 'ARGUMENTS']) File ""/home/lsstadm/stack/Linux64/scons/2.3.0+1/lib/scons/SCons/Script/SConscript.py"", line 609: return method(*args, **kw) File ""/home/lsstadm/stack/Linux64/scons/2.3.0+1/lib/scons/SCons/Script/SConscript.py"", line 546: return _SConscript(self.fs, *files, **subst_kw) File ""/home/lsstadm/stack/Linux64/scons/2.3.0+1/lib/scons/SCons/Script/SConscript.py"", line 260: exec _file_ in call_stack[-1].globals File ""/home/lsstadm/qserv/build/SConscript"", line 39: canBuild = detect.checkMySql(env) and detect.setXrootd(env) and detect.checkXrootdLink(env) File ""/home/lsstadm/qserv/site_scons/detect.py"", line 225: xrdLibPath = findXrootdLibPath(""XrdCl"", env[""LIBPATH""]) File ""/home/lsstadm/qserv/site_scons/detect.py"", line 213: if os.access(os.path.join(path, fName), os.R_OK): File ""/home/lsstadm/stack/Linux64/anaconda/2.1.0/lib/python2.7/posixpath.py"", line 77: elif path == '' or path.endswith('/'): which is caused by the fact that env[“LIBPATH”] looks like: [[], '/home/lsstadm/stack/Linux64/antlr/2.7.7/lib', '/home/lsstadm/stack/Linux64/boost/1.55.0.1.lsst2/lib', '/home/lsstadm/stack/Linux64/log4cxx/0.10.0.lsst1+2/lib', '/home/lsstadm/stack/Linux64/xrootd/4.0.0rc4-qsClient2/lib', '/home/lsstadm/stack/Linux64/zookeeper/3.4.6/c-binding/lib', '/home/lsstadm/stack/Linux64/mysql/5.1.65.lsst1/lib', '/home/lsstadm/stack/Linux64/protobuf/2.4.1/lib', '/home/lsstadm/stack/Linux64/log/10.0+3/lib'] The first element is [], which comes from https://github.com/LSST/qserv/blob/master/site_scons/state.py#L173 where a PathVariable called SWIG_SWIG_LIB is given a default value of []. I can fix the build by changing the default to an empty string… but I don’t know enough scons to say whether that’s the right thing to do. Can one of the scons gurus confirm that’s the right fix? {quote}",1 +"DM-1754","01/09/2015 18:14:10","Update auto build tool to work with new split repositories ","After the repository split, changes are required to get the auto build tool to work properly. Firefly and Firefly based applications are built using Gradle system. ",8 +"DM-1761","01/13/2015 08:56:06","Provide input data for exampleCmdLineTask.py","{{pipe_tasks/examples/exampleCmdLineTask.py}} reads data from a repository. 
The comments in {{pipe_tasks/python/lsst/pipe/tasks/exampleCmdLineTask.py}} suggest {code} # The following will work on an NCSA lsst* computer: examples/exampleCmdLineTask.py /lsst8/krughoff/diffim_data/sparse_diffim_output_v7_2 --id visit=6866601 {code} There are a few problems with that: * External contributors don't have access to {{lsst*}}; * Even though that data exists now, it's unclear how long it will remain there, or what steps are being taken to preserve it; * The mention of this data is fairly well buried -- it does appear in the documentation, but it's certainly not the first thing a new user will stumble upon. At least the first two points could be addressed by referring to a publicly available data repository. For example, the following works once {{afwdata}} has been set up: {code} examples/exampleCmdLineTask.py ${AFWDATA_DIR}/ImSim --id visit=85408556 {code} Although this has the downside of only providing a single image.",1 +"DM-1762","01/13/2015 09:49:03","Export SUI data (DC_W13_Stripe82_subset)","- import sui.sql.bzip2.out (produced by Serge) into MySQL for DeepSource and DeepForcedSource tables: - remove columns chunkId and subChunkId for each chunk table - merge all chunk tables into the main table - join DeepSource and DeepForcedSource to add coordinates of the DeepSource (director) object in the DeepForcedSource table. Then dump DeepSource and DeepForcedSource to files DeepSource.csv and DeepForcedSource.csv {code:sql} SELECT f.*, COALESCE(s.ra, f.ra), COALESCE(s.decl, f.decl) FROM DeepForcedSource f LEFT JOIN DeepSource s ON (f.deepSourceId = s.deepSourceId) INTO OUTFILE '/db1/dump/DeepForcedSource.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '""' LINES TERMINATED BY '\n'; {code} - Load this file using the Qserv loader. A sample should be made and tested first to validate this procedure. This sample could be added to qserv_testdata",3 +"DM-1770","01/13/2015 12:50:27","Support DDL in MetaServ - design","DDL information is embedded as comments in the master version of the schema (in ""cat"" repo). Currently we are only using it for the schema browser. This story involves designing the procedure for loading DDL information into MetaServ. We need to be ready to support a variety of scenarios: * we are getting an already preloaded database, need to just load metadata about it to metaserv (we might have the original ascii file with extra information, or not) * we are starting from scratch, need to initialize database (including loading schema), and need to load the information to the metaserv * we already have the database and metadata in metaserv, but we want to change something (eg. alter table, or delete table, or delete database).",2 +"DM-1771","01/13/2015 15:02:01","move executionOrder from plugin config class to plugin class","We originally put the executionOrder parameter (which determines when a plugin is run, relative to others), in the config object, simply because that's where it was in the old framework.
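(In code terms, the proposed move is roughly the following hedged sketch; the class names are placeholders, not the actual meas_base types:)
{code}
# Illustrative only; these are placeholder classes, not the real meas_base types.
class ExampleFluxConfig(object):
    pass  # executionOrder no longer lives in the config

class ExampleFluxPlugin(object):
    # Fixed by which inputs the algorithm needs, so it belongs on the
    # plugin class rather than in user-editable configuration.
    executionOrder = 2.0
{code}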
But it's really not something that should be configurable, as it depends only on the inputs the algorithm needs, which don't change.",1 +"DM-1783","01/16/2015 15:06:25","fix faint source and minimum-radius problems in Kron photometry","This transfers some improvements to the Kron photometry from the HSC side: - HSC-983: address failures on faint sources - HSC-989: fix the minimum radius - HSC-865: switch to determinant radius instead of semimajor axis - HSC-962: bad radius flag was not being used - HSC-121: fix scaling in forced photometry The story points estimate here is 50% of the actual effort, as the work (already done) also benefited HSC.",5 +"DM-1785","01/16/2015 19:25:55","Add rotAngle to baseline schema","Add ""rotAngle DOUBLE"" to every table that has image ra/decl. ",1 +"DM-1792","01/20/2015 11:02:20","Update documentation and automatic install script w.r.t. new newinstall.sh script","newinstall.sh script has evolved and breaks Qserv install procedure.",1 +"DM-1793","01/20/2015 11:21:45","remove reference data members from FootprintFunctor","The FootprintFunctor class uses references for data members, which could cause memory problems if the class (or a subclass) is ever initialized with a temporary. Fixing this would probably require changing the constructor to take a shared_ptr, however, so it would break a lot of downstream code. I'd rather actually rewrite FootprintFunctor entirely (one of the goals for Epic DM-1107), but it's not clear when that will happen; if it slips too much, this issue is to remind us to fix at least this problem.",0 +"DM-1797","01/21/2015 10:06:46","Package flask","The Data Access Webservice APIs are relying on flask, so we need to package flask according to the LSST standards. For my initial testing, I just run ""sudo aptitude install python-flask"". ",1 +"DM-1802","01/21/2015 15:46:09","remove unused local typedefs","gcc 4.8 now warns about locally-defined typedefs that aren't used. We have a few of these in ndarray and afw::gpu that should be removed.",1 +"DM-1803","01/21/2015 16:32:13","S15 Explore Qserv Authorization","Explore authorization centrally: use information generated by parser. Either generate dummy query and run on mysql that runs near czar, or use info produced by parser to determine if user is authorized. Note, we want to limit this to ~1 week, just to reveal potential problems, or do a quick proof of concept.",8 +"DM-1810","01/22/2015 12:49:53","segfaults in ip_diffim on gcc 4.8","I'm seeing test segfaults in ip_diffim on gcc 4.8, similar to those resolved on DM-1725, but with no similar smoking gun yet. Preliminary indication is that the problem is actually in meas_algorithms.",2 +"DM-1812","01/23/2015 10:28:09","Determine LSE-130 impact of collimated projector calibration plan","During a working meeting with Robert Lupton and Chris Stubbs, determine the impact on LSE-130 of the introduction of the collimated projector for calibration.",8 +"DM-1814","01/23/2015 10:30:39","Support Camera CD-2 (mainly re: LSE-130)","Provide slides and other information needed for CD-2, mainly relative to the open questions around LSE-130",2 +"DM-1816","01/23/2015 10:33:43","Convert LSE-130 to SysML","Following CCB recommendation of approval of LSE-130 draft, convert Word draft to SysML and provide a docgen to Robert McKercher for final posting. 
",2 +"DM-1818","01/23/2015 11:35:40","Support completion of final document","Based on CCB approval of LSE-72 on 10 October, support the completion of the final copy of the document for posting on Docushare.",1 +"DM-1819","01/23/2015 11:53:38","Complete LSE-140 work as needed to produce final document","Complete any review-driven revisions of LSE-140 and support the CCB meeting and following final document preparation.",2 +"DM-1820","01/23/2015 11:57:39","LSE-140: Collect desired changes for future release","Prepare for a future revision (Phase 3) of LSE-140. Collect issues to be addressed in the revision. Determine if any affect Phase 2 scope (which would require a prompt revision). It is not anticipated that there will be an actual revision of LSE-140 during the Winter 2015 cycle, because additional detail on calibration requirements will not be available in time.",1 +"DM-1821","01/23/2015 12:05:46","Clarify scope of DM data quality analysis requirement","Clarify in LSE-140 that the DM data quality analysis referred to is primarily that of the Level 1 data products.",0 +"DM-1824","01/23/2015 16:14:23","Define issues to be addressed","Work with TCS contacts (Jacques Sebag, Paul Lotz, etc.) to define the principal issues",1 +"DM-1841","01/26/2015 13:10:12","Fix query error on case03: ""SELECT scienceCcdExposureId FROM Science_Ccd_Exposure_Metadata"" ","Xrootd prevents the worker to return more than 2MB data. On GB-sized data: {code} mysql --host=127.0.0.1 --port=4040 --user=qsmaster --batch -e ""SELECT scienceCcdExposureId FROM Science_Ccd_Exposure_Metadata"" ERROR 4120 (Proxy) at line 1: Error during execution: -1 Ref=1 Resource(/chk/qservTest_case03_qserv/1234567890): 20150123-16:27:45, Error merging result, 1420, Result message MD5 mismatch (-1) {code} On integration test case 04: {code} qserv@clrinfoport09:~/src/qserv (u/fjammes/DM-1841 *)⟫ mysql --host=127.0.0.1 --port=4040 --user=qsmaster qservTest_case04_qserv -e ""SELECT * FROM DeepForcedSource"" ERROR 4120 (Proxy) at line 1: Error during execution: -1 Ref=1 Resource(/chk/qservTest_case04_qserv/6970): 20150204-16:23:43, Error merging result, 1420, Result message MD5 mismatch Ref=2 Resource(/chk/qservTest_case04_qserv/7138): 20150204-16:23:43, Error merging result, 1420, Result message MD5 mismatch Ref=3 (-1) {code}",5 +"DM-1843","01/27/2015 09:47:02","Permit PropertySets to be represented in event payloads","In the old marshalling code, property sets were representable within the payload of the event. This was removed in the new marshalling scheme. There are things (ctrl_orca) that still used this, so this needs to be added to the new marshaling code. At the same time, new new filtering code can not allow this to be added, because the JMS headers only take simple data types.",2 +"DM-1844","01/27/2015 11:13:51","Test Qserv on SL7","Needed to run Qserv on CC-IN2P3 cluster.",2 +"DM-1854","01/27/2015 22:56:11","SUI propose a structure definition for user workspace","Workspace is an integral part of SUI. We want to start the discussion and definition of workspace concept and structure. SUI team had several discussions and Xiuqin presented the results at the DM AHM at SLAC. The slides and the discussion notes are here: https://confluence.lsstcorp.org/display/DM/Workspace+discussion",20 +"DM-1860","01/28/2015 00:28:21","Update documentation for v10_0 release","All done bar obtaining some release notes. 
",2 +"DM-1868","01/28/2015 10:17:49","Define JSON Results for Data Access Services","As discussed at [Data Access Hangout 2015-02-23|https://confluence.lsstcorp.org/display/DM/Data+Access+Hangout+2015-02-23], we should support json format. This story covers defining structure of JSON results for Data Access Services (dbserv, imgserv, metaserv) ",3 +"DM-1873","01/28/2015 19:29:35","SUI 2D data visualization (XY plot)","Better algorithm in spatial binning to visualize large number of catalog sources Plot histogram for tabular data Plot basic light curve ",40 +"DM-1875","01/28/2015 23:35:15","SUI infrastructure implementation","Identify the hardware resources needed at NCSA for short term development and Set up the basic git repository and build system Explore multi resolution images display for background iamge",40 +"DM-1878","01/29/2015 09:21:33","Collect, understand, and define more use cases","This is an on-going effort. The collected use cases will be posted at confluence page https://confluence.lsstcorp.org/pages/viewpage.action?pageId=41784036. ",20 +"DM-1880","01/29/2015 11:20:19","Implement RESTful interfaces for Database (GET)","Implement RESTful interfaces for Database (see all D* in https://confluence.lsstcorp.org/display/DM/API), based on the first prototype developed through DM-1695. The work includes adding support for returning appropriately formatted results (support the most common formats). This covers ""GET"" type requests only, ""POST"" will be handled separately.",5 +"DM-1885","01/29/2015 11:57:21","Contribute to the workspace capability discussion ","This include past experience, collection of use cases. ",2 +"DM-1887","01/29/2015 17:34:40","HDF5 file format study","Xiquin, Loi, Trey, and myself discussed HDF5 as a default format to return result set and metadata from lower-level database services vs. traditional IPAC table. Here is the summary: Advantages of IPAC Table format - Simple and human-readable, contains a single table - Fixed length rows (easy to page through) - Supported by many astronomical tools - Provides a way to pass data type, units, and null values in the header - More metadata can be added through keywords (attributes) Disadvantages of IPAC table format - Steaming can not be started before all data are received – need to know column width before the table can be written (csv is better alternative) - Only alpha-numeric and '_' characters are allowed in column names (small subset of available characters) - Only predefined datatypes and one attribute type (string) - ASCII representation requires about twice as much storage to represent floating-point number data than the binary equivalent. Advantages of HDF5 - Can represent complex data and metadata (according to LOFAR, good to represent time series) - Structured data, arbitrary attribute types, datatypes can be combined to create structured datatypes - Flexible datatypes: can be enumerations, bit strings, pointers, composite datatypes, custom atomic datatypes - Access time and storage space optimizations - Partial I/O: “Chunked” data for faster access - Supports parallel I/O (reading and writing) - Built-in compression (GNU zlib, but can be replaced with others) - Existing inspection and visualization tools (HDFView, MATLAB, etc.) Disadvantages of HDF5 - Complex - Tuned to do efficient I/O and storage for ""big"" data (hundreds of megabytes and more), not efficient for small reads/writes. - Requires native libraries (available in prepackaged jars, see below) - Not human readable - (?) 
Not yet widely supported by astronomical tools (counter-examples: AstroPy, IDL, more at hdfgroup site) Tools and Java wrappers: * JHI5 - the low level JNI wrappers: very flexible, but also quite tedious to use. * Java HDF object package - a high-level interface based on JHI5. * HDFView - a Java-based viewer application based on the Java HDF object package. * JHDF5 - a high-level interface building on the JHI5 layer which provides most of the functionality of HDF5 to Java. The API has a shallow learning curve and hides most of the house-keeping work from the developer. You can run the Java HDF object package (and HDFView) on the JHI5 interface that is part of JHDF5, so the two APIs can co-exist within one Java program. (from StackOverflow answer, 2012) * NetCDF-Java is a Pure Java Library, that reads HDF5. However, it's hard to keep pure java version up-to-date with the standard, does not support all the features. A way to set up native libraries (3rd option from JHDF5 FAQ): ""Use a library packaged in a jar file and provided as a resource (by putting the jar file on the class path). Internally this uses the same directory structure as method 2., but packaged in a jar file so you don't have to care about it. Jar files with the appropriate structure are cisd-jhdf5-batteries_included.jar and lib/nativejar/.jar (one file for each platform). This is the simplest way to use the library."" ",1 +"DM-1897","01/30/2015 13:06:15","Modify CSS structure to support table deletion","Modify CSS structures to support DROP TABLE, as defined in DM-1896.",2 +"DM-1900","01/30/2015 13:20:39","Worker management service - design","We need to replace direct worker-mysql communication and other administrative channels with a special service which will control all worker communication. Some light-weight service running alongside other worker servers, probably HTTP-based. Data loading, start/stop should be handled by this service.",5 +"DM-1901","01/30/2015 13:25:08","Re-implement data loading scripts based on new worker control service","Once we have new service that controls worker communication we'll need to reimplement WorkerAdmin class based on that.",8 +"DM-1903","02/02/2015 06:35:40","Implementation of calibration transformation framework","Following DM-1598 there will be a detailed design and prototype implementation for the calibration & ingest system. This issue covers cleaning up that code, documenting it, having it reviewed, and merging to master.",2 +"DM-1904","02/02/2015 10:20:18","Continued footprint improvements","A redesigned API and support for topological operations within the Footprint class. This continues the work started in DM-1107 in W15. Breakdown: jbosch 15%; swinbank 85%",8 +"DM-1917","02/02/2015 16:35:02","Fix missing virtual destructors","The compiler is warning about some derived class hierarchies that are lacking virtual destructors. We should add at least empty implementations to the base classes of these hierarchies.",1 +"DM-1919","02/02/2015 16:55:46","Address misc. compiler warnings","Fix places where compiler is warning about some things we are doing on purpose and which we don't intend to change. 
This helps keep compiler noise down so it's easier to notice ""real"" warnings.",1 +"DM-1943","02/04/2015 13:11:15","HSC backport: convert Peak to PeakRecord","This issue covers transferring all changesets from [HSC-1074|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1074] and its subtasks, as well as: - An RFC to propose the API change, and any requested modifications generated by the RFC. - Additional fixes to downstream code that's broken by this change (HSC-side changesets should be present for most of the downstream fixes, but perhaps not all).",8 +"DM-1945","02/04/2015 13:36:48","HSC backport: multiband processing for coadds","This issue includes transferring changesets from many HSC issues: - HSC-1060 - HSC-1064 - HSC-1065 - HSC-1061 Most of this is in multiBand.py in pipe_tasks, but there are scattered changes elsewhere (including updates to camera mappers to include the new datasets, for which we'll need to modify more than just obs_subaru). However, before we make these changes, we'll need to open an RFC to gather comments on the design of this task. We should qualify there that this is not a long-term plan for consistent multiband processing (which we'll be starting to design on DM-1908), but a step towards better processing in the interim. Note: while I've assigned this to [~lauren], as I think it will be very helpful for her to get familiar with this code by doing the transfers, the RFC will have to involve a collaboration with [~jbosch], [~price], and Bob Armstrong, as we can't expect someone who wasn't involved in the design to be able to write a document justifying it.",8 +"DM-1952","02/04/2015 16:13:25","Change log priority for message ""Unknown column 'whatever' in 'field list'"" ","The following message should be logged with ERROR priority: {code} 0204 15:08:03.748 [0x7f1f4b4f4700] INFO Foreman (build/wdb/QueryAction.cc:250) - [1054] Unknown column 'whatever' in 'field list' {code}",0.5 +"DM-1953","02/04/2015 18:37:18","Post meas_base move changes to Kron","These are to note leftovers from DM-982. They could be done in a single issue. 1. I commented out code referring to correctfluxes, but it will need to be restored once it is available in the new framework. 2. Jim asked me to replace the computeSincFlux which is currently in PsfImage.cc in meas_algorithms with a similar call in meas_base/ApertureFlux.cc. I did not do this because it became rather complicated, and can just as easily be done when the meas_algorithms routine is moved or removed. Basically, the templating in ApertureFlux is on Pixel type, whereas in meas_algorithms it is on ImageT (where ImageT is not necessarily a single class hierarchy -- e.g., Image and MaskedImage). So I left this for now.",1 +"DM-1954","02/04/2015 23:47:11","HSC backport: deblended HeavyFootprints in forced photometry","This is a transfer of changesets for [HSC-1062|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1062]. Unlike most of the HSC backport issues for multiband deblending, these changes will require significant modification on the LSST side, because we need to apply them to the new forced measurement framework in meas_base rather than the old, HSC-only one in meas_algorithms and pipe_tasks. 
Also include [HSC-1256|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1256], [HSC-1218|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1218], [HSC-1235|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1235], [HSC-1216|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1216].",20 +"DM-1972","02/06/2015 15:12:29","upgrade SWIG to 3.0.8 or later","SWIG 3.0.5 is now out and has several useful fixes w.r.t. 3.0.2 (which we are presently using) including: - A bug we've had to work around involving templated methods of classes - improved handling of new-style enums (they are no longer hoisted into the global namespace, which was a serious misfeature of SWIG 3.0.2) I propose we try it out using buildbot (when we have some time), and if it works, we adopt it. Adopting it will help us relax the restrictions on what C++11 features can be used in C++ header files.",2 +"DM-1973","02/06/2015 15:56:23","Build 2015_02 Qserv release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-1974","02/06/2015 18:30:02","Fix enclose, escape, and line termination characters in qserv-data-loader","Add this string to mysql loader 'LOAD DATA INFILE' command: {code} q += ""ENCLOSED BY '%s' ESCAPED BY '%s' LINES TERMINATED BY '%s'"" % (enclose, escape, newline) {code} and add params in cfg file.",2 +"DM-1982","02/09/2015 12:30:15","Fix JDBC timestamp error","JDBC driver returns an error on next query: {code:sql} sql> select * from Science_Ccd_Exposure [2015-02-06 13:39:37] 1 row(s) retrieved starting from 0 in 927/970 ms [2015-02-06 13:39:37] [S1009] Cannot convert value '0000-00-00 00:00:00' from column 32 to TIMESTAMP. [2015-02-06 13:39:37] [S1009] Value '[B@548997d1' can not be represented as java.sql.Timestamp {code}",1 +"DM-1987","02/10/2015 23:04:48","Redesign/Refactor WCS and Coord","%50 KSK, %50 RO Currently WCS is mutable and Coord objects are heavyweight. Refactor WCS to be immutable and make Coord less heavyweight. Include lists of Coord objects. It's possible astropy could inform in that area. Also, remove TanWcs in favor of TanSipWcs since TanWcs can have SIP terms.",40 +"DM-1994","02/11/2015 14:30:29","Story point display and roll-up in epic display","I understand that there is a pending request to display the story points for individual story issues in the mini-table in which they are displayed for an epic. It would also be useful to see a rolled-up total of the story points for the defined set of stories - so that, among other things, this could be compared to the story point value for the epic. Ideally the story points for the roll-up might be displayed as ""nn (mm)"" where nn is the total points and mm is the number of points remaining to do (or done already - I don't care which as long as the definition is clear).",1 +"DM-2005","02/12/2015 18:07:11","switch ndarray to external package","There is already an external ndarray project on GitHub (we've been using a fork of that). We should merge the forks and switch to using the external package. 
",2 +"DM-2009","02/12/2015 18:12:43","Please add cbegin and cend to afw tables","It would be helpful if afw tables had the C++11 iterator methods cbegin and cend that return iterators-to-const.",2 +"DM-2029","02/13/2015 10:32:12","Update Confluence build instructions to match github move","The Build tool documentation https://confluence.lsstcorp.org/display/LDMDG/The+LSST+Software+Build+Tool refers to git clone git@git.lsstcorp.org:LSST/DMS/devenv/lsstsw.git This should be updated to reflect the move to GitHub git clone https://github.com/lsst/lsstsw.git ",0 +"DM-2050","02/18/2015 04:09:10"," Integration and test monitoring architecture Part I","[retitled to better capture cycle scope] Develop and deploy a layer to capture the outputs, initially numeric, of integration testing afterburners such as sdss_demo, hsc_demo, and others developed this cycle. Also capture meta-information such as execution time and memory footprint. Propose log format to standardise production of such informations. Investigate notification system based on trending away from expected values. Investigate data provisioning of integration tests such as storage of test data in GithubLFS. [75% JMP 25% JH] ",100 +"DM-2052","02/18/2015 04:32:29","Maintain list of OSes that pass build and integration testing ","Provide an automatiically generated and updated pages showing operating systems that are successfully building and integrating the stack from source. [FE at 75%, JH at 75%]",20 +"DM-2054","02/18/2015 04:39:22","Release engineering Part One","Bucket for public stack releases [FE at 75%, JH at 75%]",40 +"DM-2057","02/18/2015 13:36:11","Attend Scale 13x conference","Attend database talks, in particular the MaxScale proxy talk (http://www.socallinuxexpo.org/scale/13x/presentations/advanced-query-routing-and-proxying-maxscale?utm_campaign=north-american-trade-shows&utm_source=hs_email&utm_medium=email&utm_content=16099082&_hsenc=p2ANqtz-_MFjfxvpCdmV_Ax2RKDdOGypHPQ85UL-UMuy0eRs_MrlJ2qJVp-MXx-g7_-dAQsq0trpA61hkZrzO-3gp6bKVkpK52fQ&_hsmi=16099082). If anyone has questions they would like me to ask, please post them here as well. I will post notes to this issue. ",2 +"DM-2058","02/18/2015 14:14:32","Data loader should always create overlap tables"," We have discovered that some overlap tables that are supposed to exist were not actually created. It looks like partitioner is not creating overlap files when there is no overlap data and loader is not creating overlap table if there is no input file. Situation is actually symmetric, there could be non-empty overlap table but empty/missing chunk table. When we create one table we should always make another as well. ",2 +"DM-2060","02/18/2015 15:17:18","Rename TaskMsgFactory2","Rename TaskMsgFactory2 to TaskMsgFactory. Please see DM-211 for more information.",0.5 +"DM-2094","02/19/2015 17:50:20","Port metaREST.py to db","metaREST_v0.py in metaserv is currently using MySQLdb instead of going through the db API, because we need to use parameter binding for security reasons. We should switch to using db, once the db interfaces will support it. ",1 +"DM-2095","02/19/2015 18:27:38","Port dbREST.py to db","dbREST_v0.py in dbserv is currently using MySQLdb instead of going through the db API, because we need to use parameter binding for security reasons. We should switch to using db, once the db interfaces will support it. 
",1 +"DM-2096","02/19/2015 23:16:01","Long term database work planning","Long term planning (updating LDM-240).",8 +"DM-2097","02/19/2015 23:47:00","Package andyH xssi fixed version (>2MB answer pb) in eups","See DM-1847 - Andy made a patch, it'd be good to the xrootd we use for our stack.",1 +"DM-2129","02/20/2015 14:54:26","S19 Improve Query Coverage in Qserv","This epic holds budgeted effort for work directed at improving query coverage (additional or previously unsupported query types) in Qserv",40 +"DM-2131","02/20/2015 17:10:14","Resolve compiler warnings in new measurement framework","When building {{meas_base}}, or any other measurement plugins which follow the same interface, with clang, I see a bunch of warnings along the lines of: {code} In file included from src/ApertureFlux.cc:34: include/lsst/meas/base/ApertureFlux.h:197:18: warning: 'lsst::meas::base::ApertureFluxAlgorithm::measure' hides overloaded virtual function [-Woverloaded-virtual] virtual void measure( ^ include/lsst/meas/base/Algorithm.h:183:18: note: hidden overloaded virtual function 'lsst::meas::base::SimpleAlgorithm::measure' declared here: different number of parameters (4 vs 2) virtual void measure( {code} This is an artefact of a [workaround for SWIG issues|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20284390]; the warnings aren't indicative of a fundamental problem, but if we can avoid them we should. While we're at it, we should also fix: {code} include/lsst/meas/base/ApertureFlux.h:233:1: warning: 'ApertureFluxResult' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct ApertureFluxResult : public FluxResult { ^ include/lsst/meas/base/ApertureFlux.h:65:1: note: did you mean struct here? class ApertureFluxResult; ^~~~~ struct {code}",1 +"DM-2139","02/22/2015 01:24:23","Support DDL in MetaServ - implementation","DDL information is embedded as comments in the master version of the schema (in ""cat"" repo). Currently we are only using it for schema browser. This story involves building tools that will load the DDL schema into MetaServ. Design aspects are covered in DM-1770.",8 +"DM-2141","02/22/2015 15:21:45","Add meas_extensions_shapeHSM to lsstsw, lsst_distrib","meas_extensions_shapeHSM has just been resurrected from bitrot, and should be included in our distribution. 
Contrary to DM-2140, it should probably not be included in lsst_apps, as it's not clear we want to add a dependency on tmv and GalSim there.",1 +"DM-2157","02/23/2015 17:37:12","Data loader crashes on uncompressed data.","Vaikunth just mentioned to me that there is a crash in the data loader when it tries to load uncompressed data: {noformat} root - CRITICAL - Exception occured: local variable 'outfile' referenced before assignment Traceback (most recent call last): File ""/home/vaikunth/src/qserv/bin/qserv-data-loader.py"", line 312, in <module> sys.exit(loader.run()) File ""/home/vaikunth/src/qserv/bin/qserv-data-loader.py"", line 248, in run self.loader.load(self.args.database, self.args.table, self.args.schema, self.args.data) File ""/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 168, in load return self._run(database, table, schema, data) File ""/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 192, in _run files = self._gunzip(data) File ""/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 388, in _gunzip result.append(outfile) UnboundLocalError: local variable 'outfile' referenced before assignment {noformat} It looks like we never tested the loader on uncompressed data and there is a bug in handling uncompressed data. ",1 +"DM-2159","02/23/2015 19:17:07","Implement Image Response for ImgServ","This story covers implementing the proper response, and the header metadata for the fits image response.",3 +"DM-2161","02/23/2015 20:12:54","Setup webserv for SUI tests","We need to set up a service (e.g. on lsst-dev) that can be used by the IPAC team to play with our webserv/metaserv/dbserv/imgserv. The server runs on the lsst-dev machine, port 5000. To ssh-tunnel, try: {code} ssh -L 5000:localhost:5000 lsst-dev.ncsa.illinois.edu {code} An example usage: {code} curl 'http://localhost:5000/db/v0/query?sql=SHOW+DATABASES+LIKE+""%Stripe%""' curl 'http://localhost:5000/db/v0/query?sql=SHOW+TABLES+IN+DC_W13_Stripe82' curl 'http://localhost:5000/db/v0/query?sql=DESCRIBE+DC_W13_Stripe82.DeepForcedSource' curl 'http://localhost:5000/db/v0/query?sql=DESCRIBE+DC_W13_Stripe82.Science_Ccd_Exposure' curl 'http://localhost:5000/db/v0/query?sql=SELECT+deepForcedSourceId,scienceCcdExposureId+FROM+DC_W13_Stripe82.DeepForcedSource+LIMIT+10' curl 'http://localhost:5000/db/v0/query?sql=SELECT+ra,decl,filterName+FROM+DC_W13_Stripe82.Science_Ccd_Exposure+WHERE+scienceCcdExposureId=125230127' curl 'http://localhost:5000/image/v0/raw/cutout?ra=7.90481567257&dec=-0.299951669961&filter=r&width=30.0&height=45.0' {code} ",2 +"DM-2171","02/24/2015 20:55:15","Implement JSON Results for MetaServ and DbServ","Implement JSON results for the Metadata Service (see all M* in https://confluence.lsstcorp.org/display/DM/API), and the Database Service (see all D*) as defined in DM-1868.",3 +"DM-2173","02/25/2015 12:08:08","Disable testDbLocal.py in db if auth file not found","tests/testDbLocal.py can easily fail if the required mysql authorization file is not found in the user home dir. Skip the test instead of failing in such a case.",1 +"DM-2186","02/26/2015 13:56:25","Move astrometry_net wrapper code from meas_astrom to a new package","Remove all astrometry.net wrapper code from {{meas_astrom}} and put it in a new package named {{meas_extensions_astrometryNet}}. ",2 +"DM-2190","02/26/2015 14:58:07","Documentation for data loader","Vaikunth had some ""expected"" troubles playing with data loader options for his DM-1570 ticket. 
The main issue, I believe, is the absence of documented use cases and their corresponding data loader options. I'll try to add a bunch of common use cases to the RST documentation and also verify that all options behave as expected.",2 +"DM-2193","02/27/2015 11:15:38","Add assertXNearlyEqual to afw","We often want to compare two WCS for approximate equality. afw/image/testUtils has similar functions to compare images and masks and I would like to add one for WCS. This ended up being expanded to adding functions for many afw classes (not yet including image-like classes, though existing functions in image/testUtils for that purpose should probably be wrapped or rewritten on a different ticket).",5 +"DM-2199","02/27/2015 16:20:43","Build 2015_03 Qserv release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe. ",1 +"DM-2241","03/02/2015 17:40:38","Raw image definition and usage","SUI needs to serve raw data to the user community. We want to understand the use cases and definition of raw data. More specifically, what metadata will be available in the FITS file that we call a raw image? ",1 +"DM-2243","03/02/2015 22:00:41","Extend API: expose cursor","Extend the API to expose the cursor. This was brought up by Andy in DM-2137. ",1 +"DM-2257","03/03/2015 16:07:11","Allow eups xrootd install script to be relocatable","The xrootd lib/ directory should be a relative symlink to lib64, not a full path link.",1 +"DM-2270","03/04/2015 15:50:35","Move VMs to Docker containers","We anticipate being able to move from the VMs that we currently use to using docker. This will require some coordination with Greg Daues to see how HTCondor is configured. ",2 +"DM-2277","03/05/2015 15:09:34","Document HOW-TO set up krb5 for easy cluster access","{code:bash} su aptitude install krb5-user # edit /etc/krb5.conf w.r.t ccage one # then as desktop user kinit ssh ccqservxxx {code} /etc/krb5.conf {code:bash} [libdefaults] default_realm = IN2P3.FR ... allow_weak_crypto = true ... [realms] IN2P3.FR = { kdc = kerberos-1.in2p3.fr:88 kdc = kerberos-2.in2p3.fr:88 kdc = kerberos-3.in2p3.fr:88 master_kdc = kerberos-admin.in2p3.fr:88 admin_server = kerberos-admin.in2p3.fr kpasswd_server = kerberos-admin.in2p3.fr default_domain = in2p3.fr {code} sshconfig: {code:bash} Host ccqservbuild GSSAPIAuthentication yes GSSAPIDelegateCredentials yes ForwardX11 yes HostName ccqservbuild.in2p3.fr #ProxyCommand ssh -W %h:%p cc Host ccqserv1* GSSAPIAuthentication yes GSSAPIDelegateCredentials yes ForwardX11 yes HostName %h.in2p3.fr ProxyCommand ssh -W %h:%p ccqservbuild {code}",2 +"DM-2279","03/05/2015 22:21:00","Fix problems with mysql timeout","We added some code for supporting reconnecting (see https://dev.lsstcorp.org/trac/ticket/3042) but clearly not enough to recover from connection timeouts. This needs to be addressed.",1 +"DM-2280","03/06/2015 09:19:00","The TAN_PIXELS cameraGeom coordinate system should be with respect to the center of the focal plane","The TAN_PIXELS cameraGeom coordinate system (the position on a detector if there is no optical distortion) is presently defined with respect to the center of the detector -- i.e. a star at the center of the detector will have the same position in PIXELS and TAN_PIXELS coordinates. That is a mistake. TAN_PIXELS should be defined with respect to the center of the focal plane, since it then reflects the effects of having optical distortion or not. Fixing this will help meas_astrom match stars. The effects of not fixing it are making the matcher search farther for a fit. 
As long as we allow sufficient offset in the matcher config the current system will work, but it is not ideal.",2 +"DM-2281","03/06/2015 11:08:04","Implement connection pool","Implement a class that manages a connection pool, and optionally, if configured, restarts connections as needed in case of timeout.",1 +"DM-2282","03/06/2015 11:09:52","Switch to using db connection pool","Switch to using the db connection pool. Note, in addition to getting auto-reconnect, in metaserv that would be handy if we need to talk to multiple database servers simultaneously.",1 +"DM-2294","03/09/2015 13:31:01","Unable to start cmsd on Qserv worker node","Some build issues have already been fixed in commit: 9dd378829e8751a6852356967411c20580e2a1c3 Here's the log: {code:bash} [fjammes@ccqserv101 ~]$ cat /qserv/qserv-run/var/log/worker/cmsd.log 150309 21:19:46 9794 Starting on Linux 3.10.0-123.8.1.el7.x86_64 Copr. 2004-2012 Stanford University, xrd version v20140617-203cf45 ++++++ cmsd worker@ccqserv101.in2p3.fr initialization started. Config using configuration file /qserv/qserv-run/etc/lsp.cf =====> all.adminpath /qserv/qserv-run/tmp =====> xrd.port 1094 =====> xrd.network nodnr Config maximum number of connections restricted to 4096 Config maximum number of threads restricted to 2048 Copr. 2007 Stanford University/SLAC cmsd. ++++++ worker@ccqserv101.in2p3.fr phase 1 initialization started. =====> all.role server =====> ofs.osslib libxrdoss.so =====> oss.localroot /qserv/qserv-run/xrootd-run =====> cms.space linger 0 recalc 15 min 10m 11m =====> all.pidpath /qserv/qserv-run/var/run =====> all.adminpath /qserv/qserv-run/tmp =====> all.manager ccqserv100.in2p3.fr:2131 =====> all.export / nolock The following paths are available to the redirector: w / ------ worker@ccqserv101.in2p3.fr phase 1 server initialization completed. ++++++ worker@ccqserv101.in2p3.fr phase 2 server initialization started. Plugin Unable to find required version information for XrdOssGetStorageSystem in osslib libxrdoss.so ------ worker@ccqserv101.in2p3.fr phase 2 server initialization failed. 150309 21:19:46 9794 XrdProtocol: Protocol cmsd could not be loaded ------ cmsd worker@ccqserv101.in2p3.fr:1094 initialization failed {code}",2 +"DM-2305","03/11/2015 11:10:03","Measurement transforms for Flux","Provide calibration transforms for flux measurements to magnitudes.",3 +"DM-2306","03/11/2015 11:10:56","Measurement transforms for centroids","Provide calibration transforms for all algorithms measuring centroids.",5 +"DM-2307","03/11/2015 11:11:34","Measurement transforms for shapes","Provide calibration transforms for algorithms measuring shapes.",2 +"DM-2309","03/11/2015 15:12:51","Update dev quick-start guide to new git repositories","The quick-start documentation for developers still points to the old git repositories. The RST document needs to be updated to the GitHub repos.",0.5 +"DM-2312","03/12/2015 12:19:41","obs_test's table file is out of date","obs_test's table file is somewhat out of date. Problems include: - afw is required but missing - meas_algorithms and skypix are used by bin/genInputRegistry.py, which is only used to create the input repo so these can be optional - daf_persistence is not used - daf_base is only used by bin/genInputRegistry.py, so it can be optional (though it is presumably setup by daf_butlerUtils in any case)",1 +"DM-2316","03/12/2015 19:47:27","Clarify expectations for unauthenticated user data access","h4. 
Short version: Clarify what existing community practices, notably including VO interfaces, appear to rely on the availability of unauthenticated access to information in astronomical archives. h4. Details: At the February DM All Hands, [~frossie] raised an objection when it was mentioned that there is a presumption that all user access to LSST data through the DM interfaces (as opposed to through EPO) will be authenticated. We don't appear to have ever documented an explicit requirement that all access be authenticated. The basic controlling requirement is OSS-REQ-0176, ""The LSST Data Management System shall provide open access to all LSST Level 1 and Level 2 Data Products, as defined in the LSST System Requirements and herein, in accordance with LSSTC Board approved policies. ..."", which was a carefully crafted indirection at a time when the policy for non-US/Chile access was still being developed. However, this presumption has been around for a long time. It is inherent to the project policy that access to the non-Alert data will be limited to individuals who are entitled to it. No matter what we think the final policy might be, we do have to design a system that can be consistent with this policy. [~frossie] stated that the astronomical community relies on certain types of data and metadata - she mentioned coverage maps, among others - being available through unauthenticated interfaces. This ticket is to ask her (and others) to collect documentation of those existing practices, so that we can figure out what the expectations may be and how to respond to them in our design.",2 +"DM-2334","03/16/2015 09:28:21","Simplify interactions with XrdOss","The qserv code is still using the old ssi scheme for the cmsd, this needs to be rewritten. For details, see https://listserv.slac.stanford.edu/cgi-bin/wa?A1=ind1503&L=QSERV-L#3",5 +"DM-2340","03/16/2015 11:08:46","Reprise SDRP processing metrics","In support of an SDRP-based science talk of Yusra AlSayyad, we spent some cycles gathering/summarizing processing middleware results and metrics from the US side of processing of the Split DRP. This information from notes, logs, databases, etc provided contextual information on the processing campaign that produced the SDRP science results. ",2 +"DM-2341","03/16/2015 11:24:43","Use parallel ssh to manage Qserv on IN2P3 cluster","IN2P3 sysadmin won't manage Qserv through puppet. So Qserv team has to provide ssh scripts to do this. ",5 +"DM-2343","03/16/2015 13:33:59","Move afw_extensions_rgb functionality into afw proper","See RFC-32 ",1 +"DM-2347","03/16/2015 17:35:27","(In)equality semantics of Coords are confusing","Viz: {code} In [1]: from lsst.afw.coord import Coord In [2]: c1 = Coord(""11:11:11"", ""22:22:22"") In [3]: c1 == c1, c1 != c1 Out[3]: (True, False) In [4]: c2 = Coord(""33:33:33"", ""44:44:44"") In [5]: c1 == c2, c1 != c2 Out[5]: (False, True) In [6]: c3 = Coord(""11:11:11"", ""22:22:22"") In [7]: c1 == c3, c1 != c3 Out[7]: (True, True) {code} {{c1}} is simultaneously equal to *and* not equal to {{c3}}!",1 +"DM-2349","03/17/2015 08:34:11","Add unit tests to SchemaToMeta","Add unit tests, also improve variable names as suggested by K-T in comments in DM-2139",1 +"DM-2356","03/18/2015 10:15:46","Identify the hardware resources needed at NCSA for short term development ","Supply the hardware resources needed at NCSA for short term development. 
It is captured in DM-2327 ",1 +"DM-2363","03/19/2015 08:56:12","RGB code introduces dependency on matplotlib","While the new RGB code looks like it's just calling NumPy, NumPy is actually delegating to matplotlib under the hood when it writes RGB(A) arrays. It also turns out that code is broken in matplotlib prior to 1.3.1 (though that shouldn't be a problem for anyone but those who - like me - are trying to use slightly older system Python packages). I think this means we should add an optional dependency on matplotlib to the afw table file, and condition the running of the test code on matplotlib's presence (and, ideally, having the right version). I'm happy to do this myself (since I'm probably the only one affected by it right now).",1 +"DM-2364","03/19/2015 10:23:32","Revisit the choice of using flask","We should quickly revisit whether flask is the right choice for us. Related: reportedly, our simple flask-based webserver is using more CPU in an idle state than expected. It might be useful to profile things, and look into that. ",1 +"DM-2367","03/19/2015 11:10:50","run lsstswBuild.sh in a clean sandbox","The ""driver"" script, lsstswBuild.sh, used by the buildbot slave on lsst-dev to initiate a ""CI run"" has a number of environment assumptions (binaries in the $PATH, paths to various components, hostnames, etc.). This prevents it from [easily] being invoked on any other host. As lsstswBuild.sh builds a number of packages that are not in the lsst_distrib product, the OS-level dependencies for these other products need to be determined. In addition, the current version of lsstswBuild.sh and related scripts on lsst-dev are not version controlled.",8 +"DM-2380","03/23/2015 10:27:45","Retrieve HSC engineering data","HSC data becomes public 18 months after it was taken, so data taken during commissioning are now available. We would like to use this data for testing the LSST pipeline. It needs to be downloaded from Japan.",2 +"DM-2382","03/23/2015 12:34:53","Make sure the command-line parser warns loudly enough if no data found","A user recently got confused when calling parseAndRun didn't call the task's run method. It turns out there was no data matching the specified data ID. Make sure this generates a loud and clear warning.",1 +"DM-2383","03/23/2015 16:19:09","migrate package deps from sandbox-stackbuild to a proper puppet module","There is a growing list of known package dependencies in the sandbox-stackbuild repo and a need to use this information for independent environments (such as CI). This list of packages should be lifted out into an independent puppet module that can be reused.",2 +"DM-2387","03/24/2015 11:46:00","Build testQDisp.cc on ubuntu","testQDisp.cc needs the flags -lpthread -lboost_regex to build on ubuntu.",1 +"DM-2390","03/24/2015 16:08:54","Errors need to be checked in UserQueryFactory from QuerySession objects","UserQueryFactory doesn't check its QuerySession object for errors after setQuery. Thus it continues setting things up after the QuerySession knows the state is invalid.",1 +"DM-2411","03/25/2015 16:29:09","Allow qserv-admin.py to delete a node","Workers registered in CSS with qserv-admin.py currently cannot be removed (there is no DELETE NODE type command). 
Also, changing node status from ACTIVE to INACTIVE needs to be fixed.",1 +"DM-2417","03/26/2015 17:23:45","Data loader script crashes trying to create chunk table","Vaikunth discovered a bug in the data loader when trying to load data into the Object table: {noformat} [CRITICAL] root: Exception occured: Table 'Object_7480' already exists Traceback (most recent call last): File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-data-loader.py"", line 318, in <module> sys.exit(loader.run()) File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-data-loader.py"", line 254, in run self.loader.load(self.args.database, self.args.table, self.args.schema, self.args.data) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 171, in load return self._run(database, table, schema, data) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 209, in _run self._loadData(database, table, files) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 586, in _loadData self._loadChunkedData(database, table) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 653, in _loadChunkedData self._makeChunkAndOverlapTable(conn, database, table, chunkId) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/dataLoader.py"", line 727, in _makeChunkAndOverlapTable cursor.execute(q) File ""build/bdist.linux-x86_64/egg/MySQLdb/cursors.py"", line 176, in execute if not self._defer_warnings: self._warning_check() File ""build/bdist.linux-x86_64/egg/MySQLdb/cursors.py"", line 92, in _warning_check warn(w[-1], self.Warning, 3) Warning: Table 'Object_7480' already exists {noformat} It looks like I did not do enough testing after my recent improvement in creating chunk tables. It tries to create the chunk table with ""CREATE TABLE IF NOT EXISTS ..."" but that actually generates a ""warning exception"" on the mysql side when the table is already there. Need to catch this exception and ignore it.",1 +"DM-2423","03/27/2015 11:32:02","Weighting in photometric calibration is incorrect","Dominique points out that the zero point calibration uses errors not inverse errors to calculate the zero point. git annotate reveals: bq. 24c9149f python/lsst/meas/photocal/PhotoCal.py (Robert Lupton the Good 2010-12-13 05:03:12 +0000 353) return np.average(dmag, weights=dmagErr), np.std(dmag, ddof=1), len(dmag) Please fix this. At the same time, we should add a config parameter to soften the errors. ",1 +"DM-2426","03/27/2015 13:34:31","Implement SpanSet+ellipse operations","Implement the following SpanSet operations: - Construct from an ellipse - note geom::ellipses::PixelRegion; this should do most of the work. - Compute centroid - see old Footprint implementation - Compute shape (quadrupole moments) - see old Footprint implementation One complication here is that this will introduce a circular dependency between afw::geom and afw::geom::ellipses. That's easy to address at the C++ level, but it's tricky in Python (which package imports the other?) I'll be emailing dm-devel shortly to start a discussion on how to address this problem.",2 +"DM-2427","03/27/2015 13:36:53","Implement SpanSet applyFunctor methods","Implement methods that apply arbitrary functors to pixels within a SpanSet, as described on RFC-37. The only tricky part of this implementation will be the ""traits"" classes that allow different target objects to be interpreted differently. 
I'd be happy to consult on this; I have a rough idea in my head, but it needs to be fleshed out.",3 +"DM-2429","03/28/2015 12:29:12","Add aperture corrections to meas_extensions_photometryKron","When transitioning {{meas_extensions_photometryKron}} to the new measurement framework, aperture correction was omitted pending the completion of DM-85. It needs to be re-enabled when that epic is complete.",1 +"DM-2430","03/29/2015 07:48:52","Make qserv server-side log messages more standard","Qserv server-side Python logging appears to mostly use a common format: ""{{%(asctime)s %(name)s %(levelname)s: %(message)s}}"". It also mostly uses a common date format: ""{{%m/%d/%Y %I:%M:%S}}"". But I see instances of: * ""{{%(asctime)s %(levelname)s %(message)s}}"" * ""{{%(asctime)s - %(name)s - %(levelname)s - %(message)s}}"" * ""{{%(asctime)s \{%(pathname)s:%(lineno)d\} %(levelname)s %(message)s}}"" * and now, after DM-2176, ""{{%(asctime)s \[PID:%(process)d\] \[%(levelname)s\] (%(funcName)s() at %(filename)s:%(lineno)d) %(name)s: %(message)s}}"" Unless these are used in very different contexts, it will aid automated log processing for them to be more standardized. In addition, the date format is unacceptable as it does not use RFC 3339 (ISO8601) format and does not include a timezone indicator (which means the default {{datefmt}} is insufficient). This must be fixed. See also DM-1203.",1 +"DM-2435","03/29/2015 21:00:34","Reading an Exposure from disk aborts if the Psf is of an unknown type","Attempting to read an Exposure (in this case via the butler) fails if the PSF class isn't available. An exception would be reasonable, but an assertion failure is not. Running the attached script on tiger-sumire with bq. setup python anaconda; setup -T v10_1_rc2 lsst_apps; setup -j distEst -t HSC; setup -j -r ~/LSST/obs/subaru {code} WARNING: Could not read PSF; setting to null: PersistableFactory with name 'PsfexPsf' not found, and import of module 'lsst.meas.extensions.psfex' failed (possibly because Python calls were not available from C++). {0}; loading object with id=4, name='PsfexPsf' {1}; loading object with id=28, name='CoaddPsf' {2} python: src/table/io/InputArchive.cc:109: boost::shared_ptr lsst::afw::table::io::InputArchive::Impl::get(int, const lsst::afw::table::io::InputArchive&): Assertion `r.first->second' failed. Aborted {code}",1 +"DM-2436","03/30/2015 07:48:57","Cherry-pick ""fix makeRGB so it can replace saturated pixels and produce an image"" from HSC","HSC-1196 includes fixes and test cases for {{afw}}. After review on HSC, they should be checked/merged to LSST.",0.5 +"DM-2441","03/30/2015 11:53:09","iRODS test: Register data in place","In our first tests of iRODS, we have used ""iput"" to load data into iRODS cache spaces (the iRODS Vault). For large collections already in a well known location on a server, one may want to leave the data in place but still manage it with iRODS. To do this one can use ""ireg"" to register the data with IRODS without the upload process.",2 +"DM-2442","03/30/2015 11:55:51","iRODS usage, devel survey","Read up on current IRODS usage and development track. ",3 +"DM-2451","03/31/2015 15:35:09","Fix interface between QservOss and new cmsd version","QservOSS gives an error when attempting to run queries on the worker from the czar. Error log snippet: {code} QservOss (Qserv Oss for server cmsd) ""worker"" 150331 16:06:17 9904 Meter: Unable to calculate file system space; operation not supported 150331 16:06:17 9904 Meter: Write access and staging prohibited. 
------ worker@lsst-dbdev3.ncsa.illinois.edu phase 2 server initialization completed. ------ cmsd worker@lsst-dbdev3.ncsa.illinois.edu:36050 initialization completed. {code} ",1 +"DM-2455","03/31/2015 19:12:48","uncaught exceptions in GaussianFlux","{{SdssShapeAlgorithm::computeFixedMomentsFlux}}, which is used to implement {{GaussianFlux}}, now throws an exception when the moments it is given are singular. That shouldn't have affected the behavior of {{GaussianFlux}}, as it contains an earlier check that should have detected all such bad input shapes. But that doesn't seem to be the case: we now see that exception being thrown and propagating up until it is caught and logged by the measurement framework, resulting in noisy logs. We need to investigate what's going wrong with these objects, and fix them, which may be in {{SdssShape}} or in the {{SafeShapeExtractor}} {{GaussianFlux}} uses to sanitize its inputs.",1 +"DM-2456","03/31/2015 23:36:12","Participate in April design process","Most work here was on designing Firefly tools API-related details.",8 +"DM-2466","04/01/2015 10:46:29","lsstsw ./bin/deploy needs LSSTSW set to install products in the right place","I cloned lsstsw into ~/Desktop/templsstsw and cd'd into it and typed ./bin/deploy and was shocked to find it installed everything into ~/lsstsw, leaving an unusable mess: some files were in templsstsw and some in ~/lsstsw. The short-term workaround is to manually set LSSTSW before running ./bin/deploy, but this should not be necessary; bin/deploy should either set LSSTSW or not rely on it. I don't recall this problem with earlier versions of lsstsw; I think this is a regression. For now I updated the instructions at https://confluence.lsstcorp.org/display/LDMDG/The+LSST+Software+Build+Tool but I look forward to being able to revert that change.",1 +"DM-2467","04/01/2015 13:45:00","Implement stitching multiple patches across tract boundaries in a coadd v2","* Find a region that returns multiple tractPatchLists for testing. * Request a region via a central point (RA, Dec) with width and height definable in arcseconds and pixels. * Maybe extend the web interface to other data sets, and/or good seeing SkyMaps. ",8 +"DM-2475","04/02/2015 11:14:05","Build 2015_04 Qserv release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-2477","04/02/2015 16:34:27","Design API and RFC design","Use the HSC implementation of the base class as a point of reference for designing an integrated Approximate and Interpolate class. The design takes into account Chebyshev, spline, and Gaussian process mechanisms. We want to take client code into consideration, i.e. it shouldn't make current consumers more complicated (background and aperture correction to name two). RFC the designed API.",8 +"DM-2491","04/06/2015 12:49:49","Initial survey of Datacat for LSST ","Jacek and Brian Van Klaveren have sent along some initial overview/description of their work on Datacat; https://confluence.slac.stanford.edu/display/~bvan/LSST+Datacat+Overview We will start examining this in the context of our studies of managing data collections at NCSA.",1 +"DM-2492","04/06/2015 16:09:13","shapelet unit tests attempt to access display on failure","When tests/profiles.py tests fail, they attempt to create live plots without checking for any variables that indicate that the display should be used. 
These plots should be disabled, as they obscure the real error when the display is not available.",1 +"DM-2497","04/07/2015 18:06:53","Fix g++ 4.9 return value implicit conversion incompatibility","g++ 4.9 enforces the ""explicit"" keyword on type conversion operators in return value context. This means bool checkers along the lines of bool isValidFoo() { return _smartPtrFoo; } require an explicit cast to compile under g++ 4.9 with -std=c++0x. There were a handful of these in our code; found and fixed.",1 +"DM-2506","04/09/2015 00:02:13","Document structure of our custom ddl ascii schema","Need to better document what is supported / accepted by schemaToMeta.py. We are currently relying on cat/sql/baselineSchema.sql as the guide.",2 +"DM-2508","04/09/2015 01:01:49","Information exchange between processes - implementation","Implement a system for information exchange between cmsd and xrootd, per instructions in DM-2507.",8 +"DM-2511","04/09/2015 09:35:09","The distance field of match lists should be set","The meas_astrom AstrometryTask returns a match list that has distance = 0 for all elements. Neither the matcher nor the WCS fitter is setting this field, and both ought to.",2 +"DM-2518","04/10/2015 02:25:44","Add a CFHT-based post-build integration test to the sandbox build","From [~boutigny] I have installed some simple stack validation tools working on CFHT data in {{/lsst8/boutigny/valid_cfht}} Here is the content of the README file : ------------------------------------------------------------------------------------------------------------------------ This directory contains a set of utilities to validate a stack release with CFHT data At the moment, only validation plots for the astrometry are produced Directories : ------------- rawDownload : contain raw CFHT images (flat, dark, bias, fringe,... corrected) reference_plots : contain reference plots corresponding to the best results obtain so far. Files : ------- setup.cfht : stack environment setup valid_cfht.sh : run processCcd taks on the cfht images valid_cfht.sh init : create the input/ouput directories, ingest raw images and run processCcd valid_cfht.sh : without the ""init"" argument, runs processCcd assuming that the directory structure exists and that the raw images have been ingested. valid_cfht.py : run some analysis on the output data produced by valid_cfht.sh processConfig.py : configuration parameters for processCcd run.list : list of vistits / ccd to be processed by processCcd Requirements : -------------- obs_cfht : tickets/DM-1593 astrometry_net_data : SDSS_DR9 reference catalog corresponding for CFHT Deep Field #3 ------------------------------------------------------------------------------------------------------------------------ Basically it produces a set of plots stored in a png image that can be compared to a reference plot corresponding to the best results obtained so far with stack_v10_0 I hope that this is useful. Just be careful that I wrote these scripts with my own ""fat hand full of fingers"" and that it is just basic code from a non-expert. If it is useful, I can certainly add more plots to validate the psf determination, photometry, etc. 
Comments, suggestions and criticisms are very welcome.",1 +"DM-2521","04/10/2015 12:06:52","Update repo.yaml for first set of Sims Stash repo moves","The repos.yaml file needs to be updated with correct repository locations once SIM-1074 is completed.",1 +"DM-2533","04/13/2015 12:43:05","Remove version attribute from Schema","Remove the Schema version attribute and its getters and setters. This change won't be something we can merge to master on its own, as it doesn't provide the backwards-compatible FITS reading that will be added in future tasks.",1 +"DM-2535","04/13/2015 12:47:14","Backwards compatibility for reading compound fields from FITS","Read old-style afw::table compound fields in as scalar fields, using the new FunctorKey conventions.",2 +"DM-2536","04/13/2015 12:48:33","Backwards compatibility for reading slots and measurements from FITS","Rename fields to match the new slot and measurement naming conventions.",2 +"DM-2538","04/14/2015 10:58:35","RESTful python client","Develop basic abstractions for RESTful APIs in a Python client.",3 +"DM-2543","04/14/2015 10:58:35","Python APIs for Firefly ","We need Python APIs to interface with Firefly visualization components. This is the first set of many functions. ",8 +"DM-2545","04/15/2015 06:48:44","LaTeX support in Doxygen broken","LaTeX markup in Doxygen documentation ought to be rendered properly for display in HTML. It isn't: it's just dumped to the page as raw text. See, for example, [the documentation for {{AffineTransform}}|https://lsst-web.ncsa.illinois.edu/doxygen/xlink_master_2015_04_15_07.01.28/classlsst_1_1afw_1_1geom_1_1_affine_transform.html#details].",1 +"DM-2546","04/15/2015 11:02:52","Host.cc doesn't find gethostname and HOST_NAME_MAX under el7","el7 gives an error that it can't find HOST_NAME_MAX.",1 +"DM-2547","04/15/2015 13:29:21","Fix again interface between QservOss and new cmsd version","QservOSS gives an error when attempting to run queries on the worker from the czar. Error log snippet: {code} QservOss (Qserv Oss for server cmsd) ""worker"" 150331 16:06:17 9904 Meter: Unable to calculate file system space; operation not supported 150331 16:06:17 9904 Meter: Write access and staging prohibited. ------ worker@lsst-dbdev3.ncsa.illinois.edu phase 2 server initialization completed. ------ cmsd worker@lsst-dbdev3.ncsa.illinois.edu:36050 initialization completed. {code} ",8 +"DM-2549","04/15/2015 18:48:27","The string repr of Coord should show the coordsys and angles in degrees","The default string representation of Coord (e.g. std::cout << coord in C++ and str(coord) in Python) is to show the class name and a pair of angles in radians. It would be much more useful if the default display showed the angles in degrees, as that is what people are used to. Also, it would be very helpful if the display included the name of the coordinate system. This is especially needed for the base class, as it is quite common to get a shared_ptr to Coord and have no idea what coordinate system it is. At present there is a lot of code that unpacks the angles and explicitly displays them as degrees to get around this problem. 
But it seems silly to have to do that.",2 +"DM-2551","04/15/2015 19:08:56","ANetAstrometryTask's debug doesn't fully work","{{ANetAstrometryTask}}'s debug code calls (deprecated) method {{Task.display}}, which raises an AttributeError on this code: {code} try: sources[0][0] except IndexError: # empty list pass except (TypeError, NotImplementedError): # not a list of sets of sources {code} ",1 +"DM-2552","04/16/2015 03:24:06","xrootd can't be started via ssh","{code:bash} qserv@clrinfopc04:~/src/qserv$ ssh localhost -vvv ""~qserv/qserv-run/2015_02/etc/init.d/xrootd start"" ... debug3: Ignored env _ debug1: Sending command: ~qserv/qserv-run/2015_02/etc/init.d/xrootd start debug2: channel 0: request exec confirm 1 debug2: callback done debug2: channel 0: open confirm rwindow 0 rmax 32768 debug2: channel 0: rcvd adjust 2097152 debug2: channel_input_status_confirm: type 99 id 0 debug2: exec request accepted on channel 0 Starting xrootd.. debug1: client_input_channel_req: channel 0 rtype exit-status reply 0 debug1: client_input_channel_req: channel 0 rtype eow@openssh.com reply 0 debug2: channel 0: rcvd eow debug2: channel 0: close_read debug2: channel 0: input open -> closed {code} Here the ssh command freezes; it is possible to launch xrootd with this (example) script: {code:bash} set -e set -x . /qserv/run/etc/sysconfig/qserv export QSW_XRDQUERYPATH=""/q"" export QSW_DBSOCK=""${MYSQLD_SOCK}"" export QSW_MYSQLDUMP=`which mysqldump` QSW_SCRATCHPATH=""${QSERV_RUN_DIR}/tmp"" QSW_SCRATCHDB=""qservScratch"" export QSW_RESULTPATH=""${XROOTD_RUN_DIR}/result"" export LSST_LOG_CONFIG=""${QSERV_RUN_DIR}/etc/log4xrootd.properties"" eval '/qserv/stack/Linux64/xrootd/xssi-1.0.0/bin/xrootd -c /qserv/run/etc/lsp.cf -l /qserv/run/var/log/xrootd.log -n worker -I v4 &' echo ""SCRIPT STARTED"" {code} and the same problem occurs. So the problem seems to be with xrootd, and not the startup scripts. ",5 +"DM-2554","04/16/2015 13:23:11","Remove most compound fields from afw::table","Remove all Point, Moment, Coord, and Covariance compound fields. Array fields should be retained for now; it's not clear if we want to remove them or not, or how to handle variable-length arrays if we do.",2 +"DM-2555","04/17/2015 12:23:15","Create and advertise Firefly mailing list","Create an IPAC mailing list for all users of Firefly. Advertise it to the interested communities (including the LSST Camera group) and through the Github site. The mailing list firefly@ipac.caltech.edu has been created and all the interested parties have been subscribed to the list.",1 +"DM-2579","04/20/2015 14:59:15","Calling AliasMap::get("""") can return incorrect results","It looks like empty string arguments can cause AliasMap to produce some incorrect results, probably due to the partial-match logic being overzealous.",1 +"DM-2580","04/21/2015 04:21:31","Implement user-friendly template customization","The Qserv configuration tool has to be improved to allow developers/sysadmins to easily use their custom configuration files (with custom log level, ...) for each Qserv service. An optional custom/ config file directory will be added, and configuration file templates placed there will override the ones in the install directory. This should be thought through alongside configuration management inside the Docker container.",5 +"DM-2581","04/21/2015 06:32:49","log4cxx build failure on OS X","[~frossie] writes: {quote} I have a log4cxx failure on a Mac while building lsst_distrib. 
Attaching file in case someone has any bright ideas for me in the morning {quote}",1 +"DM-2582","04/21/2015 10:08:33","Research MaxScale as a mysql-proxy replacement","We have been told by Monty that MaxScale is the replacement for the mysql-proxy. Based on DM-2057 the sentiment is that it won't work for our needs. We should very briefly document what our needs are, how we use the proxy now, and if we think MaxScale is not good enough, say why, and discuss with Monty and his team.",5 +"DM-2593","04/22/2015 13:16:47","Client API for new worker management service","We have a new worker management service with an HTTP interface; now we need to provide a simple way to access it from Python, basically wrapping all HTTP details into a simple Python API. ",8 +"DM-2594","04/22/2015 14:26:14","Change repos.yaml for next set of Simulations Stash repos","The next set of Simulations Stash repository migrations is laid out in SIM-1121.",1 +"DM-2595","04/23/2015 03:36:21","Symlink data directory at configuration","We decided to introduce symlinks in order to protect data. This is particularly useful when we need to reinstall qserv, but we have a valuable, large data set that we want to preserve. This story introduces symlinks to data: when Qserv is reinstalled, only the symlink is destroyed, and the data stay untouched.",5 +"DM-2599","04/23/2015 21:42:12","afw.Image.ExposureF('file.fits.fz[i]') returns the image in 'file.fits.fz[1]' ","It seems that afwImage.ExposureF ignores the extension number when this is passed on as part of the filename and uses the image in extension number 1. This is not the case with afwImage.MaskedImageF which correctly uses the input extension number passed in the same way. The problem has been checked on OSX Yosemite 10.10.3 and is illustrated in the following code https://gist.github.com/anonymous/d10c4a79d94c1393a493 which also requires the following image in the working directory: http://www.astro.washington.edu/users/krughoff/data/c4d_130830_040651_ooi_g_d1.fits.fz ",3 +"DM-2606","04/24/2015 13:39:59","HSC backport: recent Footprint fixes","This is a backport issue to capture subsequent HSC-side work on features already backported to afw. It includes (so far) the following HSC issues: - [HSC-1135|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1135] - [HSC-1129|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1129] - [HSC-1215|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1215]",2 +"DM-2623","04/27/2015 16:32:06","Design Basic Watcher","Design the watcher, including its interactions with other components (mysql, css, etc). In the near term, the watcher will handle deleting tables and databases.",2 +"DM-2624","04/27/2015 16:33:12","Implement DROP table in watcher","Implement DROP table using the watcher designed in DM-2623.",1 +"DM-2625","04/27/2015 16:34:43","Create service for managing watcher","We need to be able to start/stop the watcher implemented through DM-2624. This story involves extending our scripts for starting various qserv services to manage the watcher.",1 +"DM-2627","04/28/2015 09:59:10","Add support for configuring multi-node integration tests","The multi-node integration test software produced through DM-2175 has hardcoded node names. This story will allow the user to configure it. 
The current plan is to pre-set integration tests for several different configurations, e.g., single-node, 2-node, 8-node (and maybe e.g. 24-node), and the user would supply node names through a configuration file.",5 +"DM-2629","04/28/2015 10:38:52","Fix build for gcc 4.7.2 and gcc 4.8.2","#include is missing in threadSafe.h",1 +"DM-2630","04/28/2015 10:50:49","Document configuration tool main use cases","- Document the main use case for qserv-configure.py: install a Qserv master/worker node with an externalized data directory - Hide complex configuration options? {code} Configuration steps: General configuration steps -d, --directory-tree Create directory tree in QSERV_RUN_DIR, eventually create symbolic link from QSERV_RUN_DIR/var/lib to QSERV_DATA_DIR. -e, --etc Create Qserv configuration files in QSERV_RUN_DIR using values issued from meta-config file QSERV_RUN_DIR/qserv-meta.conf -c, --client Create client configuration file (used by integration tests for example) Components configuration: Configuration of external components -X, --xrootd Create xrootd query and result directories -C, --css-watcher Configure CSS-watcher (i.e. MySQL credentials) Database components configuration: Configuration of external components impacting data, launched if and only if QSERV_DATA_DIR is empty -M, --mysql Remove MySQL previous data, install db and set password -Q, --qserv-czar Initialize Qserv master database -W, --qserv-worker Initialize Qserv worker database -S, --scisql Install and configure SciSQL {code} ",3 +"DM-2634","04/28/2015 11:37:14","add new image stretch algorithm to Firefly visualization ","There is a need to include two new stretch algorithms, which are asinh and power law gamma. The algorithm is as follows: * asinh ## input zp: zero point of data mp: maximum point of data dr: dynamic range scaling factor of data. It ranges from 1-100,000 bp: black point for image display wp: white point for image display ## calculate rescaled data value rd = dr *(xPix - zp)/mp ## calculate normalized stretch data value nsd = asinh(rd)/asinh(mp-zp) ## calculate display pixel value dPix = 255 * (nsd-bp)/wp Note: The bp, wp values specify how far outside of the scale data one wants the image to display. By default, set bp=0 and wp=dr. * power law gamma ## input zp: zero point of data mp: maximum point of data gamma: gamma value for exponent ## calculate rescaled data value rd = xPix - zp ## calculate normalized stretch data value nsd = rd^(1/gamma) / (mp-zp)^(1/gamma) ## calculate display pixel data value dPix = 255 * nsd ",8 +"DM-2635","04/28/2015 13:35:00","Provide a function to return the path to a package, given its name","As per RFC-44 we want a simple function in utils that returns the path to a package given a package name.
This has the same API as eups.getProductDir, but hides our dependence on eups, as per the RFC.",2 +"DM-2636","04/28/2015 13:37:13","Update code to use the function provided in DM-2635","As per RFC-44: update existing code that finds packages using eups.getProductDir or by using environment variables to use the function added in DM-2635",3 +"DM-2671","05/01/2015 10:13:55","Build 2015_05 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-2672","05/01/2015 10:15:07","Build 2015_06 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-2679","05/01/2015 17:28:45","Fix default LOAD DATA options","Multi-node integration tests produced the following error during data loading: {code} 2015-05-01 17:03:03,030 - lsst.qserv.admin.dataLoader - CRITICAL - Failed to load data into non-partitioned table: Data truncated for column 'poly' at row 60 2015-05-01 17:03:03,031 - root - CRITICAL - Exception occured: Data truncated for column 'poly' at row 60 {code} The default options for MySQL LOAD DATA need to be fixed for this.",1 +"DM-2682","05/04/2015 09:25:52","Add missing empty-chunk-path on Ubuntu 14.04","QSERV_DATA_DIR/var/lib/qserv wasn't created on Ubuntu 14.04, and this was breaking the loader script. It was working on SL7 for an unknown reason. Creation of the directory has been added to the qserv-czar config script.",1 +"DM-2683","05/04/2015 09:32:50","Fix case05 3009_countObjectInRegionWithZFlux freeze","This prevents the 2014_05 release from passing integration tests.",1 +"DM-2692","05/04/2015 12:42:50","Rule for automatic replication in iRODS","Maintaining extra copies/replicas on separate resources is an important tenet in iRODS, with this practice considered key for prevention of data loss. The automatic replication of files upon ingest can be encoded via a system rule, so that data is preserved as an inherent part of storing in iRODS.",2 +"DM-2694","05/04/2015 14:47:46","Revisit mysql connections from worker","Revisit the code that handles mysql connections in qserv. At the moment Qserv will maintain a connection per chunk-query, up to a hardcoded limit (GroupScheduler: 4, ScanScheduler: 32). Also, we have to gracefully handle connection issues (such as a dropped connection, or hitting the max_connections limit).",8 +"DM-2708","05/06/2015 11:09:56","Understand race condition in Executive::_dispatchQuery","Inserting a log (presumably just a delay) in Executive::_dispatchQuery after the new QueryResource but before the Provision call causes queries to fail. The particular test query was ""select count(*) from Object"" on test case 01.",2 +"DM-2710","05/06/2015 14:20:34","Mutex use before creation","qana/QueryPlugin.cc contains a static boost::mutex that is used by static class member functions to register plugin implementations. Its constructor is not guaranteed to be called before the static registerXXXPlugin instances (see e.g. qana/AggregatePlugin.cc) use it to register plugin classes.",1 +"DM-2711","05/06/2015 14:23:05","Migrate boost::thread to std::thread","We are mixing boost and std threading libraries. This should be cleaned up - use std::thread consistently everywhere.",5 +"DM-2712","05/06/2015 14:25:34","Migrate boost::shared_ptr to std::shared_ptr","We are mixing boost and std shared_ptrs. This should be cleaned up - use std::shared_ptr consistently everywhere. In a few places we have other types of pointers (e.g. weak_ptr).
Migrate these too.",2 +"DM-2716","05/07/2015 19:35:38","Fix connection leak (2nd iteration)","Fix connection leak (and memory leak and thread leak) -- we are leaking 2 per query.",2 +"DM-2718","05/08/2015 03:49:07","Upgrade EUPS used by lsstsw","As discussed, bump it up when you get a chance please. ",1 +"DM-2722","05/08/2015 13:55:59","Revisit design of query poisoner","As we discovered through DM-2698, the poisoner tends to hold onto query resources even after the query completes. We should revisit whether that can be redesigned and improved, so that when a query finishes, all resources related to that query are immediately and automatically released. This story involves just the planning part; implementation will be done through separate stories.",1 +"DM-2728","05/09/2015 17:26:03","Build should fail if node.js is not present","Problem: I built Firefly by mistake w/o having node on my path. The build didn't signal any errors, but generated an unusable webapp that wouldn't load. Expected behavior: the build should have failed and warned the user that node.js is missing.",2 +"DM-2729","05/09/2015 17:43:54","Fix a few more g++ 4.9.2 compat issues","Some of the recent boost -> std changes don't compile/link under gcc 4.9.2, because of some poor #include hygiene (including when we should include , not explicitly including , etc.) Also, the -pthread linker option is required when using std::thread under gcc 4.9.2. ",1 +"DM-2734","05/11/2015 15:19:05","Add config file for test dataset 04 tables","Following the changes to default LOAD DATA settings in DM-2679, two tables in test case 04 need to have a config file to include their in.csv format.",1 +"DM-2737","05/12/2015 11:13:29","Build a DiscreteSkyMap that covers a collection of input exposures","This is essentially a rehash of the old trac Ticket #[2702| https://dev.lsstcorp.org/trac/ticket/2702], originally reported by [~jbosch], which reads: ""I'd like to add a Task and bin script to create a DiscreteSkyMap that bounds a set of calexps specified by their data IDs. This makeDiscreteSkyMap.py could be used instead of makeSkyMap.py when the user would rather compute the pointing and size of the skymap from the input data than decide it manually."" The work was done by [~jbosch] & [~price] and exists on branch {{u/price/2702}} in {{pipe_tasks}}, but it was never merged to master. I plan to simply rebase the commits in that branch onto master.",1 +"DM-2738","05/13/2015 01:41:35","Remove #include ""XrdOuc/XrdOucTrace.hh"" from Qserv code","See next emails: Hi Fabrice, Absolutely! Andy On Wed, 13 May 2015, Fabrice Jammes wrote: > Hi Andy, > > Thanks, > > In my understanding, you're ok if I remove the existing > #include ""XrdOuc/XrdOucTrace.hh"" > from Qserv source code. I'll do it soon. > > Have a nice day, > > Fabrice > > On 12/05/2015 23:41, Andrew Hanushevsky wrote: >> Hi Fabrice, >> >> Well, no. We have a long-standing approach that qserv should not depend on anything outside of XrdSsi public interfaces. This is the only way to easily protect qserv code from infrastructure changes. So, I would not. If you want to copy something like that for >> >> qserv please do, it's simple enough. But in the end qserv needs to be self-contained in that it does not depend on xrootd code, just the public ssi interfaces. >> >> Andy >> >> -----Original Message----- From: Fabrice Jammes >> Sent: Tuesday, May 12, 2015 9:06 AM >> To: Andrew Hanushevsky >> Subject: About xrdssi client logging >> >> Hi Andy, >> >> Hope you're doing well.
>> Could you please tell me if it's useful to include >> #include ""XrdOuc/XrdOucTrace.hh"" >> in our xrdssi client code? >> >> Indeed the client seems to only print DBG macro output, that's why I was >> wondering if XrdOucTrace was only used on the server side. >> If yes, I will remove it from our client. >> >> Thanks, and have a nice day, >> >> Fabrice ",1 +"DM-2740","05/13/2015 18:01:34","Make ANetAstrometryTask more configurable","The current ANetAstrometryTask has a solver that is not easy to retarget. This makes testing with hscAstrom needlessly difficult. My suggestion is to make the solver a true Task instead of a task-like object, and make it retargetable using a ConfigurableField instead of a ConfigField. This is very easy to do because the solver is already a task in all but name. ",2 +"DM-2748","05/15/2015 03:02:03","Add clear message when integration test fails","The integration test fails without printing a clear message at the end. For now a query is broken (0011_selectDeepCoadd.txt), but this isn't printed at the end of the test output.",2 +"DM-2752","05/15/2015 13:42:46","db 10.1+4 tests randomly fail with python egg installation error","The unit tests for DB seem to fail at random and always pass on a second build attempt. My hunch is that multiple tests are running in parallel, all attempting to install the mysql module, but I haven't investigated. {code} db: 10.1+4 ERROR (0 sec). *** error building product db. *** exit code = 2 *** log is in /home/build0/lsstsw/build/db/_build.log *** last few lines: ::::: [2015-05-15T19:12:35.557258Z] scons: done reading SConscript files. ::::: [2015-05-15T19:12:35.558276Z] scons: Building targets ... ::::: [2015-05-15T19:12:35.558409Z] scons: Nothing to be done for `python'. ::::: [2015-05-15T19:12:35.570007Z] makeVersionModule([""python/lsst/db/version.py""], []) ::::: [2015-05-15T19:12:35.686733Z] running tests/testDbLocal.py... running tests/testDbRemote.py... running tests/testDbPool.py... failed ::::: [2015-05-15T19:12:35.695011Z] passed ::::: [2015-05-15T19:12:35.698811Z] passed ::::: [2015-05-15T19:12:35.706360Z] 1 tests failed ::::: [2015-05-15T19:12:35.706703Z] scons: *** [checkTestStatus] Error 1 ::::: [2015-05-15T19:12:35.708443Z] scons: building terminated because of errors.
{code} {code} [root@ip-192-168-123-151 .tests]# cat * tests/testDbLocal.py Traceback (most recent call last): File ""tests/testDbLocal.py"", line 53, in from lsst.db.db import Db, DbException File ""/home/build0/lsstsw/build/db/python/lsst/db/db.py"", line 49, in import MySQLdb File ""build/bdist.linux-x86_64/egg/MySQLdb/__init__.py"", line 19, in File ""build/bdist.linux-x86_64/egg/_mysql.py"", line 7, in File ""build/bdist.linux-x86_64/egg/_mysql.py"", line 4, in __bootstrap__ File ""/home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py"", line 937, in resource_filename File ""/home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py"", line 1632, in get_resource_filename File ""/home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py"", line 1662, in _extract_resource File ""/home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py"", line 1003, in get_cache_path File ""/home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py"", line 983, in extraction_error pkg_resources.ExtractionError: Can't extract file(s) to egg cache The following error occurred while trying to extract file(s) to the Python egg cache: [Errno 17] File exists: '/home/build0/.python-eggs' The Python egg cache directory is currently set to: /home/build0/.python-eggs Perhaps your account does not have write access to this directory? You can change the cache directory by setting the PYTHON_EGG_CACHE environment variable to point to an accessible directory. tests/testDbPool.py /home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py:1032: UserWarning: /home/build0/.python-eggs is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable). 05/15/2015 07:12:35 root WARNING: Required file with credentials '/home/build0/.lsst/dbAuth-test.txt' not found. tests/testDbRemote.py /home/build0/lsstsw/anaconda/lib/python2.7/site-packages/setuptools-5.8-py2.7.egg/pkg_resources.py:1032: UserWarning: /home/build0/.python-eggs is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable). 05/15/2015 07:12:35 root WARNING: Required file with credentials '/home/build0/.lsst/dbAuth-testRemote.txt' not found. {code}",1 +"DM-2762","05/18/2015 16:54:16","Avoid leaking memory allocated by mysql_thread_init","mysql/MySqlConnection.cc contains the following comment: {code} // Dangerous to use mysql_thread_end(), because caller may belong to a // different thread other than the one that called mysql_init(). Suggest // using thread-local-storage to track users of mysql_init(), and to call // mysql_thread_end() appropriately. Not an easy thing to do right now, and // shouldn't be a big deal because we thread-pool anyway. {code} The comment is not really correct with regards to thread pooling. Instead, each rproc::InfileMerger has an rproc::InfileMerger::Mgr which contains a util::WorkQueue that spawns a thread, and so we are failing to call mysql_thread_end at least once per user query. This has been verified using the memcheck valgrind tool. 
",3 +"DM-2770","05/20/2015 19:02:42","sconsUtil install target does not respond to either force=True or --force","I've been unable to figure out how to bypass the install 'force' check, but have confirmed that this is the correct expression by commenting it out: https://github.com/lsst/sconsUtils/blob/54c983ffe9714a33657c4388de3506fe7a40518d/python/lsst/sconsUtils/installation.py#L92 {code} $ SCONSUTILS_DIR=. scons -Q force=True install Unable to import eups; guessing flavor CC is gcc version 4.8.3 Checking for C++11 support C++11 supported with '-std=c++11' Error with git version: uncommitted changes Found problem with version number; update or specify force=True to proceed {code} ",1 +"DM-2777","05/21/2015 13:58:16","Fix races in BlendScheduler","_integrityHelper() from wsched/BlendScheduler inspects a map of tasks and is sometimes called without holding the corresponding mutex. My theory is that it is observing the map in an inconsistent state, leading to assert failure and hence worker death, and finally to hangs/timeouts on the czar.",2 +"DM-2779","05/21/2015 16:45:55","Fix race in Foreman","The Foreman implementation passes a TaskQueue pointer corresponding to running tasks down to the task scheduler without holding a lock. This means that the scheduler can inspect the running task list (usually to determine its size) while it is being mutated.",2 +"DM-2782","05/22/2015 13:13:06","Firefly Tools API: Add advance region support","Firefly Tools API: Add advance region support Improve firefly's region functionality to support a ""dynamic region"". Data can be added or removed from this region by API calls. Allow any amount of region lines to be added or removed. Make sure performance is good. Also, document the current Firefly region support.",2 +"DM-2787","05/22/2015 13:46:55","Footprint dilation performance regression","In DM-1128 we implemented span-based dilation for footprints. A brief test on synthetic data indicated that this was a performance win over the previous version of the code. In May 2015, this code was merged to HSC and applied to significant quantities of real data for the first time. A major performance regression was identified: {quote} [May-9 00:26] Paul Price: processCcd is now crazy slow. [May-9 00:29] Paul Price: Profiling... [May-9 00:40] Paul Price: I'm thinking it's the Footprint grow code... [May-9 00:44] Paul Price: And the winner is…. Footprint construction: [May-9 00:44] Paul Price: 2 0.000 0.000 702.280 351.140 /home/astro/hsc/products/Linux64/meas_algorithms/HSC-3.8.0/python/lsst/meas/algorithms/detection.py:191(makeSourceCatalog)
 2 0.005 0.002 702.274 351.137 /home/astro/hsc/products/Linux64/meas_algorithms/HSC-3.8.0/python/lsst/meas/algorithms/detection.py:228(detectFootprints)
 15 0.001 0.000 698.597 46.573 /home/pprice/hsc/afw/python/lsst/afw/detection/detectionLib.py:3448(__init__)
 15 698.596 46.573 698.596 46.573 {_detectionLib.new_FootprintSet} [May-9 00:53] Paul Price: If I revert HSC-1243 (""Port better Footprint-grow code from LSST""), then the performance regression goes away. @jbosch @jds may be interested... {quote} The source of the regression must be identified and resolved for both HSC and LSST.",5 +"DM-2789","05/22/2015 17:41:44","rename CameraMapper.getEupsProductName() to getPackageName() and convert to abstract method","Per discussion on this PR related to DM-2636: https://github.com/lsst/daf_butlerUtils/pull/1#issuecomment-104785055 The CameraMapper.getEupsProductName() should be renamed to getPackageName() and converted to an abstract method. This will eliminate a runtime, and thus ""test time"", dependency on EUPS. As part of the rename/conversion, all subclasses that are not already overriding getEupsProductName() will concurrently need to have getPackageName() implemented.",3 +"DM-2792","05/26/2015 12:45:59","Make the new astrometry task the default task","The new astrometry task should be the default astrometry task, but we need to make sure it is good enough first.",1 +"DM-2799","05/27/2015 12:32:07","Tests for daf_butlerUtils should not depend on obs_lsstSim","Currently two of the tests in {{daf_butlerUtils}} depend on {{obs_lsstSim}}. They will never run in a normal build because {{obs_}} packages cannot be a dependency of {{daf_butlerUtils}}. After discussing the options with [~ktl] the feeling is that {{ticket1640}} should be rewritten to remove the dependency and {{ticket1580}} can probably be removed.",2 +"DM-2803","05/28/2015 00:14:31","Adapt multi-node tests to latest version of qserv / loader","The multi-node integration tests have to be updated to work with the latest changes to qserv, in particular the loader, which lately broke already-working tests.",8 +"DM-2804","05/28/2015 00:24:27","Implement query metadata skeleton","Skeleton implementation of the Query Metadata - including the APIs and core functionality (accepting a long-running query and saving the info about it)",8 +"DM-2827","05/28/2015 14:39:08","Implement RESTful interfaces for Database (POST)","Implement RESTful interfaces for Database (see all D* in https://confluence.lsstcorp.org/display/DM/API), based on the first prototype developed through DM-1695. The work includes adding support for returning appropriately formatted results (support the most common formats). This covers ""POST"" type requests only; ""GET"" will be handled separately.",8 +"DM-2847","05/31/2015 22:46:55","SUI Firefly server side Python job management","In order to support Camera team needs and L3 data production, the Firefly server needs to be able to start a Python job with proper input data and get the output data as a result of running the Python job. This will make the future integration of Firefly and the DM pipeline stack much easier. ",40 +"DM-2849","06/01/2015 09:54:56","Tweaks to OO display interface","When I wrote the initial version of display_firefly I found a few minor issues in the way I'd designed the Display class; at the same time, [~lauren] found some missing functions in the backward-compatibility support for ds9. Please fix these; note that this implies changes to afw, display_ds9, and display_firefly. ",2 +"DM-2854","06/01/2015 14:29:39","Fix Qserv SsiSession worker race","The worker SsiSession implementation calls ReleaseRequestBuffer after handing the bound request to the foreman for processing.
It therefore becomes possible for request processing to finish before ReleaseRequestBuffer is called by the submitting thread, resulting in a memory leak.",2 +"DM-2864","06/02/2015 10:37:30","Fix bug related to selecting rows by objectId from non-director table","The following example illustrates the problem: Let's select one row from qservTest_case01_qserv {code} select sourceId, objectId FROM Source LIMIT 1; +-------------------+-----------------+ | sourceId | objectId | +-------------------+-----------------+ | 29763859300222250 | 386942193651348 | +-------------------+-----------------+ {code} Then select it, but use ""sourceId"" in the query; all good here: {code} select sourceId, objectId FROM Source WHERE sourceId=29763859300222250; +-------------------+-----------------+ | sourceId | objectId | +-------------------+-----------------+ | 29763859300222250 | 386942193651348 | +-------------------+-----------------+ {code} But if we add ""objectId"", the row is not found: {code} select sourceId, objectId FROM Source WHERE sourceId=29763859300222250 and objectId=386942193651348; Empty set (0.09 sec) {code} Similarly, even without the sourceId constraint, the query fails: {code} select sourceId, objectId FROM Source WHERE objectId=386942193651348; Empty set (0.09 sec) {code} ",8 +"DM-2865","06/02/2015 12:00:18","Merge BoundedField from HSC as is","To make headway on aperture corrections, we are bringing the HSC implementation of BoundedField over.",2 +"DM-2866","06/02/2015 18:09:24","Learn about Butler","Transferring knowledge from K-T to the DB team.",2 +"DM-2867","06/02/2015 18:09:44","Learn about Butler","Transferring knowledge from K-T to the DB team.",2 +"DM-2868","06/02/2015 18:10:09","Learn about Butler","Transferring knowledge from K-T to the DB team.",2 +"DM-2869","06/02/2015 18:10:26","Learn about Butler","Transferring knowledge from K-T to the DB team.",2 +"DM-2870","06/02/2015 18:10:37","Learn about Butler","Transferring knowledge from K-T to the DB team.",2 +"DM-2883","06/03/2015 22:00:04","wcslib is unable to read PTF headers with PV1_{1..16} cards","SCAMP writes distortion headers in the form of PVi_nn (i=1..x, nn=5..16) cards, but this is rejected (correctly) by wcslib 4.14; there is a discussion at https://github.com/astropy/astropy/issues/299 The simplest ""solution"" is to strip the values PV1_nn (nn=5..16) in makeWcs() for CTYPEs of TAN or TAN-SIP and this certainly works. I propose that we adopt this solution for now. ",1 +"DM-2885","06/05/2015 15:26:15","Improve confusing error message","Selecting a column that does not exist results in a confusing error. Example: {code} SELECT badColumnName FROM qservTest_case01_qserv.Object WHERE objectId=386942193651348; {code} ERROR 4120 (Proxy): Error during execution: -1 Ref=1 Resource(/chk/qservTest_case01_qserv/6630): 20150605-16:23:42, Error in result data., 1, (-1) Similarly, {code} select whatever FROM qservTest_case01_qserv.Object; {code} prints ERROR 4120 (Proxy): Error during execution: -1 Ref=1 Resource(/chk/qservTest_case01_qserv/6630): 20150605-16:23:52, Error in result data., 1, Ref=2 Resource(/chk/qservTest_case01_qserv/6631): 20150605-16:23:52, Error merging result, 1990, Cancellation requested Ref=3 Resource(/chk/qservTest_case01_qs (-1) (note, sourceId does not exist in the Object table) ",5 +"DM-2887","06/05/2015 22:46:25","Fix broken IN - it now takes first element only","IN is broken - it only uses the first element from the list.
Here is the proof: {code} select COUNT(*) AS N FROM qservTest_case01_qserv.Source WHERE objectId=386950783579546; +------+ | N | +------+ | 56 | +------+ 1 row in set (0.10 sec) mysql> select count(*) AS N FROM qservTest_case01_qserv.Source WHERE objectId=386942193651348; +------+ | N | +------+ | 39 | +------+ 1 row in set (0.09 sec) mysql> select COUNT(*) AS N FROM qservTest_case01_qserv.Source WHERE objectId IN(386942193651348, 386950783579546); +------+ | N | +------+ | 39 | +------+ 1 row in set (0.09 sec) mysql> select COUNT(*) AS N FROM qservTest_case01_qserv.Source WHERE objectId IN(386950783579546, 386942193651348); +------+ | N | +------+ | 56 | +------+ 1 row in set (0.11 sec) {code}",8 +"DM-2890","06/06/2015 09:19:56","isrTask assumes that the Exposure has a Detector","While trying to use the isrTask to interpolate over bad columns in PTF data I discovered that the code assumes that the Exposure has a Detector attached. Please remove this restriction. ",1 +"DM-2891","06/06/2015 17:27:51","meas.algorithms.utils uses measurement algorithms that are no longer available","meas.algorithms.utils uses GaussianCentroid and SdssShape, but now that they have moved to meas_base the code no longer works. Please fix this. I'd prefer to leave the functionality to visualise PSFs in meas_algorithms, but if necessary file an RFC to move it elsewhere. ",2 +"DM-2892","06/06/2015 23:39:48","Keep track of database of the director table","An L3 child table might very well have an LSST data release Object table as its director, while almost certainly not living in the DR database. To support it, we should keep track of the database name holding the director's table. Note, this is related to DM-2864 - the code touched in that ticket should be checking the director's db name. Don't forget to add a unit test that will exercise it!",1 +"DM-2895","06/08/2015 11:21:29","treat lsst_apps, lsst_libs and lsst_thirdparty as top level products not required by lsst_distrib","Per discussion on RFC-55, it was determined that lsst_apps, lsst_libs and lsst_thirdparty may be treated as separate top-level products; lsst_distrib need not depend on them, nor do they need to be included as part of CI builds.",1 +"DM-2900","06/08/2015 18:25:27","Add queries that exercise non-box spatial constraints","Qserv has code to support: * qserv_areaspec_box * qserv_areaspec_circle * qserv_areaspec_ellipse * qserv_areaspec_poly but only the first one (box) is exercised in our integration tests. This story involves adding queries to test the other 3.",2 +"DM-2905","06/09/2015 10:34:53","Update Scons to v2.3.4","Scons has not been updated in over a year. RFC-61 agreed that we should upgrade it now before tackling some other {{scons}} issues.",1 +"DM-2909","06/09/2015 17:17:45","Remove unused code from sconsUtils","The code in {{deprecated.py}} in {{sconsUtils}} is not used by anything anywhere. [~jbosch] has indicated that the file can simply be removed.",1 +"DM-2910","06/09/2015 18:45:12","obs_cfht is broken with the current stack","obs_cfht's camera mapper is missing the new packageName class variable, so it is not compatible with the current stack.
I suggest fixing obs_sdss and obs_subaru as well, if they need it.",1 +"DM-2911","06/09/2015 23:17:40","Build 2015_07 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-2917","06/11/2015 09:12:18","obs_cfht unit tests are broken","obs_cfht has one unit test ""testButler"" that uses git://git.lsstcorp.org/contrib/price/testdata_cfht. 4 of the tests fail, as shown below. In addition, testdata_cfht is huge, and the tests barely use any of it. It's worth considering making a new test repo that is smaller, or if the amount of data is small enough, move it into afwdata or obs_cfht itself. {code} localhost$ tests/testButler.py CameraMapper: Loading registry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/DATA/registry.sqlite3 CameraMapper: Loading calibRegistry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/CALIB/calibRegistry.sqlite3 ECameraMapper: Loading registry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/DATA/registry.sqlite3 CameraMapper: Loading calibRegistry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/CALIB/calibRegistry.sqlite3 ECameraMapper: Loading registry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/DATA/registry.sqlite3 CameraMapper: Loading calibRegistry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/CALIB/calibRegistry.sqlite3 ECameraMapper: Loading registry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/DATA/registry.sqlite3 CameraMapper: Loading calibRegistry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/CALIB/calibRegistry.sqlite3 .CameraMapper: Loading registry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/DATA/registry.sqlite3 CameraMapper: Loading calibRegistry registry from /Users/rowen/LSST/code/testdata/testdata_cfht/CALIB/calibRegistry.sqlite3 E. 
====================================================================== ERROR: testBias (__main__.GetRawTestCase) ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/testButler.py"", line 122, in testBias self.getDetrend(""bias"") File ""tests/testButler.py"", line 110, in getDetrend flat = self.butler.get(detrend, self.dataId, ccd=ccd) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 218, in get location = self.mapper.map(datasetType, dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/mapper.py"", line 116, in map return func(self.validate(dataId), write) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/cameraMapper.py"", line 287, in mapClosure return mapping.map(mapper, dataId, write) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 118, in map actualId = self.need(self.keyDict.iterkeys(), dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 199, in need lookups = self.lookup(newProps, newId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 345, in lookup return Mapping.lookup(self, properties, newId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 168, in lookup where, self.range, values) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/registries.py"", line 120, in executeQuery c = self.conn.execute(cmd, values) OperationalError: no such column: extension ====================================================================== ERROR: testFlat (__main__.GetRawTestCase) ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/testButler.py"", line 117, in testFlat self.getDetrend(""flat"") File ""tests/testButler.py"", line 110, in getDetrend flat = self.butler.get(detrend, self.dataId, ccd=ccd) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 218, in get location = self.mapper.map(datasetType, dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/mapper.py"", line 116, in map return func(self.validate(dataId), write) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/cameraMapper.py"", line 287, in mapClosure return mapping.map(mapper, dataId, write) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 118, in map actualId = self.need(self.keyDict.iterkeys(), dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 199, in need lookups = self.lookup(newProps, newId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 345, in lookup return Mapping.lookup(self, properties, newId) File 
""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 168, in lookup where, self.range, values) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/registries.py"", line 120, in executeQuery c = self.conn.execute(cmd, values) OperationalError: no such column: extension ====================================================================== ERROR: testFringe (__main__.GetRawTestCase) ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/testButler.py"", line 127, in testFringe self.getDetrend(""fringe"") File ""tests/testButler.py"", line 110, in getDetrend flat = self.butler.get(detrend, self.dataId, ccd=ccd) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 218, in get location = self.mapper.map(datasetType, dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/mapper.py"", line 116, in map return func(self.validate(dataId), write) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/cameraMapper.py"", line 287, in mapClosure return mapping.map(mapper, dataId, write) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 118, in map actualId = self.need(self.keyDict.iterkeys(), dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 199, in need lookups = self.lookup(newProps, newId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 345, in lookup return Mapping.lookup(self, properties, newId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/mapping.py"", line 168, in lookup where, self.range, values) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_butlerUtils/10.1-3-g302a9ed/python/lsst/daf/butlerUtils/registries.py"", line 120, in executeQuery c = self.conn.execute(cmd, values) OperationalError: no such column: extension ====================================================================== ERROR: testRaw (__main__.GetRawTestCase) Test retrieval of raw image ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/testButler.py"", line 101, in testRaw raw = self.butler.get(""raw"", self.dataId, ccd=ccd, immediate=True) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 244, in get return callback() File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 242, in innerCallback(), dataId) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 238, in callback = lambda: self._read(pythonType, location) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/butler.py"", line 426, in _read location.getCppType(), storageList, additionalData) File ""/Users/rowen/LSST/lsstsw/stack/DarwinX86/daf_persistence/10.1-1-g6edbc00+1/python/lsst/daf/persistence/persistenceLib.py"", line 1430, in unsafeRetrieve 
return _persistenceLib.Persistence_unsafeRetrieve(self, *args) FitsError: File ""src/fits.cc"", line 1064, in lsst::afw::fits::Fits::Fits(const std::string &, const std::string &, int) cfitsio error: could not open the named file (104) : Opening file '/Users/rowen/LSST/code/testdata/testdata_cfht/DATA/raw/08BL05/w2.+2+2/2008-11-01/i2/1038843o.fits.fz[1]' with mode 'r' {0} lsst::afw::fits::FitsError: 'cfitsio error: could not open the named file (104) : Opening file '/Users/rowen/LSST/code/testdata/testdata_cfht/DATA/raw/08BL05/w2.+2+2/2008-11-01/i2/1038843o.fits.fz[1]' with mode 'r'' ---------------------------------------------------------------------- Ran 6 tests in 3.544s FAILED (errors=4) {code}",1 +"DM-2919","06/11/2015 13:28:54","PhotoCalTask mis-calling Colorterm methods","When I implemented DM-2797 I made a few errors in pipe_tasks: - PhotoCalTask mis-calls two methods of Colorterm by providing filterName, which is not needed - ColortermLibrary.getColorterm mis-handles glob expressions (the two arguments to fnmatch.fnmatch are swapped). We also need a unit test for applying colorterms, but that will require enough work that I have made a separate ticket for it: DM-2918. Meanwhile I have tested my changes by running Dominique's CFHT demo. This proves that the colorterm code runs, but does not prove that the terms are correctly applied.",1 +"DM-2934","06/12/2015 14:40:49","Add RFD issue type to RFC project","To support the RFD process adopted in [RFC-53], an RFD issue type in the RFC project is required. While we could add RFD-specific fields to it, I think it's simplest if it's just generic with details provided in the Description.",1 +"DM-2927","06/12/2015 15:25:40","Modernize sconsUtils code to python 2.7 standard","As part of the work investigating DM-2839 I modernized the sconsUtils code to meet current coding standards (using {{in}} rather than {{has_key}}, using {{items()}} rather than {{iteritems}} etc). Since I'm highly doubtful that DM-2839 is going to be closed any time soon I will separate out the modernization patches into this ticket.",1 +"DM-2929","06/12/2015 23:09:46","Some AFW tests are not enabled with no explanation","Running {{coverage.py}} on the AFW test suite indicated that two test classes in {{tests/wcs1.py}} are disabled. {{WCSTestCaseCFHT}} was added by [~rhl] in 2007 but was disabled during a merge by [~jbosch] in 2010, with no indication as to why. {{WCSRotateFlip}} appeared in 2012 (added by [~krughoff]) but doesn't appear in the {{suite}} list at the end and so does not execute. Similarly {{testSchema.py}} has two tests that are not run: {{xtestSchema}} and {{testJoin}}. I assume {{xtestSchema}} is deliberately disabled but could there at least be a comment in the test explaining why? My feeling is that we should either run the tests or they should be removed. Having them there gives the impression they are doing something useful. Less importantly, {{warpExposure.py}} has some support code for comparing masked images that was written in 2009 by [~rowen] but which is not used anywhere in the test.",2 +"DM-2930","06/13/2015 01:08:17","Fix problem with Qserv related to restarting mysql","I noticed some strange (reproducible!)
behavior: if I run: {code}qserv-check-integration.py --case=01{code} then restart mysqld {code}/etc/init.d/mysqld restart{code} then the query: {code}mysql --host=127.0.0.1 --port=4040 --user=qsmaster qservTest_case01_qserv -e ""SELECT COUNT(*) as OBJ_COUNT FROM Object WHERE qserv_areaspec_box(0.1, -6, 4, 6)""{code} consistently fails every single time. To fix it, it is enough to restart xrootd.",5 +"DM-2931","06/13/2015 07:34:02","We write truncated Wcs data to extended HDU tables in Exposures","When we write Wcs to extra HDUs in Exposures they are truncated if other than TAN/TAN-SIP. Please don't write them. A better long-term solution is needed. In particular, we shouldn't be duplicating this information unnecessarily, and we need to be able to persist e.g. TPV to the tables so as to support CoaddPsf. These issues are not included here.",1 +"DM-2936","06/15/2015 23:06:30","Refactor Histogram in edu.caltech.ipac.visualize.plot package.","The Histogram has 6 constructors to handle 6 bitpixel data types, which are byte, short integer, integer, long integer, float and double. Since FitsRead now only works on float, the Histogram should be refactored accordingly.",3 +"DM-2938","06/16/2015 13:40:24","CalibrateTask has an unwanted ""raise"" in it","On 2014-06-30 commit 696b641 a developer added a bare ""raise"" as a debugging aid to the CalibrateTask in pipe_tasks. That change was accidentally merged to master. I confirmed it was an accident and am filing this ticket as a way to remove the raise and run buildbot before merging to master.",1 +"DM-2940","06/17/2015 15:34:11","DS9 tests fail if DS9 not running in some configurations","There are a few issues with the robustness of the {{testDs9.py}} tests in AFW. * The tests are skipped if the {{display_ds9}} package cannot be loaded but they should also skip if {{ds9}} is missing or if {{ds9}} cannot be loaded. The latter is especially important during builds that unset {{$DISPLAY}}. * The launching code in {{initDS9}} cannot notice the simple case of {{ds9}} immediately failing to load. It simply assumes that there are delays in launch. The reason for this is that {{os.system}} does not return bad status if the command has been started in the background. Another scheme for starting {{ds9}} should be considered. Maybe a different exception could be raised specifically for failing to start it. * At the moment each test independently has a go at starting {{ds9}}. This makes the tests take a very long time (made worse by {{_mtv}} also trying multiple times) despite it being clear pretty quickly that {{ds9}} is never going to work. * Currently the {{mtv}} tests must run early as they are the only tests that attempt to start {{ds9}} if it is not running. If the two tests that call {{mtv}} are disabled two other tests fail. Ideally the {{initDS9}} code should be called in all cases.",1 +"DM-2944","06/18/2015 15:37:31","SourceMeasurementTask still referenced in our stack","SourceMeasurementTask is gone, but we still have code that refers to it, including: {code} /Users/rowen/LSST/lsstsw/build/meas_algorithms/python/lsst/meas/algorithms/debugger.py: 21: from lsst.meas.algorithms.measurement import SourceMeasurementTask 26: measurement = ConfigurableField(target=SourceMeasurementTask, doc=""Measurements"") /Users/rowen/LSST/lsstsw/build/meas_algorithms/python/lsst/meas/algorithms/detection.py: 209: The example also runs the SourceMeasurementTask; see \ref meas_algorithms_measurement_Example for more explanation.
/Users/rowen/LSST/lsstsw/build/meas_deblender/examples/utils.py: 15: class DebugSourceMeasTask(measAlg.SourceMeasurementTask): 41: measAlg.SourceMeasurementTask.preMeasureHook(self, exposure, sources) 74: measAlg.SourceMeasurementTask.postMeasureHook(self, exposure, sources) 80: measAlg.SourceMeasurementTask.preSingleMeasureHook(self, exposure, sources, i) 102: measAlg.SourceMeasurementTask.postSingleMeasureHook(self, exposure, sources, i) /Users/rowen/LSST/lsstsw/build/meas_deblender/python/lsst/meas/deblender/deblendAndMeasure.py: 31: from lsst.meas.algorithms import SourceMeasurementTask 50: target = SourceMeasurementTask, /Users/rowen/LSST/lsstsw/build/pipe_tasks/python/lsst/pipe/tasks/calibrate.py: 180:
initialMeasurement \ref SourceMeasurementTask_ ""SourceMeasurementTask"" 189:
measurement \ref SourceMeasurementTask_ ""SourceMeasurementTask"" /Users/rowen/LSST/lsstsw/build/pipe_tasks/python/lsst/pipe/tasks/imageDifference.py: 36: from lsst.meas.algorithms import SourceDetectionTask, SourceMeasurementTask, \ 104: target=SourceMeasurementTask, /Users/rowen/LSST/lsstsw/build/pipe_tasks/python/lsst/pipe/tasks/measurePsf.py: 136: The example also runs SourceDetectionTask and SourceMeasurementTask; see \ref meas_algorithms_measurement_Example for more explanation. {code} I will handle pipe_tasks calibrate.py as part of DM-435.",1 +"DM-2945","06/18/2015 15:41:32","Wmgr refuses to serve queries from remote interface","Vaikunth discovered that wmgr returns 404 for all operations. It looks like wmgr can serve requests coming from the 127.0.0.1 interface but returns 404 for queries from a non-local interface.",1 +"DM-2948","06/19/2015 09:06:03","Remove explicit buildbot dependency on datarel","The buildbot scripts have an explicit dependency on the {{datarel}} package, which we'd like to remove from the stack. It uses {{datarel}} as the top-level product when building the cross-linked HTML documentation; {{lsstDoxygen}}'s {{makeDocs}} script takes a single package, and generates the list of packages to include in the Doxygen build by finding all dependencies of that package. So, to remove the explicit dependency on {{datarel}}, we need to either: - find a new top-level product with a Doxygen build to pass to {{makeDocs}} (e.g. by adding a trivial Doxygen build to {{lsst_distrib}}) - modify the argument parsing in {{lsstDoxygen}} to take a list of multiple products (it *looks* like the limitation to one package is only in the argument parsing), and pass it a list of top-level products in the buildbot scripts. This is currently a blocker for DM-2928, which is itself a blocker for DM-1766, which has been lingering for a few weeks now. I'm going to look for other ways to remove the block on the latter, but I don't have a solution yet.",3 +"DM-2949","06/19/2015 10:13:34","remove dead code and dependencies from datarel","Removing the {{datarel}} package entirely has proved to be difficult (DM-2928, DM-2948), so instead I'm simply going to remove non-ingest code (and dead ingest code) from the package, along with its dependencies on {{ap}} and {{testing_endToEnd}}. Other dependencies will be retained even if they aren't necessary for the code that will remain in {{datarel}}, to support {{lsstDoxygen}}'s use of {{datarel}} as a top-level package for documentation generation.",1 +"DM-2952","06/19/2015 15:38:58","Crop needs to be refactored","This class needs to be refactored to be consistent with the FitsRead class, which treats all data types as float. Thus the bitpix in this class no longer has to be treated based on its value.",3 +"DM-2966","06/23/2015 02:07:48","Design CSS that supports updates","Design how to redesign CSS; we currently take a snapshot when the czar starts. It is too static. ",2 +"DM-2976","06/25/2015 12:51:12","SourceCatalog.getChildren requires preconditions but does not check them","This is a code transfer from HSC-1247.",2 +"DM-2977","06/25/2015 12:54:48","Miscellaneous CModel improvements from HSC","This improves handling of several edge case failure modes, tweaks the configuration to improve performance, and adds some introspection useful for Jose Garmilla's tests.
Includes HSC-1288, HSC-1284, HSC-1228, HSC-1250, HSC-1264, HSC-1273, HSC-1240, HSC-1249, HSC-1238, HSC-990, HSC-1155, HSC-1191",2 +"DM-2980","06/25/2015 14:53:10","refactor coaddition code","The HSC fork has coaddition code in two places: pipe_tasks and hscPipe. The code in hscPipe is what we use (though that depends on the code in pipe_tasks in places), while the code in pipe_tasks is more similar to what's currently on the LSST side. We want to bring the refactored version in hscPipe back to LSST, but we want to put it directly in pipe_tasks to remove the code duplication that currently exists on the HSC side. Work on this issue should begin with an RFC that details the proposed changes. Note that this should not bring over the ""safe coadd clipping"" code, which is DM-2915.",5 +"DM-2981","06/25/2015 15:15:17","polygon masking in CoaddPsf","We need to create polygon-based masks of the usable area of the focal plane, persist them with the exposure, and include them in coaddition of PSFs and aperture corrections. This includes HSC issues HSC-972, HSC-973, HSC-974, HSC-975, HSC-976. At least some of this will be blocked by DM-833, which is the port issue for coaddition of aperture corrections.",8 +"DM-2982","06/25/2015 16:38:56","Updating node status in qserv-admin to INACTIVE fails","In qserv-admin.py when attempting to update a node status from ACTIVE to INACTIVE the following error is produced: {code} > update node worker2 state=INACTIVE; Traceback (most recent call last): File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-admin.py"", line 650, in main() File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-admin.py"", line 645, in main parser.receiveCommands() File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-admin.py"", line 163, in receiveCommands self.parse(cmd[:pos]) File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-admin.py"", line 180, in parse self._funcMap[t](tokens[1:]) File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-admin.py"", line 380, in _parseUpdate self._parseUpdateNode(tokens[1:]) File ""/usr/local/home/vaikunth/src/qserv/bin/qserv-admin.py"", line 405, in _parseUpdateNode self._impl.setNodeState(**options) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/admin/qservAdmin.py"", line 660, in setNodeState self._kvI.set(nodeKey, state) File ""/usr/local/home/vaikunth/src/qserv/lib/python/lsst/qserv/css/kvInterface.py"", line 415, in set self._zk.set(k, v) File ""/usr/local/home/vaikunth/qserv/Linux64/kazoo/2.0b1+1/lib/python/kazoo-2.0b1-py2.7.egg/kazoo/client.py"", line 1170, in set return self.set_async(path, value, version).get() File ""/usr/local/home/vaikunth/qserv/Linux64/kazoo/2.0b1+1/lib/python/kazoo-2.0b1-py2.7.egg/kazoo/client.py"", line 1182, in set_async raise TypeError(""value must be a byte string"") {code}",1 +"DM-2985","06/26/2015 16:37:58","Integrate javascript build with gradle","Integrate the javascript build tool webpack with gradle.",2 +"DM-2987","06/26/2015 17:03:12","Modify IpacTableParser to support extra wide tables.","IpacTableParser fails to load IPAC tables with extra wide headers and columns. Replace the logic for reading header and column information so that it will support any file/size.",2 +"DM-2989","06/26/2015 17:17:23","XY plot needs to be able to handle multiple tables with the same name","XY plot was relying on a table request object to cache previously loaded tables. This was done for performance reasons.
However, the table request is not reliable since the same request may be submitted multiple times.",2 +"DM-2992","06/26/2015 22:03:30","Search processors to get image, table, or json from an external task","Implement three search processors, which use the External Task Launcher (DM-2991): - to get a table (possibly in binary FITS format) - to get an image - to get JSON",8 +"DM-2993","06/28/2015 01:10:08","Products must not depend on anaconda","{{setupRequired(anaconda)}} should be removed from webservcommon.table. We want to keep the stack buildable with any python 2.7, and should not explicitly depend on anaconda.",1 +"DM-2997","06/29/2015 11:45:23","Bump eups anaconda package to 2.2","By popular request. ",1 +"DM-3029","07/01/2015 00:55:34","Set up Slack for evaluation"," Free account procured and tested by various volunteers; the next step is to apply for non-profit status, which gives us the first paid tier free to 100 users. ",1 +"DM-3030","07/01/2015 00:57:24","Set up Discourse for evaluation."," Server up on DO at community.lsst.org. Email needs fixing before volunteer users can be invited. ",1 +"DM-3031","07/01/2015 07:05:37","Addressing File corruption in iRODS 4.1.x","We examine solutions for repairing corrupt files within an iRODS 4.1.x zone.",2 +"DM-3037","07/02/2015 02:04:59","remove lsst/log wrapper from Qserv","The lsst/log API looks stable now, so removing the wrapper would simplify the code.",1 +"DM-3090","07/02/2015 18:38:08","Implement test suite for new class SqlTransaction","A test that shows that transactions are properly committed/aborted would be nice to have.",1 +"DM-3091","07/03/2015 10:39:02","Remove unused function populateState() ","Qserv no longer seems to relaunch a chunk query when it fails (see DM-2643), and this function is now unused: {code:bash} qserv@clrinfopc04:~/src/qserv (master)$ grep -r populateState core/ core/modules/qdisp/Executive.cc:void populateState(lsst::qserv::qdisp::ExecStatus& es, {code} ",0.5 +"DM-3102","07/07/2015 10:58:52","Resolve segmentation fault in LoggingEvent destructor","There seems to be a possible race condition in log4cxx::spi::LoggingEvent::~LoggingEvent. I've had multiple segmentation faults in that function. In all cases, another thread was involved in writing. In at least 2 cases, the second thread was in XrdCl::LogOutFile::Write. ",5 +"DM-3104","07/08/2015 13:48:31","Add ""ORDER BY"" clause to lua SQL query on result table","If the user query has ""ORDER BY"", then lua can't just execute ""SELECT * FROM result"" because the order for such a query is not guaranteed. To fix that, we need to add an ""ORDER BY"" clause to the ""SELECT * FROM result"" query on the lua side. Once we have the above, we might want to remove ""ORDER BY"" from the query class which runs a merge step on the czar (this has to be done in the query analysis step).",8 +"DM-3106","07/09/2015 08:13:11","Add slot for calibration flux","This is a port of [HSC-1005|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1005].",2 +"DM-3108","07/09/2015 11:33:07","Use aperture flux for photometric calibration","This is a port of work performed on HSC but without a ticket.
Relevant commits are: * [05bef6|https://github.com/HyperSuprime-Cam/meas_astrom/commit/05bef629adc37e44ea8482aab88e2eb38a47e3a0] * [4a6be5|https://github.com/HyperSuprime-Cam/meas_astrom/commit/4a6be51c53f61e70f151de7f29863cb723197a99] * [69d35a|https://github.com/HyperSuprime-Cam/obs_subaru/commit/69d35a890234e37c1142ddbeff43e62fe36e6c45] * [9c996d|https://github.com/HyperSuprime-Cam/obs_subaru/commit/9c996d75c423ce03fb54c4300d9c7561b5c1ea99]",1 +"DM-3109","07/09/2015 11:41:15","Add support for accessing schema from QueryContext","When we are analyzing a query, sometimes there are situations where we need to know the schema of the tables involved in the query. It will also be useful for checking whether a user is authorized to run a query, and for queries like ""SHOW CREATE TABLE"". This story involves writing code that will provide access to the schema.",3 +"DM-3110","07/09/2015 14:44:37","qserv code cleanup","I made some random cleanup of the qserv code while playing with css v2. I want to push these changes to master, thus I am creating this story for it. It involves improvements to logging in UserQueryFactory and Facade (both are now per-module), removing unnecessary namespace qualifiers, and whitespace cleanup.",1 +"DM-3126","07/13/2015 16:24:18","gcc 4.8 package does not create a symlink bin/cc","I created a new lsst package named ""gcc"" that contains Mario's gcc 4.8 package. I used it to build lsst_distrib on lsst-dev and it worked just fine. Unfortunately the package does not include bin/cc (which should be a symlink to bin/gcc), and this is wanted because the LSST build system uses cc to build C code. The desired fix is to modify the installer to make a symlink bin/cc that points to bin/gcc.",2 +"DM-3133","07/13/2015 22:53:59","add ""dax_"" prefix to data access related packages","As agreed at [Data Access Mtg 2015/07/13|https://confluence.lsstcorp.org/display/DM/Data+Access+Meeting+2015-07-13], add the dax_ prefix to webserv, webservcommon, webserv_client, dbserv, imgserv, metaserv",1 +"DM-3137","07/14/2015 10:29:00","Handle bad pixels in image stacker","We currently OR together all mask bits, but we need to be cleverer about how we handle pixels that are bad in some but not all inputs. This is a port of work carried out on [HSC-152|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-152].",1 +"DM-3139","07/14/2015 12:50:18","HSC backport: extra ""refColumn"" class attributes in multiband","This is a transfer of changesets for [HSC-1283|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1283]. ",0.5 +"DM-3140","07/14/2015 14:01:53","add gcc to list of packages in lsstsw","Add gcc to the list of packages in etc/repos.yaml in lsstsw",1 +"DM-3141","07/14/2015 15:39:02","Reduce verbosity of astrometry","The astrometry.net solver that runs by default in meas_astrom 10.1 is very verbose.
Here's an example running HSC data with an SDSS reference catalog: {code} $ processCcd.py /tigress/HSC/HSC --output /tigress/pprice/lsst --id visit=904020 ccd=49 --clobber-config : Loading config overrride file '/home/pprice/LSST/obs/subaru/config/processCcd.py' WARNING: Unable to use psfex: No module named extensions.psfex.psfexPsfDeterminer hscAstrom is not setup; using LSST's meas_astrom instead Cannot import lsst.meas.multifit: disabling CModel measurements Cannot import lsst.meas.extensions.photometryKron: disabling Kron measurements Cannot enable shapeHSM ('MEAS_EXTENSIONS_SHAPEHSM_DIR'): disabling HSM shape measurements Cannot import lsst.meas.extensions.photometryKron: disabling Kron measurements Cannot enable shapeHSM ('MEAS_EXTENSIONS_SHAPEHSM_DIR'): disabling HSM shape measurements : Loading config overrride file '/home/pprice/LSST/obs/subaru/config/hsc/processCcd.py' : input=/tigress/HSC/HSC : calib=None : output=/tigress/pprice/lsst CameraMapper: Loading registry registry from /tigress/pprice/lsst/_parent/registry.sqlite3 CameraMapper: Loading calibRegistry registry from /tigress/HSC/HSC/CALIB/calibRegistry.sqlite3 processCcd: Processing {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904020, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 49, 'expTime': 30.0} processCcd.isr: Performing ISR on sensor {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904020, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 49, 'expTime': 30.0} processCcd.isr WARNING: Cannot write thumbnail image; hsc.fitsthumb could not be imported. afw.image.MaskedImage WARNING: Expected extension type not found: IMAGE processCcd.isr: Applying linearity corrections to Ccd 49 processCcd.isr.crosstalk: Applying crosstalk correction afw.image.MaskedImage WARNING: Expected extension type not found: IMAGE : Empty WCS extension, using FITS header processCcd.isr: Set 0 BAD pixels to 647.04 processCcd.isr WARNING: There were 6192 unmasked NaNs processCcd.isr WARNING: Cannot write thumbnail image; hsc.fitsthumb could not be imported. processCcd.isr: Flattened sky level: 647.130493 +/- 12.733898 processCcd.isr: Measuring sky levels in 8x16 grids: 648.106765 processCcd.isr: Sky flatness in 8x16 grids - pp: 0.024087 rms: 0.006057 processCcd.calibrate: installInitialPsf fwhm=5.88235294312 pixels; size=15 pixels processCcd.calibrate.repair: Identified 80 cosmic rays. processCcd.calibrate.detection: Detected 303 positive sources to 5 sigma. processCcd.calibrate.detection: Resubtracting the background after object detection processCcd.calibrate.initialMeasurement: Measuring 303 sources (303 parents, 0 children) processCcd.calibrate.astrometry: Applying distortion correction processCcd.calibrate.astrometry: Solving astrometry LoadReferenceObjects: read index files processCcd.calibrate.astrometry.solver: Number of selected sources for astrometry : 258 processCcd.calibrate.astrometry.solver: Got astrometric solution from Astrometry.net LoadReferenceObjects: getting reference objects using center (1023.5, 2084.5) pix = Fk5Coord(320.3431396, 0.5002365, 2000.00) sky and radius 0.00194896 rad LoadReferenceObjects: search for objects at Fk5Coord(320.3431396, 0.5002365, 2000.00) with radius 0.111667372351 deg LoadReferenceObjects: found 495 objects LoadReferenceObjects: trimmed 257 out-of-bbox objects, leaving 238 processCcd.calibrate.astrometry.solver: Fit WCS: use iter 2 because it had less linear scatter than the next iter: 0.307471 vs. 
0.320229 pixels processCcd.calibrate.astrometry: 186 astrometric matches processCcd.calibrate.astrometry: Refitting WCS processCcd.calibrate.astrometry: Astrometric scatter: 0.047945 arcsec (with non-linear terms, 174 matches, 12 rejected) processCcd.calibrate.measurePsf: Measuring PSF /tigress/HSC/LSST/stack10_1/Linux64/anaconda/2.1.0-4-g35ca374/lib/python2.7/site-packages/numpy/core/_methods.py:59: RuntimeWarning: Mean of empty slice. warnings.warn(""Mean of empty slice."", RuntimeWarning) /tigress/HSC/LSST/stack10_1/Linux64/anaconda/2.1.0-4-g35ca374/lib/python2.7/site-packages/numpy/core/_methods.py:71: RuntimeWarning: invalid value encountered in double_scalars ret = ret.dtype.type(ret / rcount) /home/pprice/LSST/meas/algorithms/python/lsst/meas/algorithms/objectSizeStarSelector.py:143: RuntimeWarning: invalid value encountered in less update = dist < minDist processCcd.calibrate.measurePsf: PSF star selector found 163 candidates processCcd.calibrate.measurePsf: PSF determination using 114/163 stars. processCcd.calibrate.repair: Identified 92 cosmic rays. processCcd.calibrate: Fit and subtracted background processCcd.calibrate.measurement: Measuring 303 sources (303 parents, 0 children) processCcd.calibrate.astrometry: Applying distortion correction processCcd.calibrate.astrometry: Solving astrometry processCcd.calibrate.astrometry.solver: Number of selected sources for astrometry : 258 Solver: Arcsec per pix range: 0.153025, 0.18516 Image size: 2054 x 4186 Quad size range: 205.4, 4662.78 Objs: 0, 50 Parity: 0, normal Use_radec? yes, (320.343, 0.500178), radius 1 deg Verify_pix: 1 Code tol: 0.01 Dist from quad bonus: yes Distractor ratio: 0.25 Log tune-up threshold: inf Log bail threshold: -230.259 Log stoplooking threshold: inf Maxquads 0 Maxmatches 0 Set CRPIX? no Tweak? no Indexes: 3 /tigress/HSC/astrometry_net_data/sdss-dr9-fink-v5b/sdss-dr9-fink-v5b_and_263_0.fits /tigress/HSC/astrometry_net_data/sdss-dr9-fink-v5b/sdss-dr9-fink-v5b_and_263_1.fits /tigress/HSC/astrometry_net_data/sdss-dr9-fink-v5b/sdss-dr9-fink-v5b_and_263_2.fits Field: 258 stars Quad scale range: [641.674, 2208.56] pixels object 1 of 50: 0 quads tried, 0 matched. object 2 of 50: 0 quads tried, 0 matched. object 3 of 50: 0 quads tried, 0 matched. object 4 of 50: 0 quads tried, 0 matched. object 5 of 50: 0 quads tried, 0 matched. object 6 of 50: 0 quads tried, 0 matched. Got a new best match: logodds 787.099. log-odds ratio 787.099 (inf), 178 match, 1 conflict, 75 distractors, 220 index. RA,Dec = (320.343,0.500213), pixel scale 0.167612 arcsec/pix. Hit/miss: Hit/miss: ++-+++++-++++++++++++--++-+--+++++-+-+++++++++-+++++++-+++++-+++++++++++++-++++++-++++++-+++++++-+++ Pixel scale: 0.167612 arcsec/pix. Parity: pos. processCcd.calibrate.astrometry.solver: Got astrometric solution from Astrometry.net LoadReferenceObjects: getting reference objects using center (1023.5, 2084.5) pix = Fk5Coord(320.3431396, 0.5002365, 2000.00) sky and radius 0.00194896 rad LoadReferenceObjects: search for objects at Fk5Coord(320.3431396, 0.5002365, 2000.00) with radius 0.111667328272 deg LoadReferenceObjects: found 495 objects LoadReferenceObjects: trimmed 257 out-of-bbox objects, leaving 238 processCcd.calibrate.astrometry.solver: Fit WCS: use iter 2 because it had less linear scatter than the next iter: 0.306732 vs. 
0.320115 pixels processCcd.calibrate.astrometry: 186 astrometric matches processCcd.calibrate.astrometry: Refitting WCS processCcd.calibrate.astrometry: Astrometric scatter: 0.048271 arcsec (with non-linear terms, 174 matches, 12 rejected) processCcd.calibrate.photocal: Not applying color terms because config.applyColorTerms is False processCcd.calibrate.photocal: Magnitude zero point: 30.685281 +/- 0.058711 from 173 stars processCcd.calibrate: Photometric zero-point: 30.685281 processCcd.detection: Detected 1194 positive sources to 5 sigma. processCcd.detection: Resubtracting the background after object detection processCcd.deblend: Deblending 1194 sources processCcd.deblend: Deblended: of 1194 sources, 143 were deblended, creating 358 children, total 1552 sources processCcd.measurement: Measuring 1552 sources (1194 parents, 358 children) processCcd WARNING: Persisting background models processCcd: Matching icSource and Source catalogs to propagate flags. processCcd: Matching src to reference catalogue LoadReferenceObjects: getting reference objects using center (1023.5, 2087.5) pix = Fk5Coord(320.3429016, 0.5001781, 2000.00) sky and radius 0.00195667 rad LoadReferenceObjects: search for objects at Fk5Coord(320.3429016, 0.5001781, 2000.00) with radius 0.112109149864 deg LoadReferenceObjects: found 499 objects LoadReferenceObjects: trimmed 261 out-of-bbox objects, leaving 238 processCcd.calibrate.astrometry.solver: Fit WCS: use iter 1 because it had less linear scatter than the next iter: 0.300624 vs. 0.300652 pixels {code} The verbosity of the astrometry module is out of proportion with the rest of the modules, which makes it difficult to follow the processing. This is a pull request for fixes I have made.",1 +"DM-3142","07/14/2015 16:38:01","Port HSC optimisations for reading astrometry.net catalog","Some astrometry.net catalogs used in production can be quite large, and currently all of the catalog must be read in order to determine bounds for each component. This can make the loading of the catalog quite slow (e.g., 144 sec out of 177 sec to process an HSC image, using an SDSS DR9 catalog). We have HSC code that caches the required information, making the catalog load much faster. The code is from the following HSC issues: * [HSC-1087: Make astrometry faster|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1087] * [HSC-1143: Floating point exception in astrometry|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1143] * [HSC-1178: Faster construction of Astrometry.net catalog|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1178] * [HSC-1179: Assertion failure in astrometry.net|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1178] While there have been some changes to the LSST astrometry code that will mean we can't directly cherry-pick the HSC code, yet I think the main structure remains, so the approach can be copied without much effort.",3 +"DM-3151","07/15/2015 14:29:35","CI validation of lsstsw's repos.yaml","Having some sort of automatic ""lint check"" of the repos.yaml file is desirable due to the length of time required to do a full up test of lsstsw. It should be possible to cobble a sanity checker together that can be run from travis-ci.",1 +"DM-3153","07/16/2015 13:20:51","meas_base still uses eups in tests","{{tests/centroid.py}} uses EUPS to determine the location of the data file used by the test. 
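A minimal sketch of the EUPS-free alternative, resolving the data relative to the test module itself (the data file name below is hypothetical): {code:python}
import os.path

# Locate test data next to this test module instead of asking EUPS
# for the package root.
testDir = os.path.dirname(__file__)
dataPath = os.path.join(testDir, 'data', 'calexp.fits')  # hypothetical file name
{code}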
This needs to be fixed to use a location relative to the test file.",1 +"DM-3154","07/16/2015 14:30:21","meas_astrom still using eups in tests","In DM-2636 we modified the tests to be skipped if EUPS is not available. I've had a closer look and all the ones I have glanced at seem to be easily fixable to run without EUPS. The tests seem to be using EUPS to locate the {{meas_astrom}} (effectively asking EUPS for the location of the test file), then a path to the astrometry.net test data within the {{tests/}} directory is located and then EUPS is asked to setup {{astrometry_net_data}} using that path. Since the table files are all empty this is the equivalent to simply assigning the {{ASTROMETRY_NET_DATA_DIR}} environment variable directly to the path in the tests sub-directory. Making this change to one of the tests seems to work so I will change the rest.",2 +"DM-3160","07/16/2015 17:17:11","Improve name and default value of MeasureApCorrConfig.refFluxAlg","The config name refFluxAlg should be refFluxField (since it is a flux field name prefix) and the default should be base_CircularApertureFlux_5 instead of base_CircularApertureFlux_0 (thus giving a reasonable radius instead of one that is ridiculously too small). I should have handled it on DM-436 but it slipped through.",1 +"DM-3173","07/17/2015 14:28:14","In CalibrateTask if one disables psf determination then aperture correction will fail","In pipe_tasks CalibrateTask, by default aperture correction uses source flag ""calib_psfUsed"" to decide if a source is acceptable to use for measuring aperture correction. If PSF determination is disabled then this flag is never set and aperture correction will fail with a complaint that there are 0 sources. ",1 +"DM-3174","07/17/2015 15:57:16","CalibrateTask instantiates measureApCorr, applyApCorr and photocal subtasks using the wrong schema","CalibrateTask instantiates measureApCorr, applyApCorr and photocal subtasks using the initial schema ""schema1"" instead of the final schema. Normally this would not matter since most of the fields are shared, but aperture correction wants aperture flux at a larger radius than the narrowest option, and schema1 may only provide the narrowest option. In any case it is safer to instantiate those three subtasks using the final schema, since they are only ever run on the final schema. 
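Roughly, the intended change in {{CalibrateTask.\_\_init\_\_}} would look like this (a sketch only, not the actual patch; the attribute names follow the schema1/final-schema naming above): {code:python}
# Pass the final schema to the subtasks that only ever run on the
# final catalog (sketch; previously these received schema1).
self.makeSubtask('measureApCorr', schema=self.schema)
self.makeSubtask('applyApCorr', schema=self.schema)
self.makeSubtask('photocal', schema=self.schema)
{code}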
(Several other subtasks are run on both the initial and final schema, and should continue to be instantiated using schema1).",1 +"DM-3175","07/17/2015 16:23:28","Build 2015_08 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-3176","07/17/2015 16:27:45","Build and Test 2015_09 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",3 +"DM-3177","07/17/2015 16:27:56","Build and Test 2015_10 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",3 +"DM-3178","07/17/2015 16:28:05","Build and Test 2015_11 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",3 +"DM-3179","07/17/2015 16:28:14","Build and Test 2015_12 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",3 +"DM-3181","07/17/2015 16:28:32","Build and Test 2016_02 Qserv Release","See https://confluence.lsstcorp.org/display/DM/Qserv+Release+Procedure for recipe.",1 +"DM-3182","07/19/2015 11:32:19","Aperture correction not applied for some measurements","Aperture correction needs to be applied every time a measurement is run after it is first measured in CalibrateTask. As of DM-436 aperture correction is only being applied in CalibrateTask, which for example means the information is overwritten during the final measurement of ProcessImageTask.run. This is probably best done by adding code to apply aperture correction to BaseMeasurementTask, so it is inherited by SingleFrameMeasurementTask and ForcedMeasurementTask.",5 +"DM-3192","07/20/2015 13:19:15","Re-implement watcher based on new CSS implementation","The current watcher implementation (in {{admin/bin/watcher.py}}) is based on directly watching zookeeper updates via kazoo. If we are to re-implement CSS based on mysql then the watcher needs to be updated to support it. MySQL does not have a watch mechanism, so this has to be done via polling, or via some other mechanism if synchronous notifications are needed.",8 +"DM-3194","07/21/2015 02:46:15","Fix cluster install procedure and improve docker support","Document how to update a cluster from a Qserv release: see http://www.slac.stanford.edu/exp/lsst/qserv/2015_07/HOW-TO/cluster-deployment.html",0.5 +"DM-3196","07/21/2015 23:25:57","makeWcs() chokes on decam images in 10.1","In 10.0, processCcdDecam.py could process decam images to completion (whether the WCS was read correctly is a different question). Now it fails on makeWcs() (see traceback below), and I suspect this change in behavior is related to DM-2883 and DM-2967. Repository with both data and code to reproduce: http://www.astro.washington.edu/users/yusra/reproduce/reproduceMakeWcsErr.tar.gz (apologies for the size). The attachment is a document describing the WCS representation in the images from the community pipeline, courtesy of Francisco Forster. Please advise. This ticket captures any changes made to afw. 
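In essence the failure reduces to the following (a hedged sketch; the file name is hypothetical): {code:python}
import lsst.afw.image as afwImage

# Read the community-pipeline instcal header and try to build a Wcs from it.
md = afwImage.readMetadata('instcal0232847.fits.fz')  # hypothetical path
wcs = afwImage.makeWcs(md)  # raises RuntimeError: 'Failed to setup wcs structure with wcsset'
{code} The full reproduction log follows.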
{code} D-108-179-166-118:decam yusra$ processCcdDecam.py newTestRepo/ --id visit=0232847 ccdnum=10 --config calibrate.doPhotoCal=False calibrate.doAstrometry=False calibrate.measurePsf.starSelector.name=""secondMoment"" doWriteCalibrateMatches=False --clobber-config : Loading config overrride file '/Users/yusra/lsst_devel/LSST/repos/obs_decam_ya/config/processCcdDecam.py' : Config override file does not exist: '/Users/yusra/lsst_devel/LSST/repos/obs_decam_ya/config/decam/processCcdDecam.py' : input=/Users/yusra/decam/newTestRepo : calib=None : output=None CameraMapper: Loading registry registry from /Users/yusra/decam/newTestRepo/registry.sqlite3 processCcdDecam: Processing {'visit': 232847, 'ccdnum': 10} makeWcs WARNING: Stripping PVi_j keys from projection RA---TPV/DEC--TPV processCcdDecam FATAL: Failed on dataId={'visit': 232847, 'ccdnum': 10}: File ""src/image/Wcs.cc"", line 130, in void lsst::afw::image::Wcs::_initWcs() Failed to setup wcs structure with wcsset. Status 5: Invalid parameter value {0} lsst::pex::exceptions::RuntimeError: 'Failed to setup wcs structure with wcsset. Status 5: Invalid parameter value' Traceback (most recent call last): File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/pipe_base/10.1-3-g18c2ba7+49/python/lsst/pipe/base/cmdLineTask.py"", line 320, in __call__ result = task.run(dataRef, **kwargs) File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/pipe_base/10.1-3-g18c2ba7+49/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/Users/yusra/lsst_devel/LSST/repos/obs_decam_ya/python/lsst/obs/decam/processCcdDecam.py"", line 77, in run mi = exp.getMaskedImage() File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/daf_persistence/10.1-1-g6edbc00+28/python/lsst/daf/persistence/readProxy.py"", line 41, in __getattribute__ subject = oga(self, '__subject__') File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/daf_persistence/10.1-1-g6edbc00+28/python/lsst/daf/persistence/readProxy.py"", line 136, in __subject__ set_cache(self, get_callback(self)()) File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/daf_persistence/10.1-1-g6edbc00+28/python/lsst/daf/persistence/butler.py"", line 242, in innerCallback(), dataId) File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/daf_persistence/10.1-1-g6edbc00+28/python/lsst/daf/persistence/butler.py"", line 236, in location, dataId) File ""/Users/yusra/lsst_devel/LSST/repos/obs_decam_ya/python/lsst/obs/decam/decamMapper.py"", line 118, in bypass_instcal wcs = afwImage.makeWcs(md) File ""/Users/yusra/lsst_devel/LSST/DMS5/DarwinX86/afw/10.1-26-g9124caf+1/python/lsst/afw/image/imageLib.py"", line 8706, in makeWcs return _imageLib.makeWcs(*args) RuntimeError: File ""src/image/Wcs.cc"", line 130, in void lsst::afw::image::Wcs::_initWcs() Failed to setup wcs structure with wcsset. Status 5: Invalid parameter value {0} lsst::pex::exceptions::RuntimeError: 'Failed to setup wcs structure with wcsset. 
Status 5: Invalid parameter value' {code} ",2 +"DM-3199","07/22/2015 13:38:05","Standardize Qserv install procedure: step 1 build docker container for master/worker instance and development version ","- shmux could be used for parallel ssh (remove Qserv built-in one) - look at ""serf and consul"" (See Confluence pages) - improve doc: http://www.slac.stanford.edu/exp/lsst/qserv/2015_07/HOW-TO/index.html - run multiple instances/versions of Qserv using different run dir/ports and the same data",8 +"DM-3204","07/23/2015 16:02:53","W16 Data Access and Db Release Documentation","Write Release documentation covering Data Access and Database work.",5 +"DM-3209","07/23/2015 17:13:03","Add debugging for astrometry.net solver","To be able to debug astrometric matching, it helps to be able to visualise the source positions, the distorted source positions, and the reference positions. This is a pull request to add these.",0.5 +"DM-3214","07/27/2015 10:39:34","ChebyshevBoundedField should use _ not . as field separators for persistence","ChebyshevBoundedField uses ""."" instead of ""\_"" as field separators in its afw table persistence. This is the old way of doing things, and unfortunately causes errors when reading in older versions of tables, because afw converts ""."" to ""_"" in that situation. This shows up as a unit test failure in DM-2981 (brought over from HSC) when an older version table is read in. It is an open question whether to fix this as part of DM-2981 (which conveniently has a test that shows the problem, though not intentionally so) or separately, in which case a new test is wanted. In the former case I'm happy to do the work so I can finish DM-2981. Many thanks to Jim Bosch for diagnosing the problem.",1 +"DM-3218","07/27/2015 15:43:47","unable to create public images","Errors are returned when attempting to upload an image marked as public.",1 +"DM-3223","07/27/2015 16:23:26","Improve czar-worker communication debugging","Add features to make it easier to debug communication problems. Particularly, record the source of a message, and remove extraneous messages.",2 +"DM-3227","07/27/2015 17:15:58","openstack API endpoint is broken","Similar to what was observed in DM-3226, the referral endpoints returned by {code:java} https://nebulous.ncsa.illinois.edu:5000 {code} are not FQDNs. This fundamentally breaks any attempt to use the API one step past authenticating with keystone. 
This is an example HTTP response: {code:java} HTTP/1.1 200 OK Date: Mon, 27 Jul 2015 23:11:02 GMT Server: Apache/2.4.10 (Ubuntu) Vary: X-Auth-Token X-Distribution: Ubuntu x-openstack-request-id: req-ac7bb613-86ef-43ab-a663-75c2ed3fb124 Content-Length: 1656 Content-Type: application/json {""access"": {""token"": {""issued_at"": ""2015-07-27T23:11:02.342216"", ""expires"": ""2015-07-28T00:11:02Z"", ""id"": ""99b843d4baf94569a0d34ca4fecb470c"", ""tenant"": {""description"": null, ""enabled"": true, ""id"": ""d1f16653856540d386224fb057b5b00c"", ""name"": ""LSST""}, ""audit_ids"": [""fAP8851vTQi1n5pYmNoIjw""]}, ""serviceCatalog"": [{""endpoints"": [{""adminURL"": ""http://nebula:9292"", ""region"": ""RegionOne"", ""internalURL"": ""http://nebula:9292"", ""id"": ""49365a8e8fe743af9d517e84a98e3ee9"", ""publicURL"": ""http://nebula:9292""}], ""endpoints_links"": [], ""type"": ""image"", ""name"": ""glance""}, {""endpoints"": [{""adminURL"": ""http://nebula:8774/v2/d1f16653856540d386224fb057b5b00c"", ""region"": ""RegionOne"", ""internalURL"": ""http://nebula:8774/v2/d1f16653856540d386224fb057b5b00c"", ""id"": ""c1e31df3656042ef9c5502efd7d574f2"", ""publicURL"": ""http://nebula:8774/v2/d1f16653856540d386224fb057b5b00c""}], ""endpoints_links"": [], ""type"": ""compute"", ""name"": ""nova""}, {""endpoints"": [{""adminURL"": ""http://nebula:9696"", ""region"": ""RegionOne"", ""internalURL"": ""http://nebula:9696"", ""id"": ""266c9dc8e0344f8fa3f078652e868443"", ""publicURL"": ""http://nebula:9696""}], ""endpoints_links"": [], ""type"": ""network"", ""name"": ""neutron""}, {""endpoints"": [{""adminURL"": ""http://nebula:35357/v2.0"", ""region"": ""RegionOne"", ""internalURL"": ""http://nebula:5000/v2.0"", ""id"": ""5d474008bcee4f44800546e3f3302404"", ""publicURL"": ""http://nebula:5000/v2.0""}], ""endpoints_links"": [], ""type"": ""identity"", ""name"": ""keystone""}], ""user"": {""username"": ""jhoblitt"", ""roles_links"": [], ""id"": ""6ea0c8e153b04ae29572c5fd877b6ac3"", ""roles"": [{""name"": ""user""}], ""name"": ""jhoblitt""}, ""metadata"": {""is_admin"": 0, ""roles"": [""142761bd922e453294e9b7086a227cbc""]}}} {code} ",1 +"DM-3228","07/27/2015 18:59:01","evaluate NCSA OpenStack against SQRE requirements and provide feedback - part 1","See also https://confluence.lsstcorp.org/pages/viewpage.action?spaceKey=LDMDG&title=NCSA+Nebula+OpenStack+Issues",2 +"DM-3237","07/28/2015 15:04:04","Fix problems with no-result queries on multi-node setup","For queries like: select * from Object where id = qserv can't map it to any chunk, and it ends up executing SELECT * FROM qservTest_case01_qserv.Object_1234567890 AS QST_1_ WHERE objectId= the chunk 1234567890 is a special chunk and it exists on all nodes. And that fails with: (build/qdisp/QueryResource.cc:61) - Error provisioning, msg=Unable to write file; multiple files exist. code=2 ",1 +"DM-3241","07/28/2015 18:21:21","Create images for the mask bits at server side","LSST FITS images will have a extension that indicate the mask bits. In order to overlay the masks on the primary image, we need to turn the mask bits into a set of images. This task is to take the requested bits and FITS as input, output a set of images for each requested bit. Each bit will have different color. ",20 +"DM-3243","07/28/2015 18:29:00","Include polygon bounds in CoaddPsf logic","This is a port of [HSC-974|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-974]. 
Original description: The {{CoaddPsf}} class should use the polygon bounding areas that were added to {{Exposure}} and {{ExposureRecord}} in DM-2981 (was: HSC-973) when determining which PSF images to coadd.",1 +"DM-3245","07/28/2015 22:55:27","Add support for SUBMIT query parsing to czar","We need to be able to pass information from user about query type (sync/async). This may require tweaking the parser. ",2 +"DM-3249","07/28/2015 23:06:31","Revisit and document user-facing aspects of async queries","Outline all aspects of async queries that are affecting users, discuss with the DM team, and document. This includes things like: * managing async queries (checking status, terminating) * retrieving results from async queries * managing query results (purging policies etc) * probably more, need to think about it...",8 +"DM-3253","07/29/2015 00:10:14","Unify KVInterface python and c++ interfaces","Swig the C++ mysql-based KvInterface implementation. ",8 +"DM-3257","07/29/2015 07:53:01","Port flux.scaled from HSC","[HSC-1295|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1295] introduces {{flux.scaled}}, which measures the flux within a circular aperture that is set from the size of the PSF, scaled by some factor. Stephen Gwyn recommends using this as our fiducial calibration flux.",2 +"DM-3258","07/29/2015 08:27:36","CoaddPsf.getAveragePosition() is not a valid position","This is a port of [HSC-1138|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1138] to LSST. That is an aggregate of two related minor fixes: * {{CoaddInputRecorder}} should default to {{saveVisitGoodPix=True}} so that average positions in the {{CoaddPsf}} can be properly weighted; * {{computeAveragePosition}} and {{doComputeKernelImage}} should be consistent about the data included when determining whether a source is off image.",1 +"DM-3259","07/29/2015 09:38:37","Define polygon bounds for CCDs based on vignetted regions","This is a port of [HSC-976|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-976] to LSST. The original issue description was: We should set the polygon bounds (added in DM-2981 [was HSC-973]) for HSC CCD exposures to cover the non-vignetted regions. This should probably be done in ISR or some other camera-specific location. Note that, contrary to the description in DM-2981, this functionality was not included there.",1 +"DM-3347","07/30/2015 10:26:53","assertWcsNearlyEqualOverBBox and friends is too hard to use as a free function","assertWcsNearlyEqualOverBBox and similar functions elsewhere in afw were written to be methods of lsst.utils.tests.TestCase, so their first argument is a testCase. This is fine for use in unit tests, but a hassle to use as free functions because the user must provide a testCase argument (though it need only be a trivial class with a fail(self, msgStr) method). Worse, that minimal requirement is not documented, so technically providing a simple mock test case is unsafe. I have two proposals: - Document the fact that testCase need only support fail(self, msgStr). This makes it clear how to safely use these functions as free functions. - Allow testCase to be None, in which case RuntimeError is raised. That makes these functions even easier to use as free functions. ",1 +"DM-3349","07/30/2015 14:57:35","Add test case for ExposureRecord::contains","In DM-3243 we ported from HSC the ability to take account of the associated {{validPolygon}} when checking whether a point falls within an {{Exposure}}. 
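At minimum, a test would want to assert something like the following (a sketch only: the exposure record and wcs setup are elided, the coordinates are invented, and the Polygon import path is assumed): {code:python}
from lsst.afw.geom import Point2D
from lsst.afw.geom.polygon import Polygon

# A triangular valid region inside the record's bbox.
record.setValidPolygon(Polygon([Point2D(0, 0), Point2D(100, 0), Point2D(0, 100)]))
assert record.contains(Point2D(10, 10), wcs)      # inside bbox and polygon
assert not record.contains(Point2D(90, 90), wcs)  # inside bbox, outside polygon
{code}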
This functionality was not accompanied by an adequate unit test.",2 +"DM-3355","07/31/2015 12:47:47","Support the FITS cube reader","RSA needs to be able to read in the FITS cube generated by the Herschel project. We need to guide the effort so the code is generic enough for non-Herschel data.",1 +"DM-3356","07/31/2015 14:34:00","Fix Firefly build script so it'll work with latest version of gradle","The Firefly build was failing when using gradle version 2.5. Minor changes to the dependencies declaration fixed it.",1 +"DM-3358","07/31/2015 16:09:53","Add mysql-based test to multi-node integration test","At the moment the multi-node integration test runs only against Qserv; it does not run on plain mysql, and thus we can't validate results. The story involves tweaking qserv_testdata such that we can run the mysql test on the czar, and compare results from mysql and qserv.",5 +"DM-3377","08/03/2015 07:01:01","Initial issue investigation for the nebula openstack"," The nebula openstack system at NCSA first became available ~Fri Jul 24 and the week of Jul 27 -- 31 was spent testing and debugging issues that the LSST team identified within, for example, DM-3225, DM-3219, DM-3227 and others. ",20 +"DM-3387","08/03/2015 13:22:38","Make use of good pixel count when building CoaddPsfs","When building a CoaddPsf we have the ability to take account of the number of pixels contributed by the inputs (see http://ls.st/paj and DM-3258). However, the {{CoaddPsf}} constructor fails to use this information. It should copy this field when copying the provided {{ExposureCatalog}}, so that {{computeAveragePosition}} can use it.",1 +"DM-3390","08/03/2015 16:58:35","Re-generate data for large scale tests at in2p3","Sources were incorrectly duplicated and need to be redone.",3 +"DM-3391","08/03/2015 17:13:54","Refactor Zscale.java class ","Early this year, the decision was made that all data types would be converted to float in FitsRead. Thus, the bitpix value is no longer relevant. Zscale, however, still uses bitpix to test the data type. It should be refactored in the same manner as FitsRead etc. 
",2 +"DM-3398","08/04/2015 16:44:21","Fix problem with default_engine","Fix the problem: {quote} 08/04/2015 05:39:47 werkzeug INFO: 141.142.237.30 - - [04/Aug/2015 17:39:47] ""GET /meta/v0/ HTTP/1.1"" 200 - 08/04/2015 05:39:49 __main__ ERROR: Exception on /meta/v0/db [GET] Traceback (most recent call last): File ""/home/becla/stack/Linux64/flask/0.10.1+8/lib/python/Flask-0.10.1-py2.7.egg/flask/app.py"", line 1817, in wsgi_app response = self.full_dispatch_request() File ""/home/becla/stack/Linux64/flask/0.10.1+8/lib/python/Flask-0.10.1-py2.7.egg/flask/app.py"", line 1477, in full_dispatch_request rv = self.handle_user_exception(e) File ""/home/becla/stack/Linux64/flask/0.10.1+8/lib/python/Flask-0.10.1-py2.7.egg/flask/app.py"", line 1381, in handle_user_exception reraise(exc_type, exc_value, tb) File ""/home/becla/stack/Linux64/flask/0.10.1+8/lib/python/Flask-0.10.1-py2.7.egg/flask/app.py"", line 1475, in full_dispatch_request rv = self.dispatch_request() File ""/home/becla/stack/Linux64/flask/0.10.1+8/lib/python/Flask-0.10.1-py2.7.egg/flask/app.py"", line 1461, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File ""/nfs/home/becla/stack/repos/dax_metaserv/python/lsst/dax/metaserv/metaREST_v0.py"", line 59, in getDb return _resultsOf(text(query), scalar=True) File ""/nfs/home/becla/stack/repos/dax_metaserv/python/lsst/dax/metaserv/metaREST_v0.py"", line 122, in _resultsOf engine = current_app.config[""default_engine""] KeyError: 'default_engine' {quote}",1 +"DM-3400","08/05/2015 09:44:21","Eliminate circular aliases in slot centroid definition","[~smonkewitz] has discovered that our schema aliases for even the default configuration of measurement algorithms involve cycles, because the slot centroid algorithm contains a reference to its own flag. Fixing this should just involve an extra check in {{SafeCentroidExtractor}}.",1 +"DM-3404","08/05/2015 13:22:49","Port HSC updates to ingestImages.py","ingestImages.py provides a camera-agnostic manner of creating a data repository (including a registry). The HSC fork contains multiple improvements not present on the LSST side. We need these in order to ingest the HSC data.",2 +"DM-3411","08/05/2015 18:19:32","workspace functions specification document","The first version of the document is here https://confluence.lsstcorp.org/pages/viewpage.action?pageId=41783931",20 +"DM-3419","08/07/2015 13:07:45","obs_decam unit test for reading data ","The unit test wasn't working before and I edited the unit test of reading raw data. This got included with DM-3462. This unit test needs testdata_decam to be setup. The test fails with the stack b1597 at makeWcs (DM-3196). The afw branch u/yusra/DM-3196 is a temporary fix before DM-3196 is resolved. ",2 +"DM-3437","08/10/2015 18:28:09","Add column names metadata to db query results","Per discussion at data access meeting Aug 10, it'd be good to send column names with the query results.",2 +"DM-3440","08/11/2015 09:30:52","add meas_extensions_photometryKron to lsstsw, lsst_distrib","meas_extensions_photometryKron should be added to the CI system, since we are trying to keep it updated. This is blocked by DM-2429 because that includes a fix for a unit test (which the CI system would have caught).",1 +"DM-3442","08/11/2015 16:37:10","Processing y-band HSC data fails in loading reference sources","{code} processCcd.py /lsst3/HSC/data/ --output /raid/price/test --id visit=904400 ccd=50 [...] 
processCcd.calibrate.astrometry.solver.loadAN: Loading reference objects using center (1023.5, 2091) pix = Fk5Coord(319.8934727, -0.0006943, 2000.00) sky and radius 0.111920792477 deg processCcd FATAL: Failed on dataId={'taiObs': '2013-11-03', 'pointing': 672, 'visit': 904400, 'dateObs': '2013-11-03', 'filter': 'HSC-Y', 'field': 'STRIPE82L', 'ccd': 50, 'expTime': 30.0}: Could not find flux field(s) y_camFlux, y_flux Traceback (most recent call last): File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/cmdLineTask.py"", line 320, in __call__ result = task.run(dataRef, **kwargs) File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/pipe_tasks/10.1-28-gf9582e4+2/python/lsst/pipe/tasks/processCcd.py"", line 85, in run result = self.process(sensorRef, postIsrExposure) File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/pipe_tasks/10.1-28-gf9582e4+2/python/lsst/pipe/tasks/processImage.py"", line 160, in process calib = self.calibrate.run(inputExposure, idFactory=idFactory) File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/pipe_tasks/10.1-28-gf9582e4+2/python/lsst/pipe/tasks/calibrate.py"", line 457, in run astromRet = self.astrometry.run(exposure, sources1) File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/meas_astrom/10.1-19-g6e01b25+5/python/lsst/meas/astrom/anetAstrometry.py"", line 177, in run results = self.astrometry(sourceCat=sourceCat, exposure=exposure, bbox=bbox) File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/meas_astrom/10.1-19-g6e01b25+5/python/lsst/meas/astrom/anetAstrometry.py"", line 292, in astrometry astrom = self.solver.determineWcs(sourceCat=sourceCat, exposure=exposure, bbox=bbox) File ""/home/lsstsw/stack/Linux64/meas_astrom/10.1-19-g6e01b25+5/python/lsst/meas/astrom/anetBasicAstrometry.py"", line 409, in determineWcs return self.determineWcs2(sourceCat=sourceCat, **margs) File ""/home/lsstsw/stack/Linux64/meas_astrom/10.1-19-g6e01b25+5/python/lsst/meas/astrom/anetBasicAstrometry.py"", line 437, in determineWcs2 astrom = self.useKnownWcs(sourceCat, wcs=wcs, **kw) File ""/home/lsstsw/stack/Linux64/meas_astrom/10.1-19-g6e01b25+5/python/lsst/meas/astrom/anetBasicAstrometry.py"", line 308, in useKnownWcs calib = None, File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/meas_algorithms/10.1-15-g0d3ecf6/python/lsst/meas/algorithms/loadReferenceObjects.py"", line 173, in loadPixelBox loadRes = self.loadSkyCircle(ctrCoord, maxRadius, filterName) File ""/home/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+15/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/lsstsw/stack/Linux64/meas_astrom/10.1-19-g6e01b25+5/python/lsst/meas/astrom/loadAstrometryNetObjects.py"", line 141, in loadSkyCircle fluxField 
= getRefFluxField(schema=refCat.schema, filterName=filterName) File ""/home/lsstsw/stack/Linux64/meas_algorithms/10.1-15-g0d3ecf6/python/lsst/meas/algorithms/loadReferenceObjects.py"", line 40, in getRefFluxField raise RuntimeError(""Could not find flux field(s) %s"" % ("", "".join(fluxFieldList))) RuntimeError: Could not find flux field(s) y_camFlux, y_flux {code} We should be able to fix this by setting config parameters (e.g., {{calibrate.astrometry.solver.defaultFilter}} or {{calibrate.astrometry.solver.filterMap}}), but how do we keep that synched with the choice of reference catalog? And once we get past astrometry, we also have the same problem in photocal.",2 +"DM-3450","08/12/2015 22:07:19","Tweaks to configurations discovered during S15 tests","Apply tweaks we found useful when running large scale tests. This includes: # etc/my.cnf: change max_connections to 512 # add: {quote}export XRD_REQUESTTIMEOUT=64000 export XRD_STREAMTIMEOUT=64000 export XRD_DATASERVERTTL=64000 export XRD_TIMEOUTRESOLUTION=64000{quote} to init.d/qserv-czar # add: {quote}ulimit -c unlimited{quote} to all startup scripts in init.d. This will make sure a core file is always dumped when we have problems.",1 +"DM-3451","08/12/2015 22:09:19","Resolve problem with running many simultaneous queries","When we run with 110 simultaneous queries, the czar fails with ""uncaught exception"".",2 +"DM-3452","08/13/2015 01:01:32","Integrate pipelines with MySQL and Qserv","Load data produced by pipelines into MySQL (on lsst10), and Qserv",5 +"DM-3453","08/13/2015 10:22:10","AstrometryTask.run return not consistent with ANetAstrometryTask","ANetAstrometryTask.run returns matchMetadata but AstrometryTask.run returns matchMeta. The two must agree. It turns out that matchMeta is more widely used, so I'll standardize on that.",1 +"DM-3454","08/13/2015 11:56:09","Odd error message in getDistortedWcs","lsst.afw.image.utils.getDistortedWcs complains as follows if the provided exposure has no WCS: ""exposure must have a WCS to use as an initial guess"". It should not say anything about an initial guess. This is presumably a leftover from when the code was part of meas_astrom. Thanks to [~price] for pointing this out.",0 +"DM-3455","08/13/2015 12:26:56","ProcessImageTask.matchSources fails if using ANetAstrometryTask","ProcessImageTask.matchSources fails when using ANetAstrometryTask with the following error: {code} processCcd.calibrate.astrometry: Applying distortion correction processCcd FATAL: Failed on dataId={'taiObs': '2013-11-03', 'pointing': 672, 'visit': 904400, 'dateObs': '2013-11-03', 'filter': 'HSC-Y', 'field': 'STRIPE82L', 'ccd': 50, 'expTime': 30.0}: File ""src/table/Schema.cc"", line 239, in lsst::afw::table::SchemaItem lsst::afw::table::detail::SchemaImpl::find(const string&) const [with T = double; std::string = std::basic_string] Field or subfield withname 'astrom_distorted_x' not found with type 'D'. {0} lsst::pex::exceptions::NotFoundError: 'Field or subfield withname 'astrom_distorted_x' not found with type 'D'.' 
Traceback (most recent call last): File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+3/python/lsst/pipe/base/cmdLineTask.py"", line 320, in __call__ result = task.run(dataRef, **kwargs) File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+3/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_tasks/tickets.DM-3453-g086c9ddd0a/python/lsst/pipe/tasks/processCcd.py"", line 85, in run result = self.process(sensorRef, postIsrExposure) File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+3/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_tasks/tickets.DM-3453-g086c9ddd0a/python/lsst/pipe/tasks/processImage.py"", line 219, in process srcMatches, srcMatchMeta = self.matchSources(calExposure, sources) File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_tasks/tickets.DM-3453-g086c9ddd0a/python/lsst/pipe/tasks/processImage.py"", line 250, in matchSources astromRet = astrometry.loadAndMatch(exposure=exposure, sourceCat=sources) File ""/ssd/rowen/lsstsw/stack/Linux64/pipe_base/10.1-4-g6ba0cc7+3/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/ssd/rowen/lsstsw/stack/Linux64/meas_astrom/tickets.DM-3453-gbb2ad1f49c/python/lsst/meas/astrom/anetAstrometry.py"", line 321, in loadAndMatch with self.distortionContext(sourceCat=sourceCat, exposure=exposure) as bbox: File ""/ssd/rowen/lsstsw/anaconda/lib/python2.7/contextlib.py"", line 17, in __enter__ return self.gen.next() File ""/ssd/rowen/lsstsw/stack/Linux64/meas_astrom/tickets.DM-3453-gbb2ad1f49c/python/lsst/meas/astrom/anetAstrometry.py"", line 295, in distortionContext sourceCat.table.defineCentroid(self.distortedName) File ""/ssd/rowen/lsstsw/stack/Linux64/afw/10.1-37-gaedf466/python/lsst/afw/table/tableLib.py"", line 8887, in defineCentroid return _tableLib.SourceTable_defineCentroid(self, *args) NotFoundError: File ""src/table/Schema.cc"", line 239, in lsst::afw::table::SchemaItem lsst::afw::table::detail::SchemaImpl::find(const string&) const [with T = double; std::string = std::basic_string] Field or subfield withname 'astrom_distorted_x' not found with type 'D'. {0} lsst::pex::exceptions::NotFoundError: 'Field or subfield withname 'astrom_distorted_x' not found with type 'D'.' {code} This is probably a result of DM-2939. The basic problem is that the distortion context in ANetAstrometryTask should not be run at that point in processing. [~price] suggests that a simple clean fix is to make the distortion context a no-op if the WCS already contains distortion, if that works. This is what I will try first.",1 +"DM-3456","08/13/2015 12:52:24","Fix problems with talking from webserv to qserv","Flask or sqlalchemy which are part of webserv are producing some extra queries that are confusing qserv. 
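One suspect (later confirmed on DM-3555) is SQLAlchemy's dialect initialization, which issues session queries on connect that Qserv's parser rejects, e.g.: {code:python}
# Emitted by the MySQL dialect when a connection is first established;
# Qserv cannot parse it.
cursor.execute('SELECT @@tx_isolation')
{code}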
So basically, at the moment even the simplest query run via webserv that is directed to qserv fails.",2 +"DM-3459","08/13/2015 18:23:07","make forced and SFM interfaces more consistent","From [~rowen]: {quote} SimpleMeasurementTask.run and ForcedMeasurementTask.run now both take a source catalog, but the two use the opposite order for the first two arguments (one has the catalog first, the other has the exposure first) {quote}",1 +"DM-3460","08/14/2015 11:38:24","applyApCorr mis-handles missing data","In ApplyApCorrTask.run the following lines do not behave as expected because get returns None if the data is missing, rather than raising an exception: {code} try: apCorrModel = apCorrMap.get(apCorrInfo.fluxName) apCorrSigmaModel = apCorrMap.get(apCorrInfo.fluxSigmaName) except Exception: {code}",1 +"DM-3462","08/14/2015 14:13:17","Make obs_decam handle raw data ","The current obs_decam expects instrument calibrated data from the community pipeline, i.e. it requires matching instcal (Instrument Calibrated), dqmask (the associated mask file), and wtmap (weight map) data from the same visit. This issue is to add functionality so that raw DECam images can be ingested into the registry and retrieved by the data butler. Practically, this will create new or expand existing sub-classes of CameraMapper and IngestTask. A brief summary of changes: - The unit test getRaw.py is updated and should pass, with DM-3196 - Working testdata_decam for the unit test is currently at lsst-dev /lsst8/testdata_decam and https://uofi.box.com/testdata-decam - DecamInstcalMapper is renamed to DecamMapper, to reflect that Butler can also get ""raw"" now besides ""instcal"". Please update _mapper in your data repositories. - To create a registry for raw data, run {code:java} ingestImagesDecam.py /path/to/repo --mode=link --filetype=""raw"" /path/*.fits.fz {code} - The default filetype is ""instcal"" for ingestImagesDecam.py, so previous use for instcal stays. ",13 +"DM-3463","08/14/2015 15:09:17","psfex lapack symbols may collide with built in lapack","On my Mac meas_extensions_psfex fails to build due to the numpy config test failing. ""import numpy"" fails with: {code} dlopen(/Users/rowen/LSST/lsstsw/anaconda/lib/python2.7/site-packages/numpy/linalg/lapack_lite.so, 2): can't resolve symbol __NSConcreteStackBlock in /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib because dependent dylib #1 could not be loaded in /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib {code} Our best guess (see discussion in Data Management 2015-08-14 at approx. 1:57 PM Pacific time) is that the special lapack functions in psfex are colliding with the lapack that anaconda uses. In case it helps I see this on OS X 10.9.5. I do not see it on lsst-dev.",2 +"DM-3468","08/14/2015 20:49:19","drawing text to ds9 fails if size or the font family is set","Commands like {code} ds9.dot('xxxx', 100, 100, size=3) ds9.dot('xxxx', 100, 120, fontFamily=""times"") {code} silently fail. The problem is that commands like {code} xpaset -p ds9 regions command '{text 100 100 # text=xxxx color=red font=""times 12""}' {code} fail; you need to say {{font=times 12 normal}}",1 +"DM-3470","08/15/2015 16:21:12","Install/deploy SUI web application at NCSA","For the summer 15 release, we will deploy a SUI web app at NCSA, accessible to the DM team. 
- work with NCSA to have a server setup - install necessary software packages - install SUI software - deploy the system and test ",5 +"DM-3480","08/17/2015 17:49:43","Design SQL APIs for async queries","Need a SQL API for: * submitting an async query (note that we should be able to specify where the results are going / what the format of the results is) * retrieving the status of an async query * retrieving the results of an async query * retrieving partial results of an async query while it is running ",2 +"DM-3481","08/18/2015 01:15:48","adapt sandbox-jenkins-demo to changes in jfryman/nginx 0.2.7","Changes in the way jfryman/nginx 0.2.7 handles TLS cert files since 0.2.6 have run afoul of SELinux permissions issues.",2 +"DM-3483","08/18/2015 10:03:56","Calibration transformation should not fail on negative flux","Before database ingest, measured source fluxes are converted to magnitudes as per DM-2305. The default behaviour of {{afw::image::Calib}} is to throw when a negative flux is encountered, which derails the whole transformation procedure. It is better to return a NaN.",0.5 +"DM-3488","08/18/2015 16:12:47","Debug problem with large results set","A query returning 2 billion rows causes problems for the czar - the czar is using nearly 16 GB of memory. Need to understand why RAM usage in the czar is correlated with result size.",1 +"DM-3490","08/18/2015 17:20:21","Quick-and-dirty n-way spatial matching","This issue will add limited N-way spatial matching of multiple catalogs with identical schemas, sufficient for measuring FY15 KPMs. It will be a simple wrapper on our existing 2-way matching code in afw, and will not be intended for long term use (as it won't be an efficient algorithm or an ideal interface).",2 +"DM-3491","08/18/2015 17:57:47","Update ip_diffim to use the new NO_DATA flag instead of EDGE","In ip_diffim some uses of EDGE were converted to or supplemented with NO_DATA, but others were not. This ticket handles the missing instances.",1 +"DM-3492","08/18/2015 22:20:19","Correct for distortion in matchOptimisticB astrometry matcher","matchOptimisticB does not correct for distortion, although an estimate of the distortion is available. We suspect that doing the matching on the celestial sphere might be ideal, but matching on a tangent plane has worked for HSC.",5 +"DM-3493","08/18/2015 22:36:13","Fix crosstalk following ds9 interface changes","crosstalk.py in obs_subaru uses ds9 without actually displaying anything, which causes trouble if display_ds9 is not set up.",1 +"DM-3506","08/19/2015 12:46:55","W16 Support Dynamic CSS Metadata in Czar","Czar needs to support dynamic CSS metadata. 
This epic involves reworking Facade and related code so that czar can have up to date CSS metadata (per query) instead of relying on static snapshots",40 +"DM-3522","08/21/2015 13:07:41","Releasing un-acquired resources bug","Running a mix of queries: 75 low volume and 10 high volume that include near neighbor failed at some point with {quote} terminate called after throwing an instance of 'lsst::qserv::Bug' what(): ChunkResource ChunkEntry::release: Error releasing un-acquired resource {quote} Stack trace: {quote} #0 0x00007fb6b0cce5e9 in raise () from /lib64/libc.so.6 Missing separate debuginfos, use: debuginfo-install expat-2.1.0-8.el7.x86_64 glibc-2.17-78.el7.x86_64 keyutils-libs-1.5.8-3.el7.x86_64 krb5-libs-1.12.2-14.el7.x86_64 libcom_err-1.42.9-7.el7.x86_64 libgcc-4.8.3-9.el7.x86_64 libicu-50.1.2-11.el7.x86_64 libselinux-2.2.2-6.el7.x86_64 libstdc++-4.8.3-9.el7.x86_64 nss-softokn-freebl-3.16.2.3-9.el7.x86_64 openssl-libs-1.0.1e-42.el7_1.9.x86_64 pcre-8.32-14.el7.x86_64 xz-libs-5.1.2-9alpha.el7.x86_64 zlib-1.2.7-13.el7.x86_64 (gdb) where #0 0x00007fb6b0cce5e9 in raise () from /lib64/libc.so.6 #1 0x00007fb6b0ccfcf8 in abort () from /lib64/libc.so.6 #2 0x00007fb6b15d29b5 in __gnu_cxx::__verbose_terminate_handler() () from /lib64/libstdc++.so.6 #3 0x00007fb6b15d0926 in ?? () from /lib64/libstdc++.so.6 #4 0x00007fb6b15cf8e9 in ?? () from /lib64/libstdc++.so.6 #5 0x00007fb6b15d0554 in __gxx_personality_v0 () from /lib64/libstdc++.so.6 #6 0x00007fb6b1069913 in ?? () from /lib64/libgcc_s.so.1 #7 0x00007fb6b1069e47 in _Unwind_Resume () from /lib64/libgcc_s.so.1 #8 0x00007fb6ab554247 in lsst::qserv::wdb::ChunkResourceMgr::Impl::release (this=0x21d1cc0, i=...) at build/wdb/ChunkResource.cc:398 #9 0x00007fb6ab552696 in lsst::qserv::wdb::ChunkResource::~ChunkResource (this=0x7fb68a5f9b70, __in_chrg=) at build/wdb/ChunkResource.cc:131 #10 0x00007fb6ab560f0f in lsst::qserv::wdb::QueryAction::Impl::_dispatchChannel (this=0x7fb65848c4d0) at build/wdb/QueryAction.cc:392 #11 0x00007fb6ab55f5ab in lsst::qserv::wdb::QueryAction::Impl::act (this=0x7fb65848c4d0) at build/wdb/QueryAction.cc:187 #12 0x00007fb6ab562084 in lsst::qserv::wdb::QueryAction::operator() (this=0x7fb658050548) at build/wdb/QueryAction.cc:450 #13 0x00007fb6ab544f46 in lsst::qserv::wcontrol::ForemanImpl::Runner::operator() (this=0x7fb67400fa20) at build/wcontrol/Foreman.cc:302 #14 0x00007fb6ab551cf0 in std::_Bind_simple::_M_invoke<>(std::_Index_tuple<>) ( this=0x7fb67400fa20) at /usr/include/c++/4.8.2/functional:1732 #15 0x00007fb6ab551a8b in std::_Bind_simple::operator()() (this=0x7fb67400fa20) at /usr/include/c++/4.8.2/functional:1720 {quote} Tail of log file from xrootd log: {quote} 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG GroupSched (build/wsched/GroupScheduler.cc:139) - _getNextTasks(1)>->-> 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG GroupSched (build/wsched/GroupScheduler.cc:151) - Returning 1 to launch 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG GroupSched (build/wsched/GroupScheduler.cc:154) - _getNextTasks <<<<< 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG ScanSched (build/wsched/ScanScheduler.cc:172) - _getNextTasks(29)>->-> 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG ScanSched (build/wsched/ChunkDisk.cc:199) - ChunkDisk busyness: yes 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG ScanSched (build/wsched/ChunkDisk.cc:171) - ChunkDisk getNext: current= (scan=10436, cached=8360,8259,) candidate=10301 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG ScanSched (build/wsched/ChunkDisk.cc:184) - ChunkDisk denying task 0821 19:08:58.530 [0x7fb68a6fb700] DEBUG 
ScanSched (build/wsched/ScanScheduler.cc:196) - _getNextTasks <<<<< 0821 19:08:58.531 [0x7fb68a6fb700] INFO root (build/xrdsvc/SsiSession.cc:120) - Enqueued TaskMsg for Resource(/chk/LSST/2732) in 0.001016 seconds 0821 19:08:58.531 [0x7fb6895f8700] DEBUG Foreman (build/wcontrol/Foreman.cc:175) - Registered runner 0x7fb66c141ab0 0821 19:08:58.531 [0x7fb6895f8700] DEBUG Foreman (build/wcontrol/Foreman.cc:209) - Started task Task: msg: session=434445 chunk=2732 db=LSST entry time= frag: q=SELECT o.deepSourceId,o.ra,o.decl,s.coord_ra,s.coord_decl,s.parent FROM LSST.Object_2732 AS o,LSST.Source_2732 AS s WHERE scisql_s2PtInBox(o.ra,o.decl,48.482655,-54.274507,48.555903,-54.196952)=1 AND scisql_s2PtInBox(s.coord_ra,s.coord_decl,48.482655,-54.274507,48.555903,-54.196952)=1 AND o.deepSourceId=s.objectId, sc= rt=r_4344458c9456ede5cbe0b5f42e1a1571d5dd73_2732_0 0821 19:08:58.531 [0x7fb6895f8700] INFO Foreman (build/wcontrol/Foreman.cc:296) - Runner running Task: msg: session=434445 chunk=2732 db=LSST entry time= frag: q=SELECT o.deepSourceId,o.ra,o.decl,s.coord_ra,s.coord_decl,s.parent FROM LSST.Object_2732 AS o,LSST.Source_2732 AS s WHERE scisql_s2PtInBox(o.ra,o.decl,48.482655,-54.274507,48.555903,-54.196952)=1 AND scisql_s2PtInBox(s.coord_ra,s.coord_decl,48.482655,-54.274507,48.555903,-54.196952)=1 AND o.deepSourceId=s.objectId, sc= rt=r_4344458c9456ede5cbe0b5f42e1a1571d5dd73_2732_0 0821 19:08:58.531 [0x7fb6895f8700] INFO Foreman (build/wdb/QueryAction.cc:177) - Exec in flight for Db = q_fd51ad249f62fb765e173d7b3cae5d94 0821 19:08:58.531 [0x7fb6895f8700] WARN Foreman (build/wdb/QueryAction.cc:109) - QueryAction overriding dbName with LSST 0821 19:08:58.718 [0x7fb6a0d8e700] INFO root (build/wdb/QueryAction.cc:261) - &&& _fillRows size=106 0821 19:08:58.718 [0x7fb6a0d8e700] INFO root (build/wdb/QueryAction.cc:261) - &&& _fillRows size=210 0821 19:08:58.718 [0x7fb6a0d8e700] INFO root (build/wdb/QueryAction.cc:261) - &&& _fillRows size=316 ...(thousands of _fillRows lines) terminate called after throwing an instance of 'lsst::qserv::Bug' what(): ChunkResource ChunkEntry::release: Error releasing un-acquired resource {quote} ",3 +"DM-3544","08/25/2015 08:57:38","Cleanup of initial astrometry improvements","The astrometry improvements are working, but some cleanup would be good to remove dependencies on A.net and to provide default reference catalog loaders.",20 +"DM-3546","08/25/2015 11:37:05","Move LDM-151 to Sphinx/Read the Docs","Move the LDM-151 (DM applications design document) to restructuredText (built with Sphinx) and published automatically via readthedocs.org. See discussion at http://community.lsst.org/t/requesting-comments-for-design-documentation-format-for-dm/132?u=jsick This is an experiment.",1 +"DM-3555","08/25/2015 22:47:09","Ignore ""SELECT @@tx_isolation"" queries","Looks like one of the queries we registered in webserv is: cursor.execute('SELECT @@tx_isolation') and that is bound to confuse Qserv. Need to suppress it at mysql proxy level.",1 +"DM-3558","08/26/2015 00:21:36","Experiment with Jupyter widget technology and Firefly Tools","Based on DM-2047 work to date, investigate the feasibility of using the Jupyter widget interface to wrap up Firefly tools.",8 +"DM-3587","08/27/2015 11:02:40","Firefly infrastructure improvement to support new functions (W16)","This epic will capture the necessary changes of Firefly infrastructure to support new functions needed. 
It does not include the changes caused by the conversion from the GWT infrastructure to a pure JavaScript-based system using React and the FLUX platform. ",40 +"DM-3593","08/27/2015 14:55:12","Firefly support for pipeline visualization needs (W16)","The data products pipeline needs visualization capabilities for display. Firefly needs to have new capabilities to support it. ",40 +"DM-3596","08/27/2015 15:34:33","More bug fixes in Firefly JS code ","Bug fixes and improvements for Firefly JS code ",40 +"DM-3608","08/27/2015 18:02:57","provide detailed information needed to DAX meta API","SUIT needs certain specific information through the DAX meta service when searching for metadata. For example: what kind of table is it, does it have a spatial index to search by position, which set of (ra, dec) columns is the primary one, etc.?",1 +"DM-3609","08/27/2015 18:06:42","The Alert subscription system requirement gathering (F16)","Solidify the requirements for the alert subscription system. Nov. 2, 2016 XW After much discussion with the AP team and the re-plan exercise, the requirements for alert subscription have been identified as the following: *use the API that the AP team will provide to* # provide a UI for users to specify the filters on alerts of interest and the destination the alerts should be sent to # save the specification in a DB # provide a UI to allow users to modify the filters and destinations of alerts # possibly annotate the alerts and allow users to access the annotations This will involve SLAC for the DB, NCSA for user management ",8 +"DM-3610","08/27/2015 18:11:05","CCB review and posting of final updated document","Carry out the CCB review, respond to questions, support final implementation of the updated document.",2 +"DM-3611","08/27/2015 18:18:02","Prepare for Winter 2016 work on LSE-68","Use a session at the LSST 2015 all-hands meeting to prepare for LSE-68 work in the Winter 2016 cycle.",3 +"DM-3615","08/27/2015 18:33:15","expose region overlay on image function through JavaScript API","Expose the region-overlay-on-image function through the JavaScript API.",5 +"DM-3616","08/27/2015 18:35:16","Expose image XY readout at cursor point function in JavaScript API","Expose the image XY readout at cursor point function in the JavaScript API.",2 +"DM-3618","08/27/2015 21:31:16","Fix bug related to restarting xrootd in wmgr","Changes from DM-2930 are failing integration tests because wmgr is restarting xrootd and now we need to also restart mysqld if the xrootd pid changes.",1 +"DM-3630","08/28/2015 14:51:26","Change root to config in config override files","Implement RFC-62 by using {{config}} rather than {{root}} in config override files for the root of the config. Note that I propose not modifying astrometry_net_data configs because those are numerous and hidden. They have their own special loader in LoadAstrometryNetObjectsTask._readIndexFiles which could easily be updated later, if desired. An obvious time to make such a transition would be when overhauling the way this data is unpersisted.",2 +"DM-3638","08/28/2015 16:13:14","RangeField mis-handles max < min","RangeField contains the following bit of code to handle the case that max < min: {code} if min is not None and max is not None and min > max: swap(min, max) {code} This is broken because there is no swap function, and if there were, it could not work in place like this. However, rather than replace this with the standard {{min, max = max, min}} I suggest we raise an exception. 
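A sketch of the suggested replacement: {code:python}
if min is not None and max is not None and min > max:
    raise ValueError('min (%s) must not be greater than max (%s)' % (min, max))
{code}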
If max < min then this probably indicates some kind of error or sloppiness that should not be silently ignored. If we insist on swapping the values then at least we should print a warning. The fact that this bug has never been reported strongly suggests that we never do set min > max and thus that an exception will be fine.",1 +"DM-3639","08/28/2015 16:24:31","OCS-CCS-DAQ-DM teleconference, April 2015","Prepare for and attend a half-day teleconference on OCS issues.",2 +"DM-3641","08/28/2015 17:06:54","Firefly server side extensions using DM stack (F16)","Design and implement a control system to extend Firefly server side capabilities using tasks in the DM stack. This will make it easier to use the DM stack for customized data processing. ",40 +"DM-3642","08/28/2015 17:31:38","Support OCS revision of LSE-70, LSE-209","Support the OCS efforts to update LSE-70 and create a new associated document, LSE-209. Getting current versions of these under change control will allow us to complete a round of work on LSE-72.",20 +"DM-3646","08/28/2015 18:03:55","LSE-72: OCS-CCS-DAQ-DM workshop, July 2015","Work associated with Workshop IV in the series, held at NCSA July 8-10, 2015.",2 +"DM-3648","08/28/2015 18:27:24","SUIT design document outline","SUI/T design document outline. ",2 +"DM-3650","08/28/2015 18:32:24","on-going support to Camera team in UIUC","Attend the UIUC weekly meeting and give support as needed. ",2 +"DM-3651","08/28/2015 18:33:15","MakeDiscreteSkyMapRunner.__call__ mis-handles returning results","{{MakeDiscreteSkyMapRunner.\_\_call\_\_}} will fail if {{self.doReturnResults}} is {{True}} because it tries to reference undefined variables. This is at least approximately a copy of a problem that was fixed in pipe_base {{TaskRunner}}. {{MakeDiscreteSkyMapRunner.\_\_call\_\_}} should be fixed in a similar way, and (like {{TaskRunner}}) changed to return a pipe_base {{Struct}}. ",2 +"DM-3652","08/28/2015 18:39:31","SUIT design document outline","Work with Gregory on the SUIT design document outline 1. Requirements flow down, making sure that we design the system to satisfy the current requirements. 2. Use case collection: at least one typical use case in each major science theme 3. Levels of different users ** novice: treats the web portal as an archive to get some information, doesn't know much about LSST ** novice expert: has some ideas of what special functions they would like, has some knowledge of LSST data ** domain expert: knows LSST data very well and wants some special functions ready to use ** savvy expert: knows LSST data very well and likes to use APIs in their own programming 4. functions for all different levels of users 5. system design ** system diagram ** details of the different parts *** Firefly server *** Firefly client *** Firefly server extension *** Firefly JavaScript API *** Firefly Python API *** Firefly Python API, Jupyter notebook, and other Python applications *** workspace and level3 data *** SUI web portal sketch, workflow ** dependency on other capabilities of other institutes 6. development and test plan, timeline 7. 
deployment plan ",1 +"DM-3653","08/28/2015 18:40:52","SUIT design document outline","work with John Rector on SUIT design document outline",1 +"DM-3656","08/30/2015 00:31:52","Data loader doesn't work for match tables","qserv-data-loader.py fails to load match tables: - it does not invoke the correct partitioner executable for them - not all CSS parameters required for match tables are passed down to the CSS update code",1 +"DM-3657","08/31/2015 04:04:21","Create change request for LSE-75","Create a change request for LSE-75, the TCS - to - DM ICD.",2 +"DM-3658","08/31/2015 04:14:10","Discussions on LSE-75 with Telescope & Site personnel","Pursue interactions with Telescope and Site personnel regarding LSE-75, and in particular the issues surrounding calibration data products for the wavefront and guider data analysis pipelines. Covers work through the end of August 2015.",3 +"DM-3659","08/31/2015 08:50:41","Initial discussions with Patrick Ingraham","This story is a catch-all for preliminary conversations about LSE-140 with the new Calibration Instrumentation Scientist, Patrick Ingraham.",1 +"DM-3667","08/31/2015 15:38:57","PSFEX does not build if PLplot is installed","During the configure phase PSFEX checks for the presence of PLplot. If PLplot is found then the build fails (at least on a Mac using homebrew): {code} /bin/sh ../libtool --tag=CC --mode=link clang -g -O2 -I/Users/timj/work/lsstsw/src/psfex/lapack_functions/include -o psfex check.o context.o cplot.o diagnostic.o fft.o field.o field_utils.o fitswcs.o homo.o main.o makeit.o makeit2.o misc.o pca.o prefs.o psf.o sample.o sample_utils.o vignet.o wcs_utils.o xml.o ./fits/libfits.a ./levmar/liblevmar.a ./wcs/libwcs_c.a -L/Users/timj/work/lsstsw/stack/DarwinX86/fftw/3.3.3-1-g8fdba61+da39a3ee5e/lib -lfftw3f -lm -L/Users/timj/work/lsstsw/src/psfex/lapack_functions/lib -llapackstub -lf2c -lm -lplplotd libtool: link: clang -g -O2 -I/Users/timj/work/lsstsw/src/psfex/lapack_functions/include -o psfex check.o context.o cplot.o diagnostic.o fft.o field.o field_utils.o fitswcs.o homo.o main.o makeit.o makeit2.o misc.o pca.o prefs.o psf.o sample.o sample_utils.o vignet.o wcs_utils.o xml.o ./fits/libfits.a ./levmar/liblevmar.a ./wcs/libwcs_c.a -L/Users/timj/work/lsstsw/stack/DarwinX86/fftw/3.3.3-1-g8fdba61+da39a3ee5e/lib /Users/timj/work/lsstsw/stack/DarwinX86/fftw/3.3.3-1-g8fdba61+da39a3ee5e/lib/libfftw3f.dylib -L/Users/timj/work/lsstsw/src/psfex/lapack_functions/lib -llapackstub -lf2c -lm -lplplotd Undefined symbols for architecture x86_64: ""_plwid"", referenced from: _cplot_drawloccoordgrid in cplot.o _cplot_fwhm in cplot.o _cplot_ellipticity in cplot.o _cplot_moffatresi in cplot.o _cplot_asymresi in cplot.o _cplot_counts in cplot.o _cplot_countfrac in cplot.o ... ld: symbol(s) not found for architecture x86_64 {code} This particular error is caused by PSFEX using a deprecated PLplot API ({{plwid}}) that is not enabled by default and whose name is not translated to {{c_plwid}}. This PLplot change occurred in version 5.9.10 released in 2013. I assume upstream PSFEX has a fix for this. Given that LSST does not need the PLplot functionality I think the simplest fix may well be to disable the test for PLplot in our version. It seems likely that there will be a reasonable number of systems ""in the wild"" who will have PLplot installed so I'm inclined to think that this should be a blocker for the v11.0 release. 
If we are lucky, people will have all upgraded their PLplot installs to v5.11.0 because in that version PLplot changed the name of the library from {{libplplotd}} to {{libplplot}} and PSFEX has hard-wired the former rather than using pkg-config. This results in configure not finding PLplot. I don't think this eventuality is likely though. ",0.5 +"DM-3670","08/31/2015 19:19:07","obs_test needs to override map_camera and std_camera","The Butler can't get a camera unless the map_camera and std_camera are defined correctly. In most cases the camera can be built by the map_camera method. In the case of obs_test, the camera is built in the constructor of the Mapper, so std_camera should just return the camera attribute.",1 +"DM-3675","08/31/2015 23:42:31","Resourcing Verification runs"," Identify required resources for Verification runs and communicate them to NCSA. ",2 +"DM-3678","09/01/2015 14:05:01","HSC backport: Standalone updates to star object selection","This involves pulling over the following standalone (i.e. non-ticket) HSC commits: [Updated star selection algorithm.|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/071fcadc016908a10583c746f0a8e79df2a45ead] [Appropriate config parameter for a unit test of testPsfDetermination.py.|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/e73c5e447ac0b8a71926d3e78fec30aad4beee91] [Remove HSC specific codes.|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/15bb812578531766199e9a1ee41cc707fb3d9873] (Note, the above reverts some unwanted camera-specific clauses added in the first commit. May just squash them to only add the desired features) [ObjectSizeStarSelector: push non-fatal errors to DEBUG level|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/44f75bc60b41c5f77b323a8d9981048ef7e5f3c4] [We don't use focal plane coordinates anywhere, and detector may be None|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/4413db4610e4793727e591f395f5ad8cd0cb6030] [Fixed axis labels|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/67efacaccf8346fdfa1b450617aebabddb2b7ec0] [Improved PSF debugging plots|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/b1bc91ed1538607eb90e070881a82498fd551909] [Worked on star selector|https://github.com/HyperSuprime-Cam/meas_algorithms/commit/6b36f4d757187d30142a7e026754a07ffeb8dea2]",1 +"DM-3679","09/01/2015 14:40:40","Allow building/publishing components off branches other than master","Support of xrootd within the stack is currently complicated by the fact that qserv depends on features that are not available on upstream master (only available on an upstream non-master branch). Since we can currently only publish packages from master, this means that our lsst fork of xrootd cannot be a ""pure"" fork -- we end up merging/rebasing from an upstream branch, then force-pushing the downstream master. Upstream and downstream xrootd repos thus have completely different branch topologies, labels, etc., and the history of master in the lsst fork is being continually rewritten to carry local patches forward. The processes of both adopting upstream changes into the lsst fork and pushing lsst changes back upstream are cumbersome, confusing, and labor intensive. It is proposed that we extend our tools to allow publishing components from branches other than master. This would allow us to have xrootd, for example, be a ""pure"" fork of upstream -- we could then create our own branch based off any upstream branch, carry our downstream patches there, and release off of that. 
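A rough sketch of the per-package default-ref lookup this implies (a hypothetical helper; the real lsst_build implementation may differ):
{code}
# Hypothetical sketch: resolve the ref to build for a product, falling back
# to master when repos.yaml does not override it.
DEFAULT_REF = 'master'

def ref_for(product, repos):
    entry = repos.get(product, {})
    if isinstance(entry, dict):
        return entry.get('ref', DEFAULT_REF)
    return DEFAULT_REF  # plain-URL entries carry no ref override

repos = {'xrootd': {'url': 'git://example/xrootd.git', 'ref': 'legacy/master'}}
assert ref_for('xrootd', repos) == 'legacy/master'
assert ref_for('afw', repos) == DEFAULT_REF
{code}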
This functionality could be used similarly for any of our current ""t&p"" components where it would be convenient to track the upstream repo directly and/or carry changes in git instead of in an agglomerated patch file (e.g. when we might want to update frequently and/or contribute general purpose changes back upstream regularly with pr's, etc.)",2 +"DM-3684","09/01/2015 16:53:40","Release engineering Part Two","This epic covers testing and co-ordination work associated with making engineering and official releases, and code to support them. [FE at 70%, JH at 20%, JS at 10%]",40 +"DM-3686","09/01/2015 18:04:48","Fix PATH and compiler version detection in qserv scons","In recently merged DM-3662 compiler version testing was done using OS tools with regular $PATH. This is inconsistent with other scons tools which reset PATH when executing actions. We want to do two things: - propagate PATH to the command execution - Use scons tools to run ""$CXX --version"" instead of OS tools to keep things consistent",1 +"DM-3689","09/02/2015 11:42:37","Slack notification of discourse activity","It would be nice to have a hipchat channel with notifications of discourse activity. As a power up, perhaps new topics under certain categories, I'm thinking specifically of DM Notifications, could generate posts to select general HC channels -- similar to how RFC notifications are currently handled. A quick google search turns up this plugin for integration: https://github.com/binaryage/discourse-hipchat-plugin",0.5 +"DM-3691","09/02/2015 14:01:10","CalibrateTask has outdated, incorrect code for handling aperture corrections","The CFHT-specific CalibrateTask tries to apply aperture correction once just after measuring it (which is too early) and again later, at the right time. The error probably has no effect on the final results, but it is confusing and needlessly divergent from the standard CalibrateTask. The required changes are small. I plan to test by running [~boutigny]'s CFHT demo.",1 +"DM-3692","09/02/2015 16:18:57","HSC backport: Allow for some fraction of PSF Candidates to be reserved from the fitting","This is a port of the changesets from [HSC-966|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-966]. It provides the ability to reserve some fraction of PSF candidates from the PSF fitting in order to check for overfitting and do cross validation.",1 +"DM-3693","09/02/2015 16:25:10","HSC backport: allow photometric and astrometric calibrations to be required","This is a port of the standalone changesets: [calibrate: make astrometry failures non-fatal|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/e9db5c0dcdca20e8f7ba71f24f8b797e71699352] [fixup! 
calibrate: make astrometry failures non-fatal|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/c2d89396923f9d589822c043ed8753647e70f3f6] (the above is a fixup, so will likely be squashed) [make failure to match sources non-fatal|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/cf5724b852937cfcef1b71b7a372552011fda670] [calibrate: restore original Wcs after initial astrometry solution|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/ab6cb9e206d0456dc764c5ef78ac80ece937c610] [move CalibrateTask from ProcessImageTask into ProcessCcdTask|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/08a8ec029dd52ac55e47b707a6905df061a40506] [processCoadd: set detection to use the declared variances|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/9e8563fd8d630dad967786387b1f27b6bc7ee039] [adapt to removal of CalibrateTask from ProcessImageTask in pipe_tasks|https://github.com/HyperSuprime-Cam/obs_subaru/commit/52733a7ab1731a15cbb93151851f57cec276f928] and HSC tickets: [HSC-1085: background not saved in processCcd|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1085] and [HSC-1086: psf - catalog scatter is very large in some coadds|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1086]",2 +"DM-3694","09/03/2015 02:14:05","Decrease buildbot frequency","Buildbot frequency is now down to two builds, one at 19:42 machine time (NCSA) and one at 1:42. This is so that people who need buildbot runs in order to eups publish no longer have to wait for a CI build, since builds are now done on Jenkins at https://ci.lsst.codes. ",1 +"DM-3698","09/03/2015 09:42:03","Replace --trace with --loglevel in pipe_base ArgumentParser","Replace the --trace argument with an enhanced version of --loglevel that supports named values and numeric log levels (which are the negative of trace levels). This simplifies the interface for users and potentially reduces the log level/trace level confusion, though that won't fully happen until we finish replacing use of pex_logging Trace and Debug with Log. This work was already done as part of DM-3532; it just needs to be copied with minor changes (since there are no named trace levels in pex_logging).",1 +"DM-3702","09/03/2015 12:13:57","Assemble the report on DCR","Everything learned through the literature search and project-wide meeting should be synthesized into a single readable report that details the expected effects of DCR on difference imaging as well as possible mitigation techniques. This may involve some preliminary analysis work to measure the effectiveness of various techniques. Note that I expect this to be two weeks of work for two people, thus the 40 story points. I don't know how to assign a story to two people.",40 +"DM-3705","09/03/2015 14:58:49","Update ndarray to use current numpy API","When we build software that uses SWIG and ndarray we get {{warning ""Using deprecated NumPy API}}. Please update ndarray so we are using the current API. If in the interim you want to suppress the warnings, I suggest issuing a separate ticket for that. I have mixed feelings about suppressing the warnings. On the one hand, false warnings do make it harder to spot real problems. 
On the other hand, these warnings are not very numerous (building afw only results in 8 of them) and so are fairly easy to ignore.",1 +"DM-3707","09/03/2015 19:43:59","qserv scons - do not copy files to variant_dir","Some people are not happy with our current scons setup, which copies source files from source directories to variant_dir; this makes it harder to trace errors or debug code using tools like Eclipse. It would be nice to get rid of the extra copy, but we still want to have a separate build directory (variant_dir). It should be simple enough, I think, but will need some testing of course.",2 +"DM-3749","09/04/2015 07:51:03","Scons build of lapack_functions in PSFex fails if SCONSFLAGS are set","The scons build system is unaware of extra flags which may be set in the SCONSFLAGS environment variable and used by sconsUtils. This will cause the build to fail. The package needs to behave properly and build in the presence of these flags.",2 +"DM-3763","09/04/2015 13:37:23","Be friendlier to New Users on Discourse","Please adjust New Users thresholds so that the following common use case on Discourse will work: 1. I have a question/thread I'm really interested in so I join. 2. I post answers, questions, and links. I'm probably going to be heavily involved in this thread (it's why I bothered to create an account, after all). The current defaults allow me only 3 replies, posting of 2 URLs, and no attachments. I don't know what the exact values should be, but the current defaults don't seem to work for what would seem to me to be a pretty common use case.",0 +"DM-3768","09/04/2015 18:17:16","Resolve the issues found in the S15 end-to-end system exercise","There are a few items we need to take care of to finish the end-to-end system for S15. ",8 +"DM-3769","09/04/2015 18:22:26","access the database created and populated for Bremerton end-to-end system","Collect the information for the tables populated for the Bremerton end-to-end exercise. Use them in SUI/T so we can access them using the DAX API. ",2 +"DM-3770","09/04/2015 18:24:23","build the SUI system on NCSA to use the right database and tables","Due to the changes to the database and tables, the system has to be rebuilt.",1 +"DM-3771","09/04/2015 18:28:05","Resolve the issues accessing the newly populated tables","There are several issues that need to be resolved for the system to work properly. ",5 +"DM-3772","09/07/2015 02:27:31","Fix compiler detection for non-default gcc/g++ compiler","{{scons CXX=g+\+-4.4}} launches {{g\+\+-4.4 --version}} which returns {{g++-4.4 (Debian 4.4.7-2) 4.4.7}}. Nevertheless the {{-4.4}} is not supported by the Qserv compiler detection tool. Support will be added here.",0.5 +"DM-3773","09/08/2015 08:20:04","add RUNID option to EventAppender","A RUNID needs to be added as an option to EventAppender to allow event logging selectors to receive only events for a particular run.",3 +"DM-3774","09/08/2015 12:21:47","lsst_build's default ref from repos.yaml support is broken when building multiple packages","A problem with the default ref in {{repos.yaml}} support implemented in DM-3679 was discovered last Friday, shortly after deploying this feature to the production CI systems. The default ref for {{xrootd}} was changed/overridden in {{repos.yaml}} to {{legacy/master}}. This worked as expected (and as was tested) when setting {{xrootd}} as the sole {{lsstswBuild.sh}} product or when running {{rebuild}} by hand. 
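(As an aside on the SCONSFLAGS problem in DM-3749 above, one possible shape of the fix is to scrub the variable before invoking the nested build; this is a sketch only, and the actual SConscript layout may differ:)
{code}
import os
import subprocess

# Drop flags meant for the outer sconsUtils build so the nested
# lapack_functions build does not choke on them (path is hypothetical).
env = os.environ.copy()
env.pop('SCONSFLAGS', None)
subprocess.check_call(['scons', '-C', 'lapack_functions'], env=env)
{code}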
However, when building any package that pulled in {{xrootd}} as a recursive dependency, the {{master}} branch was being used (this case had not been manually tested).",1 +"DM-3775","09/08/2015 13:09:19","HSC backport: updates to tract and patch finding","This is a port of the following HSC updates to how tracts and patches are found and listed given a set of coordinates. These are all standalone commits (i.e. not associated with a ticket): [Add findTract() and findTractPatchList() in ringsSkyMap.|https://github.com/HyperSuprime-Cam/skymap/commit/761e915dde25ce8ed5622c2d84b83793e9580fd7] [move RingsSkyMap.findTractPatchList to BaseSkyMap.findClosestTractPatchlist|https://github.com/HyperSuprime-Cam/skymap/commit/56476142060bdb7d8c7fb59eacc383f0e0d5c85b] [Small bug fix for RingsSkyMap.findTract().|https://github.com/HyperSuprime-Cam/skymap/commit/f202a7780ebb89166f03479d7447ace1555027c1] [Add fast findTractPatchList() in RingsSkyMap.|https://github.com/HyperSuprime-Cam/skymap/commit/7e49c358501f95ce4c0e1aa8f48103a24391fc22] [Fixed the problems regarding poles and RA wrap.|https://github.com/HyperSuprime-Cam/skymap/commit/841b0c9eda7462a7a4f182b7971d5e8e81478bfe] [Add spaces around '+' and '-' to match LSST standard coding style.|https://github.com/HyperSuprime-Cam/skymap/commit/f7e2f036494afe382e653194c82bb15728c60fc3]",1 +"DM-3778","09/08/2015 16:51:37","Fix compiler warns in protobuf clients","Google protobufs 2.6.1 includes a few unnecessary semicolons in some of its supplied header files; these generate a lot of compiler warnings when compiling client packages. Proposed fix is to add a patch to our eups t&p protobufs package to remove the offending semicolons.",1 +"DM-3779","09/08/2015 17:04:34","clean up gcc and eclipse code analyzer warns","We've been ignoring some accumulating warns in the qserv build for some time now. Now that it is possible to develop qserv in eclipse, it would be useful to address warns and analyzer issues so that we can start to notice when new ones pop up.",1 +"DM-3780","09/08/2015 17:13:13","Rationalize lsst/xrootd repo and maintenance procedures","The procedure for pulling/pushing xrootd changes from/to the upstream official xrootd repo is cumbersome, confusing, and error-prone. Buildbot now has support for releasing packages from branches other than master. Given this, we can now reasonably replace our lsst/xrootd repo with a fresh genuine fork (shared history) of upstream, then carry our lsst-specific work forward on a dev-branch. This will make it much easier to track and contribute to the xrootd project moving forward. Existing legacy branches and tags are to be migrated to the fresh fork, so historical builds will not be broken.",1 +"DM-3790","09/09/2015 11:20:18","Top level product for obs_decam ","We have an action to allow people to pull obs_decam and a working stack without including obs_decam in lsst_distrib. And we want to CI it. The strawman plan is to make a new TLP for decam. SQuaRE will discuss. ",0 +"DM-3792","09/09/2015 13:06:13","obs_test data mis-assembled","obs_test images are mis-assembled and need to be regenerated. 
This may affect some existing unit tests that rely on the data.",2 +"DM-3797","09/10/2015 11:56:38","Enable SSL to community.lsst.org","Enable SSL (https) for the Discourse site at community.lsst.org",1 +"DM-3798","09/10/2015 12:22:06","Update flag names and config override files to current conventions","The {{deblend.masked}} and {{deblend.blendedness}} flag names in {{meas_deblender}} need to be updated to use underscores instead of periods. Various flag names in the {{examples}} scripts also need updating to the underscore and camelCase format. A search for these flags throughout the database revealed a number of config files that need updating to current conventions. These are also included here.",0.5 +"DM-3800","09/10/2015 15:31:45","testProcessCcd.py computes values that are too different between MacOS and linux","tests/testProcessCcd.py runs processCcd on visit 1 of obs_test's data repository. The result on MacOS is surprisingly different than on linux in at least one case: psfShape.getIxx() computes 2.71 on MacOS X and 2.65 on linux. Iyy and Ixy are likely different. It's worth checking all other computed values, as well. These differences likely indicate that something is wrong, e.g. in obs_test, processCcd, or the way the test runs processCcd. This showed up as part of fixing DM-3792, but it is not clear if the changes on DM-3792 actually caused or increased the difference between MacOS and linux, or if the difference was always too large, but was masked by an intentionally generous tolerance in the unit test.",2 +"DM-3803","09/10/2015 15:55:51","Fix Qserv compiler warnings with clang","Qserv triggers numerous warnings with clang on OS X. Full details are in the attached ticket, here we summarize the distinct warnings classes: h5. Protobuf {code} /Users/timj/work/lsstsw/stack/DarwinX86/protobuf/2.6.1+fbf04ba888/include/google/protobuf/unknown_field_set.h:214:13: warning: anonymous types declared in an anonymous union are an extension [-Wnested-anon-types] mutable union { ^ {code} h5. Qserv {code} In file included from core/modules/sql/statement.cc:32: core/modules/sql/Schema.h:74:1: warning: 'Schema' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct Schema { ^ core/modules/sql/statement.h:35:1: note: did you mean struct here? class Schema; // Forward ^~~~~ struct {code} {code} core/modules/proto/WorkerResponse.h:34:1: warning: 'WorkerResponse' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct WorkerResponse { ^ core/modules/ccontrol/MergingRequester.h:38:3: note: did you mean struct here? class WorkerResponse; ^~~~~ struct {code} {code} In file included from core/modules/qana/QueryMapping.cc:46: core/modules/qproc/ChunkSpec.h:51:1: warning: 'ChunkSpec' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct ChunkSpec { ^ core/modules/qana/QueryMapping.h:44:5: note: did you mean struct here? class ChunkSpec; ^~~~~ struct {code} {code} core/modules/qana/TableInfo.h:186:1: warning: 'DirTableInfo' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct DirTableInfo : TableInfo { ^ core/modules/qana/TableInfo.h:86:1: note: did you mean struct here? class DirTableInfo; ^~~~~ struct core/modules/qana/TableInfo.h:221:1: warning: 'ChildTableInfo' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct ChildTableInfo : TableInfo { ^ core/modules/qana/TableInfo.h:87:1: note: did you mean struct here? 
class ChildTableInfo; ^~~~~ struct core/modules/qana/TableInfo.h:260:1: warning: 'MatchTableInfo' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct MatchTableInfo : TableInfo { ^ core/modules/qana/TableInfo.h:88:1: note: did you mean struct here? class MatchTableInfo; ^~~~~ struct In file included from core/modules/qana/ColumnVertexMap.cc:36: core/modules/qana/RelationGraph.h:513:1: warning: struct 'Vertex' was previously declared as a class [-Wmismatched-tags] struct Vertex; ^ core/modules/qana/ColumnVertexMap.h:44:7: note: previous use is here class Vertex; ^ In file included from core/modules/qana/ColumnVertexMap.cc:36: core/modules/qana/RelationGraph.h:547:1: warning: 'Vertex' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct Vertex { ^ core/modules/qana/ColumnVertexMap.h:44:1: note: did you mean struct here? class Vertex; ^~~~~ struct {code} {code} core/modules/wbase/Base.h:72:1: warning: 'ScriptMeta' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct ScriptMeta { ^ core/modules/wbase/Task.h:41:5: note: did you mean struct here? class ScriptMeta; ^~~~~ struct {code} {code} In file included from core/modules/parser/BoolTermFactory.cc:46: core/modules/query/Predicate.h:86:27: warning: 'lsst::qserv::query::GenericPredicate::putStream' hides overloaded virtual function [-Woverloaded-virtual] virtual std::ostream& putStream(std::ostream& os) = 0; ^ core/modules/query/Predicate.h:71:27: note: hidden overloaded virtual function 'lsst::qserv::query::Predicate::putStream' declared here: different qualifiers (const vs none) virtual std::ostream& putStream(std::ostream& os) const = 0; ^ {code} {code} core/modules/parser/FromFactory.cc:62:15: warning: unused function 'walkToSiblingBefore' [-Wunused-function] inline RefAST walkToSiblingBefore(RefAST node, int typeId) { ^ core/modules/parser/FromFactory.cc:72:1: warning: unused function 'getSiblingStringBounded' [-Wunused-function] getSiblingStringBounded(RefAST left, RefAST right) { ^ {code} {code} In file included from core/modules/wsched/ChunkDisk.cc:25: core/modules/wsched/ChunkDisk.h:130:10: warning: private field '_completed' is not used [-Wunused-private-field] bool _completed; ^ {code} {code} In file included from core/modules/parser/PredicateFactory.cc:45: core/modules/query/Predicate.h:86:27: warning: 'lsst::qserv::query::GenericPredicate::putStream' hides overloaded virtual function [-Woverloaded-virtual] virtual std::ostream& putStream(std::ostream& os) = 0; ^ core/modules/query/Predicate.h:71:27: note: hidden overloaded virtual function 'lsst::qserv::query::Predicate::putStream' declared here: different qualifiers (const vs none) virtual std::ostream& putStream(std::ostream& os) const = 0; ^ {code} {code} core/modules/parser/WhereFactory.cc:265:31: warning: binding reference member 'c' to stack allocated parameter 'c_' [-Wdangling-field] PrintExcept(Check c_) : c(c_) {} ^~ core/modules/parser/WhereFactory.cc:291:28: note: in instantiation of member function 'lsst::qserv::parser::PrintExcept::PrintExcept' requested here PrintExcept p(mc); ^ core/modules/parser/WhereFactory.cc:269:12: note: reference member declared here Check& c; ^ {code} {code} core/modules/rproc/ProtoRowBuffer.cc:44:11: warning: unused variable 'largeRowThreshold' [-Wunused-const-variable] int const largeRowThreshold = 500*1024; ^ {code} {code} core/modules/util/testIterableFormatter.cc:85:43: warning: suggest braces around initialization of subobject 
[-Wmissing-braces] std::array iterable { ""1"", ""2"", ""3"", ""4"", ""5"", ""6""}; ^~~~~~~~~~~~~~~~~~~~~~~~~~~~ { } {code} {code} In file included from core/modules/qdisp/XrdSsiMocks.cc:37: core/modules/qdisp/XrdSsiMocks.h:64:16: warning: private field '_executive' is not used [-Wunused-private-field] Executive *_executive; ^ {code} {code} core/modules/xrdoss/QservOss.cc:77:1: warning: unused function 'print' [-Wunused-function] print(std::ostream& os, lsst::qserv::xrdoss::QservOss::StringSet const& h) { ^ {code} h5. OS X {code} core/modules/qdisp/QueryRequest.h:54:25: warning: 'lsst::qserv::qdisp::BadResponseError::what' hides overloaded virtual function [-Woverloaded-virtual] virtual char const* what() throw() { ^ /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../include/c++/v1/exception:95:25: note: hidden overloaded virtual function 'std::exception::what' declared here: different qualifiers (const vs none) virtual const char* what() const _NOEXCEPT; ^ In file included from core/modules/qdisp/Executive.cc:64: In file included from core/modules/qdisp/XrdSsiMocks.h:37: core/modules/qdisp/QueryRequest.h:67:25: warning: 'lsst::qserv::qdisp::RequestError::what' hides overloaded virtual function [-Woverloaded-virtual] virtual char const* what() throw() { ^ /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../include/c++/v1/exception:95:25: note: hidden overloaded virtual function 'std::exception::what' declared here: different qualifiers (const vs none) virtual const char* what() const _NOEXCEPT; ^ {code} {code} core/modules/proto/TaskMsgDigest.cc:55:5: warning: 'MD5' is deprecated: first deprecated in OS X 10.7 [-Wdeprecated-declarations] MD5(reinterpret_cast(str.data()), ^ /usr/include/openssl/md5.h:116:16: note: 'MD5' has been explicitly marked deprecated here unsigned char *MD5(const unsigned char *d, size_t n, unsigned char *md) DEPRECATED_IN_MAC_OS_X_VERSION_10_7_AND_LATER; ^ {code} {code} core/modules/util/StringHash.cc:78:24: warning: 'SHA1' is deprecated: first deprecated in OS X 10.7 [-Wdeprecated-declarations] return wrapHashHex(buffer, bufferSize); ^ /usr/include/openssl/sha.h:124:16: note: 'SHA1' has been explicitly marked deprecated here unsigned char *SHA1(const unsigned char *d, size_t n, unsigned char *md) DEPRECATED_IN_MAC_OS_X_VERSION_10_7_AND_LATER; ^ core/modules/util/StringHash.cc:83:24: warning: 'SHA256' is deprecated: first deprecated in OS X 10.7 [-Wdeprecated-declarations] return wrapHashHex(buffer, bufferSize); ^ /usr/include/openssl/sha.h:150:16: note: 'SHA256' has been explicitly marked deprecated here unsigned char *SHA256(const unsigned char *d, size_t n,unsigned char *md) DEPRECATED_IN_MAC_OS_X_VERSION_10_7_AND_LATER; ^ {code} h5. Xrootd {code} In file included from core/modules/qdisp/Executive.cc:64: In file included from core/modules/qdisp/XrdSsiMocks.h:33: In file included from /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiRequest.hh:37: /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiRespInfo.hh:43:1: warning: 'XrdSsiRespInfo' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct XrdSsiRespInfo ^ /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiSession.hh:45:1: note: did you mean struct here? 
class XrdSsiRespInfo; ^~~~~ struct {code} {code} core/modules/xrdoss/QservOss.h:64:17: warning: 'lsst::qserv::xrdoss::FakeOssDf::Opendir' hides overloaded virtual function [-Woverloaded-virtual] virtual int Opendir(const char *) { return XrdOssOK; } ^ /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdOss/XrdOss.hh:63:17: note: hidden overloaded virtual function 'XrdOssDF::Opendir' declared here: different number of parameters (2 vs 1) virtual int Opendir(const char *, XrdOucEnv &) {return -ENOTDIR;} ^ {code} {code} In file included from core/modules/xrdsvc/SsiSession.h:32: /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiResponder.hh:177:27: warning: control may reach end of non-void function [-Wreturn-type] } ^ {code} h5. boost {code} /Users/timj/work/lsstsw/stack/DarwinX86/boost/1.55.0.1.lsst2+fbf04ba888/include/boost/regex/v4/regex_raw_buffer.hpp:132:7: warning: 'register' storage class specifier is deprecated [-Wdeprecated-register] register pointer result = end; ^~~~~~~~~ {code}",0.5 +"DM-3804","09/10/2015 16:44:05","Fix order of arguments - run method of meas_base SingleFrameMeasurementTask","In sfm.py on line 271, a comment indicates that some code is a temporary work around until the switch from meas_algorithms to meas_base is complete. This work is complete, so this temporary workaround should be removed, or if it is decided it should be kept, the comment should be removed. See https://github.com/lsst/meas_base/blob/tickets/DM-2915/python/lsst/meas/base/sfm.py#L271",2 +"DM-3808","09/10/2015 20:11:13","Setup lsst_sphinx_kit package structure","Setup the lsst_sphinx_kit package, including * setup.py * unit tests, tox and Travis CI * README stub * Sphinx stub and readthedocs",0.5 +"DM-3816","09/13/2015 10:11:24","levels in DecamMapper.paf is not quite right","When ccdnum is not given as part of the dataId, instead of iterating over it, an error like this happens {code:java} RuntimeError: No unique lookup for ['ccdnum'] from {'visit': 205344}: 61 matches {code} Likely a problem in policy/DecamMapper.paf",1 +"DM-3821","09/14/2015 08:15:37","Recent CModel bugfixes from HSC","I've just fixed two rather critical bugs in the CModel code on the HSC side (they would have been introduced on the LSST side in the last transfer, DM-2977): - The {{minInitialRadius}} configuration parameter had a default that is too small, causing many galaxies to be fit with point source models, leading to bad star/galaxy classifications. This is HSC-1306. - There was a simple but important algebra error in the uncertainty calculation, making the uncertainty a strong function of magnitude. This is HSC-1313. 
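For the first fix, the change is plausibly just a config default; a hedged sketch of the sort of override involved (the exact plugin name and value here are assumptions, not taken from the HSC patch):
{code}
# Hypothetical config override: raise minInitialRadius so faint galaxies are
# not collapsed onto point-source models.
config.measurement.plugins['modelfit_CModel'].minInitialRadius = 0.2
{code}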
On the LSST side, the transfer should be quite simple; we'll have to rewrite a bit of code due to the difference in measurement frameworks, but there was very little to begin with (most of the effort in the HSC issues was in debugging).",1 +"DM-3836","09/14/2015 12:41:42","Migrate LDM-129 to new design docs platform","Convert LDM-129 from Word to restructuredText and deploy onto readthedocs.org",2 +"DM-3840","09/14/2015 13:58:21","LSE-72: Phase 3 in X16","Advance Phase 3 details as needed to eliminate obstacles to OCS and DM development during F16.",8 +"DM-3841","09/14/2015 14:01:34","LSE-75: Refine WCS and PSF requirements in W16","Clarify the data format and precision requirements of the TCS (or other Telescope and Site components) on the reporting of WCS and PSF information by DM on a per-image basis. Depends on the ability of the T&S group to engage with this subject. Current PMCS deadline for Phase 3 readiness of LSE-75 is 29-Sep-2015.",8 +"DM-3850","09/14/2015 19:06:16","Nebula metadata service is intermittent","Upon restarting one of my nebula instances (ktl-test), I noticed a failure in the logs: {quote} Sep 14 18:07:29 ktl-test cloud-init: 2015-09-14 18:07:29,157 - util.py[WARNING]: Failed fetching metadata from url http://169.254.169.254/latest/meta-data {quote} Attempting to retrieve that URL seems to randomly vary between succeeding, which returns: {quote} ami-id ami-launch-index ami-manifest-path [...] {quote} and failing, which returns: {quote} 500 Internal Server Error

500 Internal Server Error

Remote metadata server experienced an internal server error.

{quote} These failures may be contributing to observed sporadic {{ssh}} key injection failures.",2 +"DM-3860","09/15/2015 13:38:38","Communication Toolchain support","This epic covers support of communication tools primarily used by DM and/or supported by DM on behalf of other parts of the project - JIRA, Discourse, Hipchat, etc The source of this work is primarily driven by short-term user requests, and so the outcome is timeboxed rather than planned. [JS 50% FE 50%] ",20 +"DM-3863","09/15/2015 14:17:14","Web design fixes DM Design Documents on Sphinx/Read The Docs","Solve fit-and-finish issues with the stock readthedocs.org Sphinx template when rendering DM design documents. Issues include: * Sections need to be numbered and those numbers need to appear in TOC * RTD's TOC does not properly collapse sub-topics * Appropriate styling for document title and author list * Wrapping the changelog table * Adapt section references so that just the section number can be referenced, independently of the section number and title in combination * Section labels given explicitly in the reST markup are different from the anchors that Sphinx gives to the {{}}tags; the former are simply divs inserted in the HTML. The solutions may involve # reconfiguring the Sphinx installation of individual documents # forking the RTD HTML template, and/or # developing extensions for Sphinx in {{sphinxkit}}.",2 +"DM-3887","09/16/2015 09:28:05","Review ICD flowdown to DMSR and design documents","This epic will result in a plan for how DM will manage the ICD requirements and their relationship to LSE-61 (DMSR). This plan should make it easier for authors of design documents to understand which requirements are relevant.",8 +"DM-3892","09/16/2015 11:58:27","Review current version of LSE-78, prepare for LCR","Do a comprehensive read-through of the previous released version of LSE-78. Look for self-consistency and for consistency with the rest of the DM and overall system design. Report issues to appropriate people.",3 +"DM-3898","09/16/2015 12:44:10","Fix xrootd compiler warnings with clang","h5. Xrootd {code} In file included from core/modules/qdisp/Executive.cc:64: In file included from core/modules/qdisp/XrdSsiMocks.h:33: In file included from /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiRequest.hh:37: /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiRespInfo.hh:43:1: warning: 'XrdSsiRespInfo' defined as a struct here but previously declared as a class [-Wmismatched-tags] struct XrdSsiRespInfo ^ /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiSession.hh:45:1: note: did you mean struct here? 
class XrdSsiRespInfo; ^~~~~ struct {code} {code} core/modules/xrdoss/QservOss.h:64:17: warning: 'lsst::qserv::xrdoss::FakeOssDf::Opendir' hides overloaded virtual function [-Woverloaded-virtual] virtual int Opendir(const char *) { return XrdOssOK; } ^ /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdOss/XrdOss.hh:63:17: note: hidden overloaded virtual function 'XrdOssDF::Opendir' declared here: different number of parameters (2 vs 1) virtual int Opendir(const char *, XrdOucEnv &) {return -ENOTDIR;} ^ {code} {code} In file included from core/modules/xrdsvc/SsiSession.h:32: /Users/timj/work/lsstsw/stack/DarwinX86/xrootd/u.timj.DM-3584-ge22410fa7f+da39a3ee5e/include/xrootd/XrdSsi/XrdSsiResponder.hh:177:27: warning: control may reach end of non-void function [-Wreturn-type] } ^ {code}",0.5 +"DM-3910","09/18/2015 04:22:27","Run and document multi-node test with docker","In order to validate the Docker setup on the CC-IN2P3 cluster, we need to run some tests on consistent data. The S15 LargeScaleTest data doesn't seem to be compliant with the latest Qserv version, so running the multi-node test would be interesting. Nevertheless, the multi-node setup doesn't seem to be documented and, hence, is difficult to reproduce.",3 +"DM-3911","09/18/2015 11:27:19","HSC backport: avoid I/O race conditions in config write-out","This is a port of [HSC-1106|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1106]. When running tasks that write out config settings files ({{processCcd.py}}, for example), if multiple processes start simultaneously, an I/O race condition can occur in writing these files. This is solved here by writing to temp files and then renaming them to the correct destination filename in a single operation. Also, to avoid similar race conditions in the backup file creation (e.g. config.py~1, config.py~2, ...), a {{--no-backup-config}} option (to be used with --clobber-config) is added here to prevent the backup copies from being made. The outcome for this option is that the config files that are still recorded are those from the most recent run.",1 +"DM-3912","09/18/2015 14:16:30","some ctrl_events tests execute outside of execution domain","There are a couple of ctrl_events tests that attempt to execute outside of the execution domains in which they are valid, when they shouldn't. There's a check in place for tests to catch this, but a couple of the tests do not include it.",0.5 +"DM-3922","09/21/2015 14:41:19","Update multi-node setup documentation","Workers in the multi-node setup no longer require granting mysql permissions for test datasets since direct mysql connections are no longer used by the data loader.",1 +"DM-3926","09/22/2015 11:58:04","Implement iostream-style formatting in log package","Implement the change to log macros proposed in RFC-96. This ticket only covers defining the new set of macros (LOGS() and friends), which use ostringstream for formatting messages. Migration of all clients and removal of the LOGF macros will be done in a separate ticket.",1 +"DM-3940","09/24/2015 06:02:33","NaiveDipoleCentroid/NaiveDipoleFlux algorithms should not require centroid slot","The {{NaiveDipoleCentroid}} and {{NaiveDipoleFlux}} algorithms in {{ip_diffim}} have members which are instances of {{meas::base::SafeCentroidExtractor}}. Due to the prerequisites that imposes, it is impossible to initialize these algorithms without first defining a {{centroid}} slot. 
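To illustrate the restriction, this is roughly what initializing these plugins currently forces on callers (a sketch assuming a stock afw setup; the slot target chosen here is arbitrary):
{code}
import lsst.afw.table as afwTable

schema = afwTable.SourceTable.makeMinimalSchema()
# A centroid slot must be defined purely to satisfy SafeCentroidExtractor,
# even though neither algorithm ever reads it.
schema.getAliasMap().set('slot_Centroid', 'base_SdssCentroid')
# ... only now can NaiveDipoleCentroid/NaiveDipoleFlux be constructed ...
{code}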
However, there is nothing in these algorithms which actually uses the {{SafeCentroidExtractor}} or any of the information stored in the slot; this seems to be an entirely arbitrary restriction which is likely a legacy of the port to the {{meas_base}} framework. We should remove the use of {{SafeCentroidExtractor}} to simplify the code and make it easier to run the test suite (since it will no longer be necessary to run a centroider).",0.5 +"DM-3943","09/24/2015 16:13:58","QMeta thread safety","The initial QMeta implementation is not thread safe; it uses the sql/mysql modules, which also do not have any protection (there are some mutexes there, but they are not used). We need an urgent fix to avoid crashes due to concurrent queries in the czar.",1 +"DM-3947","09/25/2015 17:55:18","Remove dependency on mysqldb in wmgr","Move the remaining code that depends on mysqldb to the db module.",0.5 +"DM-3949","09/25/2015 21:27:30","Remove dependency on mysqldb in qserv","Remove remaining dependencies on mysqldb in qserv: {code} ./core/modules/tests/MySqlUdf.py ./core/modules/wmgr/python/config.py {code} and use sqlalchemy from the db module instead.",2 +"DM-3951","09/28/2015 03:04:19","Remove qserv_objectId restrictor","The qserv_objectId restrictor can be replaced by the IN restrictor. This story involves checking whether performance is acceptable if we use the IN restrictor instead of the qserv_objectId restrictor and, if it is, doing the switch and removing the qserv_objectId restrictor code.",3 +"DM-3957","09/28/2015 13:01:39","Enable CModel in CalibrateTask prior to PhotoCal","CModel needs to run in CalibrateTask before PhotoCal in order to compute aperture corrections, but it also needs a Calib object as input, and that isn't available until after PhotoCal is run. On the HSC side, we dealt with this by adding a preliminary PhotoCal run before CModel is run, but we could also deal with it by removing the need for a Calib as input, at least in some situations.",1 +"DM-3971","09/29/2015 02:34:48","Package sqlalchemy in eups","The db module is expected to be used by science pipelines, and (per K-T, see qserv hipchat room) we have to package it through eups.",1 +"DM-3980","09/30/2015 10:07:32","Post SQLAlchemy-migration tweaks","Implement some minor tweaks that came in late through PR comments, mostly related to the SQLAlchemy migration.",0.5 +"DM-3981","09/30/2015 11:42:58","Improve the performance for making the image plot","Make FitsRead work better with multiple threads.",1 +"DM-3982","09/30/2015 16:40:45","CalibrateTask is incompatible with older astrometry tasks","[~lauren] and I recently discovered an incompatibility between {{CalibrateTask}}'s schema-handling and the older astrometry tasks, such as {{ANetAstrometryTask}} (the problem affects any astrometry task that utilizes its {{schema}} constructor argument). The problem is that astrometry is called twice in {{CalibrateTask}}: - Just before PSF estimation, in order to get matches to support {{CatalogStarSelector}}. This call is passed {{sources1}}, a catalog containing only the initial, pre-PSF measurements. - After the second measurement stage, to determine the {{Wcs}} and get matches to feed {{PhotoCal}}. This call is passed {{sources}}. The problem is that {{sources.schema != sources1.schema}}, and the astrometry subtask is initialized with {{schema1 == sources1.schema}}. 
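A small illustration of the mismatch (a sketch using afw directly; the extra field name is ours, not CalibrateTask's):
{code}
import lsst.afw.table as afwTable

schema1 = afwTable.SourceTable.makeMinimalSchema()
# ... the astrometry subtask is constructed against schema1 ...
mapper = afwTable.SchemaMapper(schema1)
mapper.addMinimalSchema(schema1)
schema = mapper.getOutputSchema()
schema.addField('extra_measurement', type='D', doc='second-stage field')
assert schema != schema1  # the subtask is later handed catalogs with this schema
{code}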
So if the astrometry task needs to use fields it added to the schema (the default {{AstrometryTask}} does not), it will likely fail in the second run, as it's being given a catalog that doesn't correspond to the schema it was initialized with. Comments in the code indicate that there's a desire that future {{AstrometryTask}} s will not take a schema argument, and that would mean that they could be called repeatedly with different schemas (as we are doing). I'm not sure that's viable; I think it'd probably be appropriate for {{AstrometryTask}} to set flags indicating the sources it's using (or considered using), as the other calibrate subtasks do. Instead, I think we're probably best off creating two separate astrometry subtasks, each with a different schema, and using the appropriate subtask for each input/output catalog. Clearly we need to come up with a better solution in the upcoming {{CalibrateTask}} redesign; two subtasks is obviously clunky. Whether this is high-priority before then depends on how much we're dependent on {{ANetAstrometryTask}} as a fallback option, and I haven't been following that conversation closely.",0 +"DM-3987","10/01/2015 10:25:19","remove unnecessary 'psf' arg to SourceDeblendTask.run()","{{SourceDeblendTask.run}} takes both an {{Exposure}} and a {{Psf}}, even though it can get the latter from the former and always should.",1 +"DM-3993","10/01/2015 11:07:10","Display.dot origin swaps x and y","Correcting for xy0 in {{dot}} currently does: {code:hide-linenum} r -= x0 c -= y0 {code} which is backwards.",0.5 +"DM-4002","10/01/2015 14:02:55","Add doc dir and main.dox to obs_test","No obs_* packages have a doc dir. I suggest starting with obs_test since it is perhaps less obvious what it is used for than the others and in some ways acts as a template.",0 +"DM-4003","10/01/2015 14:20:04","Replace zookeeper CSS with mysql","To switch from QservAdmin to the CssAccess interface in our Python tools, we will need to replace zookeeper with the mysql implementation because we do not have a C++ KvInterface implementation for zookeeper.",2 +"DM-4009","10/02/2015 13:04:33","Allow FlagHandler to be used from Python","The {{FlagHandler}} utility class makes it easier to manage the flags for a measurement algorithm, and using it also makes it possible to use the {{SafeCentroidExtractor}} and {{SafeShapeExtractor}} classes. Unfortunately, its constructor requires arguments that can only be provided in C++. A little extra Swig wrapper code should make it usable in Python as well.",3 +"DM-4014","10/05/2015 17:56:47","Replace boost::tuple with std::tuple","Replace boost::tuple with std::tuple. This ticket will be completed as part of the DM boot camp at UW.",1 +"DM-4020","10/06/2015 10:00:18","Remove #!/usr/bin/env from pipe_tasks library code","Many tasks in pipe_tasks start with #!/usr/bin/env..., yet are not executable.",0 +"DM-4021","10/06/2015 16:00:25","Replace boost::unordered_map with std::unordered_map","DM boot camp tutorial. 
",2 +"DM-4022","10/07/2015 08:31:57","forcedPhotCoadd.py fails on HSC data","When trying to run {{forcedPhotCoadd.py}} on HSC data, I see the following error: {code} $ forcedPhotCoadd.py /raid/swinbank/rerun/LSST/bootcamp --id filter='HSC-I' tract=0 patch=7,7 : Loading config overrride file '/nfs/home/swinbank/obs_subaru/config/forcedPhotCoadd.py' Cannot import lsst.meas.extensions.photometryKron: disabling Kron measurements Traceback (most recent call last): File ""/home/lsstsw/stack/Linux64/meas_base/11.0+2/bin/forcedPhotCoadd.py"", line 24, in ForcedPhotCoaddTask.parseAndRun() File ""/home/lsstsw/stack/Linux64/pipe_base/11.0+2/python/lsst/pipe/base/cmdLineTask.py"", line 433, in parseAndRun parsedCmd = argumentParser.parse_args(config=config, args=args, log=log, override=cls.applyOverrides) File ""/home/lsstsw/stack/Linux64/pipe_base/11.0+2/python/lsst/pipe/base/argumentParser.py"", line 360, in parse_args self._applyInitialOverrides(namespace) File ""/home/lsstsw/stack/Linux64/pipe_base/11.0+2/python/lsst/pipe/base/argumentParser.py"", line 475, in _applyInitialOverrides namespace.config.load(filePath) File ""/home/lsstsw/stack/Linux64/pex_config/11.0/python/lsst/pex/config/config.py"", line 529, in load self.loadFromStream(stream=code, root=root) File ""/home/lsstsw/stack/Linux64/pex_config/11.0/python/lsst/pex/config/config.py"", line 549, in loadFromStream exec stream in {}, local File ""/nfs/home/swinbank/obs_subaru/config/forcedPhotCoadd.py"", line 10, in config.deblend.load(os.path.join(os.environ[""OBS_SUBARU_DIR""], ""config"", ""deblend.py"")) AttributeError: 'ForcedPhotCoaddConfig' object has no attribute 'deblend' {code} This is with the stack version 11.0+3 and {{obs_subaru}} 5.0.0.1-676-g4ae362c.",0.5 +"DM-4036","10/09/2015 16:12:52","Change from boost::math","Most boost::math contents (not including pi) are now available in standard C++. Please convert the code accordingly. In addition to the packages listed above, boost/math is used in ""partition"" a package I don't recognize and not a component JIRA accepts.",5 +"DM-4043","10/09/2015 23:19:01","update memory management in jointcal","jointcal currently uses a combination of raw pointers and a custom reference-counted smart pointer class, {{CountedRef}} (similar to {{boost::intrusive_ptr}}). The code needs to be modified to use a combination of {{shared_ptr}} (most code), {{unique_ptr}} local-scope variables and factory functions, and {{weak_ptr}} (at least some will be necessary to avoid cycles in some of the more complex data structures). As part of this work, we'll also have to remove a lot of inheritance from {{RefCounted}}, which is part of the {{CountedRef}} implementation. This ticket looks like it will require a lot of work, because we'll have to be careful about every conversion to avoid cycles and memory leaks. 
Nevertheless, I think it will be necessary to do this conversion before attempting any other major refactoring, as I'm worried that having a newcomer make changes to the codebase without first making the memory management less fragile could be very dangerous.",8 +"DM-4063","10/12/2015 11:02:15","Support new casting requirements in NumPy 1.10","The function imagesDiffer() in testUtils attempts to OR an array of uint16s (LHS) against an array of bools (RHS) {{valSkipMaskArr |= skipMaskArr}} and errors with the message {code} TypeError: ufunc 'bitwise_or' output (typecode 'H') could not be coerced to provided output parameter (typecode '?') according to the casting rule ''same_kind'' {code} preventing afw from building correctly. ",1 +"DM-4065","10/12/2015 11:26:11","Discuss with MySQL team","This story captures issues/topics that we want to bring up with the MySQL team.",2 +"DM-4071","10/12/2015 12:12:33","testPsfDetermination broken due to NumPy behaviour change","Old NumPy behaviour (tested on 1.6.2): {code} In [1]: import numpy In [2]: a = numpy.array([]) In [3]: numpy.median(a) /usr/lib64/python2.6/site-packages/numpy/core/fromnumeric.py:2374: RuntimeWarning: invalid value encountered in double_scalars return mean(axis, dtype, out) Out[3]: nan {code} New NumPy behaviour (1.10.0): {code} In [1]: import numpy In [2]: a = numpy.array([]) In [3]: numpy.median(a) [...] IndexError: index -1 is out of bounds for axis 0 with size 0 {code} This breaks {{testPsfDeterminer}} and {{testPsfDeterminerSubimage}}, e.g.: {code} ERROR: testPsfDeterminerSubimage (__main__.SpatialModelPsfTestCase) Test the (PCA) psfDeterminer on subImages ---------------------------------------------------------------------- Traceback (most recent call last): File ""./testPsfDetermination.py"", line 342, in testPsfDeterminerSubimage trimCatalogToImage(subExp, self.catalog)) File ""/Users/jds/Projects/Astronomy/LSST/src/meas_algorithms/python/lsst/meas/algorithms/objectSizeStarSelector.py"", line 377, in selectStars widthStdAllowed=self._widthStdAllowed) File ""/Users/jds/Projects/Astronomy/LSST/src/meas_algorithms/python/lsst/meas/algorithms/objectSizeStarSelector.py"", line 195, in _kcenters centers[i] = func(yvec[clusterId == i]) File ""/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/lib/function_base.py"", line 3084, in median overwrite_input=overwrite_input) File ""/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/lib/function_base.py"", line 2997, in _ureduce r = func(a, **kwargs) File ""/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/lib/function_base.py"", line 3138, in _median n = np.isnan(part[..., -1]) IndexError: index -1 is out of bounds for axis 0 with size 0 {code}",0.5 +"DM-4075","10/13/2015 17:19:47","Assemble eslint rules for JavaScript code quality control","Review and assemble eslint rules, which enforce clean JavaScript and JSX code. Code cleanup to avoid too many rule violations.",8 +"DM-4076","10/13/2015 17:26:41","JavaScript code cleanup - remove unused packages","Remove es6-promise and react-modal; other cleanup.",2 +"DM-4078","10/13/2015 21:44:57","convert underscore to lodash","lodash has become a superset of underscore, providing more consistent API behavior, more features, and more thorough documentation. 
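(Returning to the failure in DM-4063 above, a minimal reproduction and the explicit-cast fix; the array contents here are made up:)
{code}
import numpy as np

valSkipMaskArr = np.zeros(4, dtype=np.uint16)
skipMaskArr = np.array([True, False, True, False])
# Under NumPy 1.10 an in-place |= between uint16 and bool violates the
# 'same_kind' casting rule; cast the boolean mask explicitly instead:
valSkipMaskArr |= skipMaskArr.astype(valSkipMaskArr.dtype)
{code}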
We'd like to convert our underscore package dependencies to lodash while we have only ~20 calls to underscore functions.",2 +"DM-4080","10/14/2015 10:19:18","Shutdown mechanism doesn't work when logging process is disabled.","If the logging mechanism is turned off in ctrl_execute, the ctrl_orca Logger doesn't get launched. The current shutdown mechanism waits for the last logging message to be transmitted before shutting down so it doesn't kill off that process. If the logger.launch config file option is set to false, this process never gets launched and ctrl_orca hangs after the shutdown waiting for the message to arrive.",8 +"DM-4085","10/14/2015 11:29:08","Attend DM boot camp","Attend DM boot camp to learn more about the DM stack, the butler, and tasks. ",2 +"DM-4086","10/14/2015 11:30:13","Attend DM boot camp","Attend DM boot camp to learn more about the DM stack, the butler, and tasks. Most of the presentations are located at URL https://community.lsst.org/t/dm-boot-camp-announcement/249. Presentations like afw, eups, tasks, and butler are necessary to participate in LSST, so everyone on LSST must understand these concepts. Look at the list of presentations covering these topics and make sure you understand them. Some of the remaining talks go into more detail or cover more specialized topics. Those talks should be scanned to see if they are of interest to you.",3 +"DM-4087","10/14/2015 11:32:24","Attend DM boot camp ","Attend DM boot camp to learn more about the DM stack, the butler, and tasks. ",3 +"DM-4092","10/14/2015 14:25:18","Update qserv for latest xrootd","A small API change in the latest xrootd requires a parallel change to qserv. Paves the way for DM-2334.",0.5 +"DM-4095","10/15/2015 14:05:04","Please port showVisitSkyMap.py from HSC","The HSC documentation at http://hsca.ipmu.jp/public/scripts/showVisitSkyMap.html includes a useful script for displaying the skymap and CCDs from a set of visits. It would be convenient if a version of this script was available in the LSST stack.",1 +"DM-4098","10/16/2015 00:45:17","Update Trust Level of all LSST DM Staff to Level 4 via the API","It seems safe to update the Discourse trust level of all members of the LSSTDM group on community.lsst.org to Level 4 (full permissions). See https://meta.discourse.org/t/consequences-of-using-or-bypassing-trust-levels-for-company-organization-staff/34564?u=jsick This should alleviate concerns that DM staff are being prevented from fully using the forum. This ticket implements a small notebook to exercise the Discourse API to make this trust level migration possible.",0.5 +"DM-4100","10/16/2015 14:18:50","Replace use of image <<= with [:] in python code","Replace all use of the afw image pixel copy operator {{<<=}} with {{\[:]}} in Python code. See DM-4102 for the C++ version. 
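The mechanical shape of the Python-side replacement (a sketch with throwaway images):
{code}
import lsst.afw.image as afwImage

lhs = afwImage.ImageF(4, 4)
rhs = afwImage.ImageF(4, 4)
rhs.set(1.0)
lhs[:] = rhs   # replaces the deprecated: lhs <<= rhs
{code}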
These can be done independently.",2 +"DM-4102","10/16/2015 16:21:44","Remove use of <<= from C++ code in our stack","Replace usage of deprecated Image operator {{<<=}} in C++ code with {{assign(rhs, bbox=Box2I(), origin=PARENT)}} as per RFC-102. Switch from [:] to assign pixels in Python code where an image view is created for the sole purpose of assigning pixels (thus turning 2-4 lines of code to one and eliminating the need to make a view).",3 +"DM-4104","10/16/2015 16:28:23","Document that <<= is deprecated","Update our documentation to replace Image {{<<=}} pixel copy operator with {{[:] =}} for Python code and {{set(rhs)}} for C++ code (or whatever RFC-102 decides), with a note that {{<<=}} exists but is deprecated.",0 +"DM-4105","10/16/2015 16:33:00","Update user documentation","Support for the {{ORDER BY}}, {{objectId IN}} and {{objectId BETWEEN}} predicates has been improved; this should be documented. ",1 +"DM-4117","10/16/2015 18:24:17","Clean up lsst_stack_docs for preview","Improve the presentation of the New docs overall: # Add a Creative Commons license # Remove stub documents from the presentation # Put READMEs in all doc directories to explain what content will go in them # Clean up and update the source installation guide to reflect 11_0",0.5 +"DM-4125","10/20/2015 09:37:29","pipe_tasks/examples/calibrateTask.py fails","The self contained example calibrateTask.py in pipe_tasks/examples/ fails when attempting to set field ""coord"" in refCat. Exact error message - {code} 11:04:19-vish~/lsst/pipe_tasks (u/lauren/DM-3693)$ examples/calibrateTask.py --ds9 calibrate: installInitialPsf fwhm=5.40540540548 pixels; size=15 pixels calibrate.repair: Identified 7 cosmic rays. calibrate.detection: Detected 4 positive sources to 5 sigma. calibrate.detection: Resubtracting the background after object detection calibrate.initialMeasurement: Measuring 4 sources (4 parents, 0 children) Traceback (most recent call last): File ""examples/calibrateTask.py"", line 150, in run(display=args.ds9) File ""examples/calibrateTask.py"", line 119, in run result = calibrateTask.run(exposure) File ""/home/vish/lsst/lsstsw/stack/Linux64/pipe_base/11.0-2-g8218aaa+5/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/vish/lsst/pipe_tasks/python/lsst/pipe/tasks/calibrate.py"", line 478, in run astromRet = self.astrometry.run(exposure, sources1) File ""examples/calibrateTask.py"", line 90, in run m.set(""coord"", wcs.pixelToSky(s.getCentroid())) File ""/home/vish/lsst/lsstsw/stack/Linux64/afw/11.0-5-g97168e0+1/python/lsst/afw/table/tableLib.py"", line 2372, in set self.set(self.schema.find(key).key, value) File ""/home/vish/lsst/lsstsw/stack/Linux64/afw/11.0-5-g97168e0+1/python/lsst/afw/table/tableLib.py"", line 1064, in find raise KeyError(""Field '%s' not found in Schema."" % k) KeyError: ""Field 'coord' not found in Schema."" {code} Note that {{wcs.pixelToSky(s.getCentroid())}} is set to {{Fk5Coord(15.007663073114244 * afwGeom.degrees, 1.0030133772819259 * afwGeom.degrees, 2000.0)}}",0.5 +"DM-4129","10/20/2015 15:09:29","Database connection problems in daf_ingest","The DbAuth connection fallback in ingestCatalogTask passes the ""password"" keyword argument to {{MySQLdb.connect}} instead of ""passwd"", which fails. Also, the ""port"" command line argument isn't marked as an integer, causing port strings to be passed down to MySQLdb. This results in a type error. 
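A hedged sketch of the corrected call (the connection values are placeholders): {code}
import MySQLdb

# MySQLdb.connect takes the 'passwd' keyword, not 'password',
# and the port must be passed as an int, not a string
conn = MySQLdb.connect(host='localhost', port=int('3306'),
                       user='user', passwd='secret')
{code}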
",0.5 +"DM-4133","10/20/2015 23:00:57","Change type of LTV1/2 from int to float when writing afw images to FITS","The LTV1/2 problem is originally my bug. I used integer LTV1/2 in {code} afw/src/image/ExposureInfo.cc: data.imageMetadata->set(""LTV1"", -xy0.getX()); afw/src/image/ExposureInfo.cc: data.imageMetadata->set(""LTV2"", -xy0.getY()); {code} whereas a more careful reading of the NOAO page [http://iraf.noao.edu/projects/ccdmosaic/imagedef/imagedef.html] introducing them includes floating point examples. The fix is to cast the XY0 values to float. I'm not sure if there'll be any side effects of fixing this, but if so they'll be obvious and trivial. ",1 +"DM-4137","10/21/2015 13:13:18","Update DECam CCDs gain, read noise, and saturation values","The values of DECam gain, read noise, and saturation value need to be updated. This ticket is to update them in the Detector amplifier information, which is used in IsrTask. Talked to Robert Gruendl. These values should take precedence over the values in the fits header. They seem stable and do not seem to vary with time. ",1 +"DM-4141","10/21/2015 20:47:59","cmdLineTasks should provide proper unix return codes","When a cmdLineTask fails it doesn't appear to return a non-0 exit code to the shell, making it hard to write shell scripts that chain commands together. Please fix this. E.g. {code} $ bin/assembleCoadd.py /lustre/Subaru/SSP --rerun yasuda/SSP3.8.5_20150810_cosmos:rhl/brightObjectMasks --id tract=9813 patch=5,5 filter=HSC-I --selectId ccd=0..103 visit=1238..1246:2 -c doMaskBrightObjects=True && echo ""Success"" ... 2015-10-22T02:44:13: assembleCoadd FATAL: Failed in task initialization: Your Eups versions have changed. The difference is: --- +++ @@ -48 +48 @@ -obs_subaru HSC-3.11.0a_hsc /data1a/ana/products2014/Linux64/obs_subaru/HSC-3.11.0a_hsc +obs_subaru LOCAL:/home/rhl/LSST/obs/subaru rev:ef3c892f clean-working-copy @@ -55 +55 @@ -pipe_tasks LOCAL:/home/rhl/LSST/pipe/tasks-HSC-1342 rev:84b0f3c4 2 files changed, 5 insertions(+), 6 deletions(-) +pipe_tasks LOCAL:/home/rhl/LSST/pipe/tasks-HSC-1342 rev:9e8ed18b 2 files changed, 47 insertions(+), 42 deletions(-) @@ -60 +59,0 @@ -pyflakes git /home/rhl/Src/pyflakes Please run with --clobber-config to override Success {code}",1 +"DM-4143","10/21/2015 23:52:51","Demonstrate using Breathe for Python & C++ API reference in New Docs","Demonstrate use of breathe for utilizing the existing Doxygen API documentation in the new Sphinx-based doc platform.",3 +"DM-4145","10/22/2015 09:42:56","Reduce scons output in qserv","Yesterday AndyH expressed a valid concern that qserv prints too much info which makes it hard to find errors. By default scons prints whole command line for C++ compilation and linking which are quite long (~half screen depending on your screen size). Most of the time we don't need to see that, so it would be better to replace that with shorter messages like ""Compiling Something.cxx"" and have an option to print full command with --verbose option.",0.5 +"DM-4151","10/22/2015 16:36:03","Search for uses of current afw.wcs in the stack","Search through the stack for all the uses of our Wcs implementation (Wcs, TanWcs, makeWcs, and any other hidden objects) and make a list of all of those uses (on Community for example). 
This list should note whether the usage is in C++ or python.",2 +"DM-4158","10/23/2015 06:23:07","Allow configuring more statistical options for assembleCoadds.py ","The assembleCoadd.py task has a configuration option doSigmaClip which chooses between MEAN and MEANCLIP. Please replace this with an option to specify the algorithm to be used. Note that afwMath.stringToStatisticsProperty can be used to convert a string to an enum value. In particular, this would allow me to specify a MEDIAN stack if desired (e.g. to make pretty RGB images) ",3 +"DM-4160","10/23/2015 06:38:36","Unused variables in meas.algorithms.utils","Pyflakes 1.0.0 reports: {code} $ pyflakes-2.7 utils.py utils.py:232: local variable 'chi2' is assigned to but never used utils.py:481: local variable 'numCandidates' is assigned to but never used utils.py:482: local variable 'numBasisFuncs' is assigned to but never used utils.py:487: local variable 'ampGood' is assigned to but never used utils.py:492: local variable 'ampBad' is assigned to but never used {code} In the best case, those variables are simply unnecessary, and they should be removed to simplify the code and avoid wasting time. Alternatively, it's possible that they ought to be used elsewhere in the calculation but have been omitted accidentally. Please establish this for each one, then either remove them or fix the rest of the code.",1 +"DM-4165","10/23/2015 11:22:06","Take upstream boost 1.59 patch to squelch warnings for gcc 5.2.1","Under gcc 5.2.1, use of boost 1.59.0 produces a torrent of compiler warnings from within boost headers about use of deprecated std::auto_ptr (see https://svn.boost.org/trac/boost/ticket/11622). A patch for this is already committed upstream in boost. It is proposed that we take this patch into the lsst t&p in the interim until the next official boost release.",0.5 +"DM-4194","10/26/2015 16:59:15","Python LogHandler does not pass logger name to log4cxx","Not sure how or why it happened, but presently the Python LogHandler for lsst.log does not pass the logger name to the log4cxx layer and all messages from Python logging end up in the root logger. ",1 +"DM-4202","10/27/2015 14:14:49","Revert temporary disabling of CModel in config override files","Revert the temporary disabling of CModel that relates to a bug noted in DM-4033 that was causing too many failures to test that processCcd (etc.) would run all the way to completion (most of the other fixes/updates related to the initial disabling in the multiband tasks have now been completed in DM-2977 & DM-3821). Relevant files: {code} config/processCcd.py config/forcedPhotCcd.py config/forcedPhotCoadd.py config/measureCoaddSources.py {code} ",1 +"DM-4206","10/28/2015 10:13:52","wmgr should delete database from inventory when dropping it","When wmgr drops a database it should also clean up the chunk inventory for that database. ",2 +"DM-4219","10/29/2015 15:12:53","Package capnproto for eups","Prototype is now getting to the point where a wire-protocol package like capnproto or protobuf is needed. capnproto is the new hotness, and we're probably going to want to migrate qserv from protobuf->capnproto at some point. This task is to go ahead and get capnproto packaged and published for use in the replication prototype.",2 +"DM-4223","10/30/2015 07:15:26","IsrTask calls removeFringe in FringeTask but the method does not exist","The method {{removeFringe}} of {{FringeTask}} is called in {{IsrTask}} but there is no {{removeFringe}}. 
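A quick interactive check illustrating the missing method (module path assumed): {code}
from lsst.ip.isr.fringe import FringeTask

hasattr(FringeTask, 'removeFringe')  # returns False: the method IsrTask calls does not exist
{code}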
Not sure if {{removeFringe}} was meant to be a placeholder",1 +"DM-4225","10/30/2015 11:07:43","Collect single-host performance data for secondary index","Run production-scale (billions of entries) tests on different index options, collect performance statistics for allocation (CPU, memory) and for queries.",3 +"DM-4229","10/30/2015 11:31:52","Identify candidate technology for secondary index","Evaluate results of production-scale performance tests, both single and multiple host. Identify the technology most likely to meet requirements, and estimate performance capability with respect to those requirements",3 +"DM-4230","10/30/2015 12:30:38","Port HSC-1355: Improved fringe subtraction","[HSC-1355|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1355]: ""with this fix, we get much better fringe subtraction"".",2 +"DM-4232","11/02/2015 10:54:08","Variance is set after dark subtraction","In the default {{IsrTask}}, the variance is currently set after dark subtraction. This means that photon noise from the dark is not included in the variance plane, which is incorrect. The variance should be set after bias subtraction and before dark subtraction. [~hchiang2] also points out (DM-4191) that the {{AssembleCcdTask}} with default parameters requires amplifier images with variance planes, even though the variance cannot be set properly until after full-frame bias subtraction. I believe that {{AssembleCcdTask}} only requires a variance plane in the amp images because it does an ""effective gain"" calculation, but I suggest that this isn't very useful (an approximation of an approximation, and you're never going to use that information anyway because it's embedded in the variance plane with better fidelity). I therefore suggest that this effective gain calculation be stripped out and that {{AssembleCcdTask}} not require variance planes.",5 +"DM-4236","11/02/2015 14:00:55","Specify default output location for CmdLineTasks","When neither {{\--output}} nor {{\--rerun}} is specified as an argument to a {{CmdLineTask}}, any output from that task appears to be written back to the input repository. Note the use of the term ""appears"": from a preliminary inspection of the code and documentation, it's not clear if this behaviour can be overridden e.g. by environment variables. The HSC stack behaves differently, using {{$INPUT/rerun/$USER}} as a default output location. A [brief discussion|https://community.lsst.org/t/new-argument-parser-behavior-rerun-flag-introduction-discussion/345] suggests that this is the preferred behaviour. Please update the LSST stack to match the HSC behaviour.",2 +"DM-4237","11/02/2015 15:12:45","unable to upload images to nebula","I seem to be unable to upload an image to nebula from a URL via either horizon or the nova cli client. The request seems to queue briefly and then reports a status of {{killed}}. 
Eg {code} glance image-create --name ""centos-7.1-vagrant"" --disk-format qcow2 --container-format bare --progress --copy-from http://sqre-kvm-images.s3.amazonaws.com/centos-7.1-x86_64 --is-public False --min-disk 8 --min-ram 1024 {code}",2 +"DM-4238","11/02/2015 15:30:19","Fix integer casting error in numpy version 1.10 in obs subaru","Fix integer type casting in obs_subaru for the latest numpy.",0.5 +"DM-4239","11/02/2015 15:43:47","Identify Qserv areas affected by secondary index","Evaluate Qserv software for the Czars and workers to identify where an interface to the secondary index will be required for efficient operation.",5 +"DM-4245","11/02/2015 16:30:48","Image Viewer memory leak","When reloading the same 500MB RAFT image into an image viewer (see the script below), it was discovered that a single node Firefly server with 3G memory runs out of memory after ~15 reloads. Test case: keep reloading the html file with the following Javascript, creating an image viewer with a 500MB image: function onFireflyLoaded() { var iv2= firefly.makeImageViewer(""plot""); iv2.plot({ ""Title"" :""Example FITS Image'"", ""ColorTable"" :""16"", ""RangeValues"":firefly.serializeRangeValues(""Sigma"",-2,8,""Linear""), ""URL"" :""http://localhost/demo/E000_RAFT_R01.fits""}); } Follow up: The bug was traced to java.awt.image.BufferedImage objects not being evicted from the VIS_SHARED_MEM cache. Further search showed that java.awt.image.BufferedImage (along with java.io.BufferedInputStream) is in src/firefly/java/edu/caltech/ipac/firefly/server/cache/resources/ignore_sizeof.txt, which lists the classes that have to be ignored when calculating the size of the cache. Testing on single node server (VIS_SHARED_MEM cache is not replicated), using [host:port]/fftools/admin/status page: BEFORE (java.awt.image.BufferedImage was commented out in ignore_sizeof.txt) After 14 reloads: Memory - Used : 3.7G - Max : 3.55G - Max Free : 488.0M - Free Active : 488.0M - Total Active : 3.55G Caches: VIS_SHARED_MEM @327294449 Statistics : [ Size:15 Expired:0 Evicted:0 Hits:246 Hit-Ratio:NaN Heap-Size:1120MB ] OUT OF MEMORY on next reload AFTER THE CHANGE (Commented java.awt.image.BufferedImage in ignore_sizeof.txt) After 36 reloads: Memory - Used : 1672.9M - Max : 3.55G - Max Free : 1968.0M - Free Active : 1468.0M - Total Active : 3.6G Caches: VIS_SHARED_MEM @201164543 Statistics : [ Size:3 Expired:0 Evicted:34 Hits:659 Hit-Ratio:NaN Heap-Size:1398MB ] ",2 +"DM-4251","11/04/2015 10:29:07","Please include obs_subaru in CI","{{obs_subaru}} should be included in the CI system.",1 +"DM-4252","11/04/2015 11:50:54","Create GitLFS Technical Note","Create a SQuaRE Technical Note describing the architecture of the GitLFS service implementation.",2 +"DM-4256","11/04/2015 15:06:53","Sphinx support of sqr-001 technical note","Support the distribution of a technical note SQR-001 - Remove oxford comma in author list (documenteer) - Solve issue where title is repeated if the title is included in the restructured text document - Solve issue where name of the HTML document is README.html",1 +"DM-4259","11/04/2015 15:45:42","Create a set of tests (or update the current ones) to facilitate refactoring of dipole measurement","This will create a test (not necessarily a unit test) that will simulate dipoles and measure them so that the measurement can be compared to truth values. This may be simply refactoring the current tests. 
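A minimal sketch of the kind of simulation involved (pure numpy; the function name and parameters are illustrative): {code}
import numpy as np

def makeDipole(shape=(64, 64), sep=3.0, flux=1000.0, sigma=2.0):
    # two offset Gaussians of opposite sign approximate a dipole
    y, x = np.indices(shape)
    cy, cx = shape[0]/2.0, shape[1]/2.0
    pos = np.exp(-((x - cx + sep/2)**2 + (y - cy)**2)/(2*sigma**2))
    neg = np.exp(-((x - cx - sep/2)**2 + (y - cy)**2)/(2*sigma**2))
    return flux*(pos - neg)
{code} Measured positions and fluxes can then be compared against the known inputs.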
This task should also include generating more generalizable utilities needed to create the dipoles and incorporating these and other test data into the stack so that they can be used in other studies.",5 +"DM-4264","11/05/2015 13:06:45","SQuaRE supertask design meeting 2","Hold a teleconference design discussion with members of the SQuaRE team. ",1 +"DM-4266","11/05/2015 13:20:44","Should only read fringe data after checking the filter","The fringe subtraction is not necessarily performed if {{doFringe}} is True. It is performed only if the filter of the raw exposure is listed in the config fringe.filters. Fringe data should not be read unless the filter is indicated: there are likely no fringe data for other filters, and reading them would cause runtime errors. Seems related to changes from RFC-26 and DM-1299. ",8 +"DM-4286","11/06/2015 22:35:34","Processing of DECAM COSMOS field - Part I","This story covers work on the verification plan done in October. The DECAM Cosmos field was selected, however DECAM ISR is not available so the starting point for now is Community Pipeline reduced data. Have put the data through processCcdDecam and makeCoaddTempExp but had a number of failures that we have so far not been able to pin down. A list of user experience issues is being collated and [~frossie] will generate Summer 16 cycle stories to address those that fall within SQuaRE's defined scope of activities. Story closed to fit within month, but work is ongoing. ",20 +"DM-4295","11/09/2015 12:05:49","Run and document multinode integration tests on Openstack+Docker","Boot openstack machines using vagrant, then deploy docker images and finally launch multinode tests. FYI, the lack of DNS on OpenStack Cloud causes problems, but a vagrant plugin seems to solve this.",8 +"DM-4298","11/09/2015 15:13:16","want equiv of m1.xlarge flavor with smaller disk","I'd like to be able to build images with vcpus & ram from the {{m1.xlarge}} flavor that can be run on a {{m1.medium}} with its smaller disk image, this would require a new flavor with 16GiB ram/8vcpus but only 40GiB of disk. Something along the lines of: {{openstack flavor create --ram 16384 --disk 40 --vcpus 8 ...}} Is that possible?",2 +"DM-4301","11/10/2015 12:17:50","Convert banner and menu to react/flux","Add flux data model to capture menu and banner information. Convert banner and menu UI from gwt to react. As part of this task, bring in Fetch API to simplify client/server interactions.",8 +"DM-4303","11/10/2015 17:15:45","re-deploy lsstsw on Jenkins","Pandas was added to the bin/deploy script in lsstsw to support sims development. This has already been merged to master in 4b1d1a0fa. The ticket is to ask that lsstsw be redeployed so the sims team can build branches that use pandas.",3 +"DM-4304","11/10/2015 17:38:12","Add unit testing into gradle build for Firefly's server-side code","Add a test task to Firefly's common build script. This can be used by any sub-project to run unit tests. Added unit testing to Jenkins continuous integration job to ensure new code does not break unit testing.",1 +"DM-4305","11/10/2015 19:34:35","assembleCoadd broken","A recent update to assembleCoadd to bring over changes to do clipped coadds breaks coadd generation. There are two specific problems. 1. There is an infinite recursion because of SafeClipAssembleCoaddTask calling its own constructor in the \_\_init\_\_ method. 2. The overridden assemble method does not adhere to the original assemble call signature, so when the default run method is called by ParseAndRun, it raises an exception. 
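The recursion follows this pattern (heavily simplified): {code}
class AssembleCoaddTask(object):
    def __init__(self, *args, **kwargs):
        pass

class SafeClipAssembleCoaddTask(AssembleCoaddTask):
    def __init__(self, *args, **kwargs):
        # calls itself instead of the parent class, so it recurses forever:
        SafeClipAssembleCoaddTask.__init__(self, *args, **kwargs)
        # should be: AssembleCoaddTask.__init__(self, *args, **kwargs)
{code}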
Additionally I find the flow fairly confusing as the overridden assemble method is called by the default run method which then calls the default assemble method on the parent class.",1 +"DM-4307","11/11/2015 08:24:15","Please add HSC tests to CI","In DM-3663 we (= [~price]) provided an integration test for processing HSC data through the stack with the intention that it should be integrated with the CI system. Having this test available and regularly run would be enormously helpful with the HSC port -- we've already run into problems which it could have helped us avoid (DM-4305).",0.5 +"DM-4310","11/11/2015 10:53:36","Missing Doxygen documentation","As of [2015-11-10 02:53.26|https://lsst-web.ncsa.illinois.edu/doxygen/xlink_master_2015_11_10_02.53.27/] there were 19 ""mainpages in subpackages"" available through Doxygen. In the next build, [2015-11-10 21:16.19|https://lsst-web.ncsa.illinois.edu/doxygen/xlink_master_2015_11_10_21.16.19/], most of them have vanished and we only provide links for {{ndarray}} and {{lsst::skymap}}. As of filing this issue, they were still missing from the [latest build|https://lsst-web.ncsa.illinois.edu/doxygen/x_masterDoxyDoc/]. Please bring them back!",1 +"DM-4311","11/11/2015 11:54:03","Oct. on-going support to Camera team in UIUC","Attend UIUC weekly meeting and give support as needed. ",2 +"DM-4312","11/11/2015 11:55:52","Nov. on-going support to Camera team in UIUC","Attend UIUC weekly meeting and give support as needed. ",2 +"DM-4323","11/11/2015 16:56:17","Replace fitsthumb in obs_subaru (port HSC-1196)","{{fitsthumb}} is now obsolete; all the functionality we need is available in {{afw}}. Further, we want to drop it as a dependency to make the job of integrating {{obs_subaru}} with CI easier.",1 +"DM-4329","11/12/2015 09:18:31","Coadd_utils tests should run and skip if afwdata is missing","Currently, the {{coadd_utils}} tests are completely skipped at the scons layer if afwdata can not be located. This is bad for two reasons: 1. Are there any tests that can be run even if afwdata is missing? 2. When we switch to a proper test harness (e.g. DM-3901) an important metric is the number of tests executed compared with the number of tests skipped. Each test file (or even each test) should determine itself whether it should be skipped based on afwdata availability. This should not be a global switch.",1 +"DM-4344","11/13/2015 08:58:40","pipe_tasks 11.0-14-ga314014+5 fails a test on os/x 10.10.5","Running {code} SCONSFLAGS=""-j 6 opt=3"" eups distrib install pipe_tasks 11.0-14-ga314014+5 {code} on my laptop running os/x 10.10.5 results in a test failure: {code} F. ====================================================================== FAIL: testEdge (__main__.interpolationTestCase) Test that we can interpolate to the edge ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/testInterpImageTask.py"", line 127, in testEdge validateInterp(miInterp, useFallbackValueAtEdge, fallbackValue) File ""tests/testInterpImageTask.py"", line 90, in validateInterp self.assertAlmostEqual(val0, fallbackValue, 6) AssertionError: 2.5515668 != 2.5515660972664267 within 6 places {code} ",0 +"DM-4347","11/13/2015 10:31:46","dax_imgserv 2015_10.0 build error","{{2015_10.0}} has a build error under a current {{lsstsw/bin/deploy}} environment. 
Current speculation is that this is related to the conda version of numpy being upgraded to {{1.10.1}}.",1 +"DM-4349","11/13/2015 12:52:26","Fix publishing script async issue and add additional release notes.","Async command execution causes unpredictable and unreliable results. Switches to synchronous where possible. Also add additional description to the release notes.",2 +"DM-4360","11/16/2015 12:10:02","obs_subaru fails to compile after DM-3200","Due to atypical calls in {{obs_subaru}}'s {{hsc/SConscript}} to run scripts in the {{bin}} directory, {{obs_subaru}} fails to compile after the changes made in DM-3200.",0.5 +"DM-4362","11/16/2015 14:42:56","SuperTask phase 1 implementation","This story represents the implementation of the first part of the SuperTask framework design.",8 +"DM-4366","11/17/2015 09:19:38","Improve overscan correction for DECam raw data","Currently, the default overscan correction from IsrTask is used for processing DECam raw data. Overscan subtraction is done one amplifier at a time. However, a bias jump occurs due to the simultaneous readout of the smaller ancillary CCDs on DECam; some images show a discontinuity in the y direction across one amplifier, as in the example screen shot. This ticket is to improve overscan correction for DECam data so as to mitigate this discontinuity in the ISR processing. Arrangement of CCDs on DECam: http://www.ctio.noao.edu/noao/sites/default/files/DECam/DECamPixelOrientation.png h3. More details: There are 6 backplanes in the readout system, shown by the colors in DECamPixelOrientation.png. In raw data files, the CCD's backplane is noted in the header keyword ""FPA"". Examination of some images suggests that science CCDs on orange and yellow backplanes show a bias jump at 2098 pixels from the y readout. That is the y size of the focus CCDs. h3. Actions: For CCDs on the affected backplanes, divide the array into two pieces at the jump location, and do overscan correction on the upper and lower pieces separately. ",5 +"DM-4367","11/17/2015 09:46:41","Fix bug and add unit tests for PsfShapeletApprox ","We discovered during this Sprint that this plugin was giving us faulty values for all the models except for SingleGaussian. I will fix that bug on this issue. Obviously, a better unit test would have caught this. I am adding a DoubleGaussian unit test, plus a test that the default models provide different results. Also a timing test for all the models, as we do not really have enough information about the performance of the shapelet approximation. ",3 +"DM-4370","11/17/2015 12:03:47","Migrate testdata for DECam from disk to git-lfs","There is a package full of test data for obs_decam. It is an eups package currently, but not a git repository. I would like to migrate that into our hosted git-lfs so the obs_decam package can be built by Jenkins. [~jmatt] I'm hoping you would be willing to handle this for me. If not I can find somebody else. Thanks!",0.5 +"DM-4373","11/17/2015 13:37:36","HSC backport: Add tract conveniences","This is a port of [HSC-715: Add tract conveniences|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-715]. Here is the original description: {panel} In regular HSC survey processing, we'll run with a ""rings"" skymap to cover the entire survey area. meas_mosaic does not currently efficiently or conveniently iterate over tracts. 
For example: {code} mosaic.py /tigress/HSC/HSC --rerun price/cosmos --id field=SSP_UDEEP_COSMOS filter=HSC-R {code} Note the lack of a tract in the --id specifier — we want to iterate over all tracts. This is not currently possible. Instead, if we do not know the tract of interest (which the user should not be required to know), we have to iterate over all the tracts (e.g., tract=0..12345), but the user should not be required to know the number of tracts, and this is slow (and possibly memory-hungry: currently consuming 11GB on tiger3 just for 12 exposures). We need an efficient mechanism to iterate over all tracts by not specifying any tract on the command-line. {panel} As this functionality was added specifically for {{meas_mosaic}}, it was going to be ported as part of DM-2674. Due to a recent desire to use this functionality, this ticket will be ported here.",1 +"DM-4375","11/17/2015 17:01:00","A slimmer testdata_decam","Before this ticket, the files in {{testdata_decam}} are as they are downloaded from the archive. Some are MEF, and the total is a bit big (1.2G). I made a trimmed down version of {{testdata_decam}}, 109M and available here: [https://github.com/hchiang2/testdata_decam.git] I trimmed it down by only saving the primary HDU and one data HDU. However the unit tests in {{getRaw.py}} become less meaningful and I am not sure if we really want to do this, because some complexities of DECam files are about the MEF. {{getRaw.py}} tests Butler retrieval of multiple dataset types, in particular tests if the correct HDU is retrieved. Nonetheless, {{getRaw.py}} can pass (with branch u/hfc/DM-4375 of obs_decam) Note: the old testdata_decam still lives on lsst-dev:/lsst8/testdata_decam/",2 +"DM-4381","11/18/2015 10:33:53","""SHUTOFF"" nebula instances consume core/ram quota","It appears that halting/shutting off instances has no effect on resource quota usage: a SHUTOFF instance still counts against the core/RAM quota until it is deleted. Eg: {code:java} $ openstack server list +--------------------------------------+-----------------------+-------------------+----------------------------------------+ | ID | Name | Status | Networks | +--------------------------------------+-----------------------+-------------------+----------------------------------------+ ... | 1956c6d0-8aec-4f42-a781-8a68fd10179d | el7-jhoblitt | SHUTOFF | LSST-net=172.16.1.171, 141.142.208.150 | ... 
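# quota usage while the SHUTOFF instance still exists: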
$ nova absolute-limits +--------------------+--------+--------+ | Name | Used | Max | +--------------------+--------+--------+ | Cores | 141 | 150 | | FloatingIps | 0 | 10 | | ImageMeta | - | 128 | | Instances | 44 | 100 | | Keypairs | - | 100 | | Personality | - | 5 | | Personality Size | - | 10240 | | RAM | 342016 | 400000 | | SecurityGroupRules | - | 20 | | SecurityGroups | 1 | 10 | | Server Meta | - | 128 | | ServerGroupMembers | - | 10 | | ServerGroups | 0 | 10 | +--------------------+--------+--------+ $ openstack server delete 1956c6d0-8aec-4f42-a781-8a68fd10179d $ nova absolute-limits +--------------------+--------+--------+ | Name | Used | Max | +--------------------+--------+--------+ | Cores | 133 | 150 | | FloatingIps | 0 | 10 | | ImageMeta | - | 128 | | Instances | 43 | 100 | | Keypairs | - | 100 | | Personality | - | 5 | | Personality Size | - | 10240 | | RAM | 325632 | 400000 | | SecurityGroupRules | - | 20 | | SecurityGroups | 1 | 10 | | Server Meta | - | 128 | | ServerGroupMembers | - | 10 | | ServerGroups | 0 | 10 | +--------------------+--------+--------+ {code} ",2 +"DM-4383","11/18/2015 15:33:46","Avoid restarting czar when empty chunk list changes","Currently the czar caches the empty chunk list after it reads it from file. This complicates things when we need to update the list; the integration test, for example, has to restart the czar process after it loads new data to make sure that the czar updates its cached list. It would be nice to have a simpler mechanism for resetting the cached list in the czar without restarting it completely. It could be done via a special query (abusing FLUSH for example) or via sending a signal (problematic if the czar runs remotely). This can be potentially useful even after we replace the empty chunk list file with some other mechanism, as I expect that the cache will stay around even for that.",2 +"DM-4385","11/18/2015 17:35:42","Remove stringToAny from the utils package","I accidentally left stringToAny in Utils.cc when implementing DM-2635, even though I removed it from Utils.h and it is not used anywhere. Remove it and mark RFC-47 Done.",0 +"DM-4386","11/19/2015 10:51:31","Clean up ProcessCcdDecam","ProcessCcdDecam needs some cleanup: * {{run}} method simply delegates to the base class * {{propagateCalibFlags}} is a no-op (deliberately in {{cab69086}}, need to explore if the original problem still exists) * The config overrides (in config/processCcdDecam.py): ** Uses the catalog star selector, which isn't wise given the current heterogeneity of reference catalogs. 
** Sets the background {{undersampleStyle}} to {{REDUCE_INTERP_ORDER}}, which is the default.",1 +"DM-4387","11/19/2015 12:08:06","Skymap fails tests on testFindTractPatchList","When skymap is built and healpy is loaded, {{testFindTractPatchList}} fails with: {quote}====================================================================== FAIL: Test findTractPatchList ---------------------------------------------------------------------- Traceback (most recent call last): File ""/Users/ctslater/lsstsw/build/skymap/tests/SkyMapTestCase.py"", line 245, in testFindTractPatchList self.assertClosestTractPatchList(skyMap, [tractInfo.getCtrCoord()], tractId) File ""/Users/ctslater/lsstsw/build/skymap/tests/SkyMapTestCase.py"", line 284, in assertClosestTractPatchList tractPatchList = skyMap.findClosestTractPatchList(coordList) File ""/Users/ctslater/lsstsw/build/skymap/python/lsst/skymap/baseSkyMap.py"", line 146, in findClosestTractPatchList tractInfo = self.findTract(coord) File ""/Users/ctslater/lsstsw/build/skymap/python/lsst/skymap/healpixSkyMap.py"", line 97, in findTract index = healpy.ang2pix(self._nside, theta, phi, nest=self.config.nest) File ""/Users/ctslater/lsstsw/stack/DarwinX86/healpy/1.8.1+12/lib/python/healpy-1.8.1-py2.7-macosx-10.5-x86_64.egg/healpy/pixelfunc.py"", line 367, in ang2pix check_theta_valid(theta) File ""/Users/ctslater/lsstsw/stack/DarwinX86/healpy/1.8.1+12/lib/python/healpy-1.8.1-py2.7-macosx-10.5-x86_64.egg/healpy/pixelfunc.py"", line 110, in check_theta_valid assert (np.asarray(theta) >= 0).all() & (np.asarray(theta) <= np.pi + 1e-5).all(), ""Theta is defined between 0 and pi"" AssertionError: Theta is defined between 0 and pi{quote} This was missed during regular CI testing since healpy is not normally setup. ",1 +"DM-4391","11/20/2015 10:32:09","Update testCoadds.py to accommodate changes in DM-2915","As of DM-2915, the config setting: {code}self.measurement.plugins['base_PixelFlags'].masksFpAnywhere = ['CLIPPED']{code} is set as a default for {{MeasureMergedCoaddSourcesTask}}. However, this *CLIPPED* mask plane only exists if a given coadd was created using the newly implemented {{SafeClipAssembleCoaddTask}}. If a coadd was built using {{AssembleCoaddTask}}, the *CLIPPED* mask plane is not present, so the above default must be overridden to exclude it when using {{MeasureMergedCoaddSourcesTask}}. This is the case for the mock coadd that is assembled in the unittest code in {{testCoadds.py}}, so the config needs to be set for the test to run properly. Note that the associated tests for {{SafeClipAssembleCoaddTask}} will be added as part of DM-4209.",0.5 +"DM-4395","11/20/2015 12:50:19","Update cmsd configuration for multi-node tests","A particular cmsd configuration parameter prefixes a hardcoded path for QueryResource, which needs to be removed. This seems to appear only during multi-node tests.",1 +"DM-4396","11/20/2015 12:55:17","ctrl_execute test fails to find test binary","There's a test in ctrl_execute that exercises the bin/dagIdInfo.py test program. Since the rewrite_shebang rewrites happen after the tests are executed, the test that looks for the bin/dagIdInfo.py binary fails, since it's not there before the tests execute.",0.5 +"DM-4398","11/20/2015 14:44:48","Fix regexp for gcc48","DM-2622 introduced some regexes which raise exceptions when built with gcc48 (e.g. on centos7). 
gcc48 support for regexes is generally broken, so it's better to replace that with boost regexes.",1 +"DM-4399","11/20/2015 15:15:58","ctrl_execute test fails under El Capitan","The test/testDagIdInfo.py test fails because it runs a script from bin.src, rather than bin. This test needs to be rewritten.",1 +"DM-4402","11/23/2015 12:06:44","Experiment with light-weight SQL databases for secondary index","Evaluate the use of light-weight SQL, such as InnoDB, TokuDB (now Kyoto Cabinet), or RocksDB to create and manage the secondary index.",8 +"DM-4408","11/24/2015 10:02:54","HSC backport: fix memory leak in afw:geom:polygon","This is a backport of a bug fix that got included as part of [HSC-1311|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1311]. It is not related to that issue in particular, so is being ported here as an isolated bug fix. {panel} Original commit message: pprice@tiger-sumire:/tigress/pprice/hsc-1311/afw (tickets/HSC-1311=) $ git sub commit 55ad42d37fd1346f8ebc11e4077366dff4eaa87b Author: Paul Price Date: Wed Oct 21 10:59:56 2015 -0400 imageLib: import polygonLib to prevent memory leak When doing ""exposure.getInfo().getValidPolygon()"", was getting: swig/python detected a memory leak of type 'boost::shared_ptr< lsst::afw::geom::polygon::Polygon > *', no destructor found. This was due to the polygonLib not being imported in imageLib. Using polygonLib in imageLib then requires adding polygon.h to all the swig interface files that use imageLib.i. examples/testSpatialCellLib.i | 1 + python/lsst/afw/cameraGeom/cameraGeomLib.i | 1 + python/lsst/afw/detection/detectionLib.i | 1 + python/lsst/afw/display/displayLib.i | 1 + python/lsst/afw/geom/polygon/Polygon.i | 1 + python/lsst/afw/image/imageLib.i | 2 ++ python/lsst/afw/math/detail/detailLib.i | 1 + python/lsst/afw/math/mathLib.i | 1 + 8 files changed, 9 insertions(+) {panel}",0.5 +"DM-4410","11/24/2015 12:45:53","Port detection task footprint growth changes from HSC","In hsc the default behavior for the detection task is to update footprints with a footprint which has been grown by the psf. This behavior needs to be ported to LSST, as some source records have footprints which are too small. When making this change, the new default needs to be overridden for the calibrateTask, as it needs the original size. The port includes 8e9fb159a3227f848e0db1ecacf7819599f1c03b from meas_algorithms and 8bf0f4a44c924259d9eefbd109aadec7d839e0f2 from pipe_tasks",5 +"DM-4421","11/25/2015 10:35:24","faulty assumption about order dependency in ctrl_event unit tests","A recent change to daf_base uncovered a couple of faulty tests in ctrl_events that incorrectly assumed the order in which data in a PropertySet would be received. We can't assume which order these values will be put into the property set, and therefore into the list retrieved from the Event object.",0.5 +"DM-4431","11/25/2015 16:47:08","setup mechanism to measure the portal performance","Setup the method to measure the query response time: # query sent to the data provider from client (from portal to FF server, to data provider) # result returns from data provider (FF server gets data, prepares the data for portal) # result displayed in the client Setup the method to measure image preparation in Firefly server: * measure the time needed to prepare the image (generate the image in PNG or other suitable format) for client display. 
* Track/identify the reasons if performance is not satisfactory * Make plan for design/code change Setup the method to measure image/plot rendering in Firefly client (portal): * Measure the time needed to render the image after the data is received by the client for display. * Measure the time needed to generate the 2D plot after the data is received by the client * Track/identify the reasons if the performance is not satisfactory * Make plan for design/code change",40 +"DM-4438","11/26/2015 08:32:14","Replace sed with stronger template engine in docker scripts","Dockerfiles are generated using templates and sed; this should be strengthened.",2 +"DM-4440","11/26/2015 08:41:45","Remove QSW_RESULTPATH and XROOTD_RUN_DIR if useless","These parameters may be useless (see DM-4395). If so, they can be removed to simplify the configuration procedure.",2 +"DM-4443","11/30/2015 06:58:16","Please document the --rerun option","DM-3371 adds the {{--rerun}} option to command line tasks. The help for this option reads: {quote} rerun name: sets OUTPUT to ROOT/rerun/OUTPUT; optionally sets ROOT to ROOT/rerun/INPUT {quote} While essentially correct, that's not particularly helpful in understanding what's actually going on here. A motivation and description of this functionality is available in RFC-95: please ensure that, or some variation of it, is included in the stack documentation.",1 +"DM-4451","11/30/2015 11:42:24","F17 Qserv Disconnected Queries","* Design and implement a *basic* system for determining whether a particular query is synchronous or asynchronous. The complete version will come through DM-1490. Note that this work is related to shared scans (e.g., we need to know what scans we have running) * Design SQL API for starting and interacting with async queries. * Modify Qserv to support async queries (starting, getting status, retrieving results) Note, async queries are indirectly related to authentication (users should not see each other's async queries). Deliverable: Qserv that accepts and executes queries asynchronously, and allows users to retrieve results.",20 +"DM-4454","11/30/2015 12:35:47","Fix multiple patch catalog sorting for forcedPhotCcd.py","{{forcedPhotCcd.py}} is currently broken due to the requirement of the {{lsst.afw.table.getChildren()}} function that the *SourceCatalog* is sorted by the parent key (i.e. {{lsst.afw.table.SourceTable.getParentKey()}}). This occurs naturally in the case of *SourceCatalogs* produced by the detection and deblending tasks, but it may not be true when concatenating multiple such catalogs. This is indeed the case for {{forcedPhotCcd.py}} as a given CCD can be overlapped by multiple patches, thus requiring a concatenation of the reference catalogs of all overlapping patches. 
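A minimal sketch of the fix at the concatenation point (the schema and patch catalogs are placeholders): {code}
import lsst.afw.table as afwTable

refCat = afwTable.SourceCatalog(schema)
for patchCat in patchCats:
    refCat.extend(patchCat)
# re-sort by the parent key so that getChildren() works on the concatenation
refCat.sort(afwTable.SourceTable.getParentKey())
{code}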
There are two places in the running of {{forcedPhotCcd.py}} where calls to {{getChildren()}} can cause a failure: one in the {{subset()}} function in {{references.py}}, and the other in the {{run}} function of *SingleFrameMeasurementTask* in {{sfm.py}}.",2 +"DM-4457","11/30/2015 15:55:02","Investigate MemSQL","Take a look at the MemSQL distributed database.",8 +"DM-4503","12/01/2015 22:43:18","FITS Visualizer porting: selecting points of catalog from image view, showing selected points","Able to draw a rectangle on the image, and select the catalog entries overlaid on the image",5 +"DM-4504","12/01/2015 22:44:20","FITS Visualizer porting: Image Select Panel/Dialog","Converting the image select dialog/panel is a very big job and should be broken up into several tickets: Each ticket should reference this ticket as the base. Panel includes the following: * issa, 2mass, wise, dss, sdss tabs * file upload tab, upload widget might have to be written * url tab * blank image tab * target info reusable widget * 3 color support - any panel should show 3 times, for red, green, and blue in 3 color mode * must be able to appear in a panel or dialog * must add or modify a plot * Allow creating versions with more or less than the standard tabs. Example - see existing wise 3 color or finder chart 3 color * A plot might need to be tied to a specific type of image select dialog, we need a way to tie a plotId to a non-standard image select panel.",1 +"DM-4510","12/02/2015 10:55:36","makeDocs uses old style python","{{makeDocs}} is written in python 2.4 style. This ticket is for updating it to python 2.7.",0.5 +"DM-4511","12/02/2015 11:39:08","Improve reStructuredText documentation","Enhance docs by covering - Images as links - Table spans - Abbreviations - :file: semantics, etc.",2 +"DM-4515","12/02/2015 13:48:28","Flag out the glowing edges of DECam CCDs","Pixels near the edges of the DECam CCDs are bigger/brighter and correcting them is not trivial. One way to move forward is to mask them out. DESDM and CP mask 15 pixels on each edge. The cut was later raised to 25 pixels, with the inner 10 pixels flagged as SUSPECT. ",5 +"DM-4523","12/03/2015 03:12:05","Fix startup.py inside Docker container","The qserv tag should be replaced with qserv_latest",0.5 +"DM-4529","12/03/2015 10:17:24","Compilation errors from CLang (Apple LLVM 7.0) in XCode 7 on MacOSX","Compiling on MacOSX Yosemite with XCode 7, a number of files fail compilation. ---- {{core/modules/util/EventThread.h,cc}} fails because {{uint}} is used as a data type. This is non-standard (though some compilers support it), and should be replaced with {{unsigned int}}. ---- {{core/modules/wbase/SendChannel.h,cc}} fails because {{#include }} is missing. ---- {{core/modules/wsched/ChunkState.cc}} fails because {{#include }} is missing. 
---- {{build/qmeta/qmetaLib_wrap.cc}} (generated by SWIG) fails with many errors because the {{typedef unsigned long int uint64_t}} included in {{qmetaLib.i}} conflicts with MacOSX's typedef of it as {{unsigned long long}}.",1 +"DM-4534","12/03/2015 14:56:34","shellcheck linting of lsstsw bash scripts","This issue is to recover a branch from DM-4113 that was not merged due to issues with installing shellcheck under travis.",1 +"DM-4544","12/04/2015 16:27:33","Revisit short and long term plans for butler","Revisit short and long term requirements and needs and capture them through stories.",5 +"DM-4556","12/05/2015 16:54:40","Fix docker workflow","Some issues were discovered while trying to package DM-2699 in Docker (for IN2P3 cluster deployment); they're fixed here. - apt-get update times out: why? - git clone then pull is too weak (if building the first clone fails, pull never occurs) => step merged - eupspkg -er build creates lib/python in /qserv/stack/.../qserv/... and next install can't remove it for an unknown reason => build and install merged.",2 +"DM-4564","12/07/2015 09:55:38","Convert basic table functionalities to JS.","Task includes server-side json conversion, data modeling, and a simple React table for presentation.",20 +"DM-4572","12/07/2015 10:46:58","Table (JS): table options","This task is composed of: - adding table options panel to TablePanel. - providing features: - show/hide units in header - show/hide columns, reset to defaults, etc - page size",5 +"DM-4574","12/07/2015 10:49:37","Table (JS): text view","This task is composed of: - adding text view option to TablePanel",2 +"DM-4576","12/07/2015 11:03:46","XY Scatter Plot (JS) ","Implement basic scatter plot widget using react-highcharts library",8 +"DM-4580","12/07/2015 11:18:36","XY Plot view of a table (JS) - Toolbar","Toolbar, which toggles plot options, selection and filter buttons Extra: - handling zoom from the toolbar rather than using built-in zoom - ability to switch between histogram and scatter plot view",8 +"DM-4582","12/07/2015 11:30:51","XY Plot View of a table (JS) - selection support","Show/change selected/highlighted points. Ideally, this should be done without redrawing the whole plot. ",8 +"DM-4583","12/07/2015 11:38:15","SUIT: search returning images in a directory","- Create a sample search processor, which returns images in a given directory. - It should be using an external python task - Update search form configuration to use this search processor to return image metadata",2 +"DM-4591","12/07/2015 13:05:45","GWT conversion: System notifications","This task is composed of: - adding notification panel to the application - convert server-side code to use messaging for notifications - use messaging on client-side to handle notifications - creating action, action creator, and reducing functions - depends on DM-4578 Integrate websocket messaging into flux ",3 +"DM-4596","12/07/2015 14:15:02","Remove deprecated versions of warpExposure and warpImage","afw.math supports two templated variants of warpExposure and warpImage, one that takes a warping control and the other which does not. The latter have been deprecated for a long time and are no longer used. I think it is time to remove them.",1 +"DM-4603","12/08/2015 10:37:07","sconsUtils tests should depend on shebang target","Some tests rely on code in the {{bin}} directory. 
Whilst these tests have been modified to use {{bin.src}}, the general feeling is that the test code should be able to rely upon the {{shebang}} target having been executed before they are run.",0.5 +"DM-4609","12/09/2015 13:15:06","Partition package should use the standard package layout","The partition package does not build on OS X El Capitan because the package is not laid out in the standard manner and, whilst {{sconsUtils}} is used, most of the default behaviors are over-ridden. This means that fixes implemented for DM-3200 do not migrate over to {{partition}}. I think the best approach would be to reorganize the package so that it does build in the normal way.",1 +"DM-4617","12/09/2015 19:44:22","Send all chunk-queries to primary copy of the chunk","We are planning to distribute chunks / replicas across worker nodes such that each node will have a mix of primary copies for some chunks, and backup copies for some chunks. While doing shared scans, we are going to always rely on the primary chunks (e.g., all queries that need a given chunk should be sent to the same machine so that we read that chunk only once on one node). This story involves tweaking xrootd to ensure we don't send chunk-queries to nodes hosting non-primary copies.",5 +"DM-4631","12/11/2015 08:58:10","Create IDL pipeline workflow for DRP processing - processCcdDecam","For the verification datasets work we need the ability to run a dataset all the way through DRP processing that takes advantage of many cores (orchestration), keeps track of successful and failed dataIDs, creates individual log files, and creates helpful QA plots/metrics/webpages. Nidever will use his PHOTRED IDL workflow and rewrite it for the stack. The first step is processCcdDecam.",5 +"DM-4643","12/14/2015 11:47:13","Add utility function to handle client-side download requests.","Create a utility function to handle client-side download requests. 
It needs to be done in a way that does not mess with history and current page state.",1 +"DM-4648","12/14/2015 16:25:39","Support sqlalchemy use with qserv","When one tries to connect to qserv using sqlalchemy there is an exception generated currently: {noformat} $ python -c 'import sqlalchemy; sqlalchemy.create_engine(""mysql+mysqldb://qsmaster@127.0.0.1:4040/test"").connect()' /u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/default.py:298: SAWarning: Exception attempting to detect unicode returns: InterfaceError(""(_mysql_exceptions.InterfaceError) (-1, 'error totally whack')"",) ""detect unicode returns: %r"" % de) Traceback (most recent call last): File """", line 1, in File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py"", line 2018, in connect return self._connection_cls(self, **kwargs) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py"", line 72, in __init__ if connection is not None else engine.raw_connection() File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py"", line 2104, in raw_connection self.pool.unique_connection, _connection) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py"", line 2078, in _wrap_pool_connect e, dialect, self) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py"", line 1405, in _handle_dbapi_exception_noconnection exc_info File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/util/compat.py"", line 199, in raise_from_cause reraise(type(exception), exception, tb=exc_tb) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py"", line 2074, in _wrap_pool_connect return fn() File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 318, in unique_connection return _ConnectionFairy._checkout(self) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 713, in _checkout fairy = _ConnectionRecord.checkout(pool) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 480, in checkout rec = pool._do_get() File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 1060, in _do_get self._dec_overflow() File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/util/langhelpers.py"", line 60, in __exit__ compat.reraise(exc_type, exc_value, exc_tb) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 1057, in _do_get return self._create_connection() File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 323, in _create_connection return _ConnectionRecord(self) File 
""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/pool.py"", line 454, in __init__ exec_once(self.connection, self) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/event/attr.py"", line 246, in exec_once self(*args, **kw) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/event/attr.py"", line 256, in __call__ fn(*args, **kw) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/util/langhelpers.py"", line 1312, in go return once_fn(*arg, **kw) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/strategies.py"", line 165, in first_connect dialect.initialize(c) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/dialects/mysql/base.py"", line 2626, in initialize default.DefaultDialect.initialize(self, connection) File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/default.py"", line 256, in initialize self._check_unicode_description(connection): File ""/u2/salnikov/STACK/Linux64/sqlalchemy/2015_10.0/lib/python/SQLAlchemy-1.0.8-py2.7-linux-x86_64.egg/sqlalchemy/engine/default.py"", line 343, in _check_unicode_description ]).compile(dialect=self) File ""/u2/salnikov/STACK/Linux64/mysqlpython/1.2.3.lsst1/lib/python/MySQL_python-1.2.3-py2.7-linux-x86_64.egg/MySQLdb/cursors.py"", line 174, in execute self.errorhandler(self, exc, value) File ""/u2/salnikov/STACK/Linux64/mysqlpython/1.2.3.lsst1/lib/python/MySQL_python-1.2.3-py2.7-linux-x86_64.egg/MySQLdb/connections.py"", line 36, in defaulterrorhandler raise errorclass, errorvalue sqlalchemy.exc.InterfaceError: (_mysql_exceptions.InterfaceError) (-1, 'error totally whack') {noformat} The reason for that is that sqlalchemy generate few SELECT queries to figure out unicode support by the engine, and those selects are passed to qserv which cannot parse them. Here is the list of SELECTs which appears in proxy log: {code:sql} SELECT CAST('test plain returns' AS CHAR(60)) AS anon_1 SELECT CAST('test unicode returns' AS CHAR(60)) AS anon_1 SELECT CAST('test collated returns' AS CHAR CHARACTER SET utf8) COLLATE utf8_bin AS anon_1 SELECT 'x' AS some_label {code}",3 +"DM-4652","12/14/2015 17:10:24","CI debugging","diagnosing build failures and refreshing build slaves",1 +"DM-4656","12/15/2015 00:12:29","Port code style guidelines to new DM Developer Guide","Verbatim port of DM Coding style guidelines to Sphinx doc platform from Confluence. 
- https://confluence.lsstcorp.org/display/LDMDG/DM+Coding+Style+Policy - https://confluence.lsstcorp.org/display/LDMDG/Python+Coding+Standard - https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908666 and contents I’m unclear whether these pages should be included: - https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283399 (C++ ‘using’) - https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20284190 (how to use C++ templates) - https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283399 (C++11/14; which would seem to belong in the code style guide) Any temptation to amend and update the style guideline content will be avoided.",5 +"DM-4667","12/15/2015 14:25:03","Improve sphgeom documentation","Per RFC-117, the sphgeom package needs decent overview documentation, linked from the top-level README.md. The doxygen output should also be reviewed.",2 +"DM-4677","12/15/2015 18:33:13","Design Interfaces for Memory Management for Shared Scans","Part of the shared scans work involves memory management - a system that will be used by Qserv that will manage memory allocation / pin chunks in memory. This story involves designing the API between Qserv and the memory management system. ",8 +"DM-4679","12/16/2015 09:56:40","work with database team to exercise all the APIs for data access (F16)","SUI will continue to work with the database team to exercise all the APIs for data access. All known issues should be worked out in the S16 cycle.",40 +"DM-4688","12/16/2015 19:25:12","Changed the implementation of HistogramProcessorTest due to the minor change about the algorithm in the HistogramProcessor","In Histogram, when the data points fall on the bin edges, the following rules are used: # Each bin contains the data points that fall inside the bin and the data point that falls on its left edge. For example, if binSize=2, bin[0] covers the range [0,2]; the data value 0 is in bin[0]. # The data point that falls on the right edge of a bin is not included in that bin's point count. For example, if binSize=2, bin[0] covers the range [0,2]; the data value 0 is in bin[0] but the data value 2 is not. # For the last bin, the data points that fall inside the bin or on its left or right edge are all counted. The last rule is newly introduced. ",2 +"DM-4704","12/17/2015 17:28:46","Qserv integration tests fail on CentOS7 with gcc 4.8.5","The version of gcc that ships with CentOS7, {{gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-4)}}, appears to miscompile the qserv worker source in a way that makes it impossible to actually run queries. Installing {{devtoolset-3-toolchain}} and {{devtoolset-3-perftools}} via {{yum}} to get gcc 4.9 resolves the issue.",2 +"DM-4705","12/17/2015 19:54:09","qdisp/testQDisp fails with mariadb","Fabrice tried to build qserv with mariadb and it caused a failure in one of the unit tests: qdisp/testQDisp with the message: {noformat} pure virtual method called terminate called without an active exception {noformat} Running it with GDB it's obvious that there is a problem with resource lifetime management in qdisp/testQDisp.cc. The problem is that XrdSsiSessionMock is destroyed sooner than other objects that use it. One way to resolve this problem is to instantiate XrdSsiSessionMock earlier than other objects that use it (to reverse the order of destructors), possibly make it a global instance. 
The big mystery here is how mariadb could trigger this interesting behavior and why we did not see this earlier.",1 +"DM-4706","12/17/2015 22:28:39","Rerun and create a repository for CFHT astrometry test.","Understand, re-run, and recreate a clean version of [~boutigny] 's CFHT astrometry test for the astrometry RMS for two sample CFHT observations. This test is on the NCSA machines in /lsst8/boutigny/valid_cfht. Create a repository for this test with an eye toward it becoming integrated in a validation suite for the stack. ",1 +"DM-4708","12/17/2015 22:37:20","Integrate astrometry test into SDSS demo lsst_dm_stack_demo","Incorporate the astrometry test as an optional component in lsst_dm_stack_demo. This is chosen because lsst_dm_stack_demo currently serves as a very loose stack validation, and understanding how to do astrometric repeatability testing in this demo will help explore how it would make sense to put in a fuller CFHT validation test of the DM stack.",0.5 +"DM-4710","12/18/2015 08:22:48","host identification info needs to be part of log message","The EventAppender needs to add host identification (host/process/id) information to the log message it transmits. This was inadvertently left out.",3 +"DM-4711","12/18/2015 09:51:04","Edit testdata_cfht to pass obs_cfht unit tests","This ticket covers the first half of the issues in DM-2917. {{testdata_cfht}} was left unedited while some past changes in {{obs_cfht}} {{MegacamMapper}} required coordinated changes. The goal of this ticket is to simply pass the unit tests currently in {{obs_cfht}}. ",1 +"DM-4716","12/20/2015 12:18:20","Track down reason for slow performance when running many jobs of processCcdDecam on bambam","During the processing of the COSMOS data for the verification dataset work I ran many jobs of processCcdDecam.py on the new linux server, bambam. The performance was very slow, 4x longer than running a single job at a time. Figure out what is going on.",2 +"DM-4722","12/21/2015 13:43:34","File tickets for list of stack deficiencies and suggested upgrades","K-T suggested that I take my list of ""stack deficiencies and suggested improvements"" [https://confluence.lsstcorp.org/display/SQRE/Stack+Deficiencies+and+Suggested+Upgrades] on confluence and (with Tim J.'s help) create tickets for each item (as much as possible) so that the work could be scheduled. ",3 +"DM-4728","12/22/2015 06:22:03","Doxygen package fails to build with flex 2.6","To wit: {code} $ flex --version flex 2.6.0 $ bash newinstall.sh LSST Software Stack Builder [...stuff...] 
eups distrib: Failed to build doxygen-1.8.5.eupspkg: Command: source /Users/jds/Projects/Astronomy/LSST/stack/eups/bin/setups.sh; export EUPS_PATH=/Users/jds/Projects/Astronomy/LSST/stack; (/Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/doxygen-1.8.5/build.sh) >> /Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/doxygen-1.8.5/build.log 2>&1 4>/Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/doxygen-1.8.5/build.msg exited with code 252 $ grep error /Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/doxygen-1.8.5/build.log commentscan.l:1064:55: error: use of undeclared identifier 'yy_current_buffer' commentscan.l:1126:58: error: use of undeclared identifier 'yy_current_buffer' {code} Builds fine using {{flex 2.5.35 Apple(flex-31)}}.",1 +"DM-4729","12/23/2015 10:42:51","HSC backport: Add functions to generate 'unpacked matches' in a Catalog","The qa analysis script under development (see DM-4393) calls out to HSC {{hscPipeBase}}'s [matches.py|https://github.com/HyperSuprime-Cam/hscPipeBase/blob/master/python/hsc/pipe/base/matches.py], which adds functions to generate ""unpacked matches"" in a Catalog (and vice versa). It will be ported into {{lsst.afw.table}}. The port includes the following HSC commits: *Add functions to generate 'unpacked matches' in a Catalog.* https://github.com/HyperSuprime-Cam/hscPipeBase/commit/210fcdc6e1d19219e2d9365adeefd9289b2e1186 *Adding check to prevent more obscure error.* https://github.com/HyperSuprime-Cam/hscPipeBase/commit/344a96de741cd5aafb5e368f7fa59fa248305af5 *Some little error handling helps.* https://github.com/HyperSuprime-Cam/hscPipeBase/commit/61cc053b873d42802581adff8cbbdb52a348879e (from branch: {{stage-ncsa-3}}) *matches: add ArrayI to list of field types that require a size* https://github.com/HyperSuprime-Cam/hscPipeBase/commit/d4ccd11d8afbcdd9cf0b35eba948cca4b5d09ba5 (from branch {{tickets/HSC-1228}}) Please also include a unittest.",3 +"DM-4731","12/23/2015 15:56:58","Add labels to qa analysis plots for better interpretation","The plots output by the qa analysis script (see DM-4393) currently do not display any information regarding the selection/rejection criteria used in making the figures and computing the basic statistics. This includes magnitude and clipping thresholds. This information should be added to each plot such that the figures can be interpreted properly.",2 +"DM-4732","12/23/2015 21:52:31","butler parentSearch incorrectly returns '[]' instead of 'None' in one case.","The contract for `daf.butlerUtils.CameraMapping.parentSearch` states that `None` will be returned if no matches are found. But in one of the options ` if os.path.realpath(pathPrefix) != os.path.realpath(root):` the code says `return []` instead of `return None`. This is then not caught properly by `daf.butlerUtils.mapping.map`, which was checking just for the return value `is not None`. The behavior of parentSearch is clearly an error (it's against the documentation for the function). I would also suggest that `daf.butlerUtils.mapping.map` should check more robustly for `if newPath:` rather than `if newPath is not None:`. Thus we explicitly set up this if/else to return `None` if `not paths` instead of `[]`. ",0 +"DM-4734","12/30/2015 08:10:24","afw fails to build on a machine with many cores","The afw package does not build reliably (if at all) on a linux box at UW (""magneto"", which has 32 cores and 128 GB of RAM). 
The failure is that some unit tests fail with the following error: {code} OpenBLAS: pthread_creat error in blas_thread_init function. Error code:11 {code} For the record, /usr/include/bits/local_lim.h contains this: {code} /* The number of threads per process. */ #define _POSIX_THREAD_THREADS_MAX 64 /* We have no predefined limit on the number of threads. */ #undef PTHREAD_THREADS_MAX {code} It appears that the build system is trying to use too many threads when building afw, which presumably means it is trying to use too many cores. According to [~mjuric] the package responsible for this is {{eupspkg}}, and it tries to use all available cores. A workaround suggested by [~mjuric] is to set the environment variable {{EUPSPKG_NJOBS}} to the max number of cores wanted. However, I suggest we fix our build system so that setting this variable is unnecessary. I suggest we hard-code an upper limit for now, though fancier logic is certainly possible. A related request is to document the environment variables that control our build system. I searched for {{NJOBS}} on confluence and found nothing.",0.5 +"DM-4752","01/06/2016 15:47:54","Build on Mac very slow due to running fc-list","Builds on MacOS 10.11 have been painfully slow (for instance 23 minutes to build {{afw}} instead of the more typical 12 minutes, 30 minutes to rebuild {{ip_diffim}}, 20 minutes to rebuild {{meas_astrom}}) and much of this time is spent running fc-list on 8 cores. I suspect {{matplotlib}} is triggering this process, but I have not verified it. I see this using lsstsw to build a fresh stack and with manual builds. A workaround is to repeatedly kill {{fc-list}}, e.g. with this bash script: {code} while sleep 1; do pkill fc-list; done {code} I checked my fonts with Apple's Font Book and found a few dozen with ""minor errors"" that I deleted, but nothing serious. I'm still seeing the problem.",0.5 +"DM-4753","01/06/2016 15:56:44","Cleanup location of anonymous namespaces","We place anonymous namespaces in two ways: (a) INSIDE the lsst::qserv:: namespace, or (b) BEFORE it. This story involves cleaning this up - moving them all to before lsst::qserv::",1 +"DM-4759","01/07/2016 09:35:18","Port Data set info converter architecture","Defines various image data types, how to get them, groupings, artifacts. I am not quite happy with how we did it in GWT, so the design needs to be improved. Must be less complex.",8 +"DM-4780","01/08/2016 18:08:24","meas_extensions_shapeHSM seems to be broken","I have installed the meas_extensions_shapeHSM package together with galsim and tmv (I documented it at : https://github.com/DarkEnergyScienceCollaboration/ReprocessingTaskForce/wiki/Installing-the-LSST-DM-stack-and-the-related-packages#installing-meas_extensions_shapehsm) and tried to run it on CFHT cluster data. 
My config file is the following: {code:python} import lsst.meas.extensions.shapeHSM config.measurement.plugins.names |= [""ext_shapeHSM_HsmShapeRegauss"", ""ext_shapeHSM_HsmMoments"", ""ext_shapeHSM_HsmPsfMoments""] config.measurement.plugins['ext_shapeHSM_HsmShapeRegauss'].deblendNChild='' config.measurement.slots.shape = ""ext_shapeHSM_HsmMoments"" {code} When I run measCoaddSources.py, I get the following error : {code} Traceback (most recent call last): File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_tasks/2015_10.0-10-g1170fd0/bin/measureCoaddSources.py"", line 3, in MeasureMergedCoaddSourcesTask.parseAndRun() File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py"", line 444, in parseAndRun resultList = taskRunner.run(parsedCmd) File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py"", line 192, in run if self.precall(parsedCmd): File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py"", line 279, in precall task = self.makeTask(parsedCmd=parsedCmd) File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py"", line 363, in makeTask return self.TaskClass(config=self.config, log=self.log, butler=butler) File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_tasks/2015_10.0-10-g1170fd0/python/lsst/pipe/tasks/multiBand.py"", line 530, in __init__ self.makeSubtask(""measurement"", schema=self.schema, algMetadata=self.algMetadata) File ""/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/task.py"", line 255, in makeSubtask subtask = configurableField.apply(name=name, parentTask=self, **keyArgs) File ""/sps/lsst/Library/lsstsw/stack/Linux64/pex_config/2015_10.0-1-gc006da1/python/lsst/pex/config/configurableField.py"", line 77, in apply return self.target(*args, config=self.value, **kw) File ""/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/sfm.py"", line 247, in __init__ self.initializePlugins(schema=self.schema) File ""/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/baseMeasurement.py"", line 298, in initializePlugins self.plugins[name] = PluginClass(config, name, metadata=self.algMetadata, **kwds) File ""/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/wrappers.py"", line 15, in __init__ self.cpp = self.factory(config, name, schema, metadata) File ""/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/wrappers.py"", line 223, in factory return AlgClass(config.makeControl(), name, schema) File ""/sps/lsst/dev/lsstprod/clusters/my_packages/meas_extensions_shapeHSM/python/lsst/meas/extensions/shapeHSM/hsmLib.py"", line 964, in __init__ def __init__(self, *args, **kwargs): raise AttributeError(""No constructor defined - class is abstract"") AttributeError: No constructor defined - class is abstract {code}",1 +"DM-4781","01/09/2016 00:25:30","MariaDB does not work together with mysql-proxy","We have switched to MariaDB but there is one issue that complicates things - the mysql client from mariadb fails to connect to mysql-proxy with an error: {noformat} ERROR 1043 (08S01): Bad handshake {noformat} so Fabrice had to find a workaround for our setup to use the client from the mysqlclient package instead. This workaround is not perfect and it complicates other things. Would be nice to make things work transparently for mariadb. 
",2 +"DM-4782","01/10/2016 21:52:31","JIRA project for the publication board","The LSST Publication Board requests a JIRA project for managing its workload. ",2 +"DM-4785","01/11/2016 16:01:59","Update provenance in baseline schema","Current provenance schema in baseline (cat/sql) is very old and no longer reflect latest thinking. This story involves bringing cat/sql up to data and replacing existing prv_* tables with tables we came up with in the epic.",2 +"DM-4786","01/12/2016 03:28:31","Packge mysqlproxy 0.8.5","See https://mariadb.atlassian.net/browse/MDEV-9389",2 +"DM-4789","01/12/2016 11:31:00","FITS Visualizer porting: Mouse Readout: part 3: Lock by click & 3 color support","add toggle button that make the mouse readout lock to last position click on. It will not longer update on move but by click Include: 3 Color Support",8 +"DM-4793","01/12/2016 17:45:21","Refactor prototype docs into “Developer Guide” and Science Pipelines doc projects","Refactor [lsst_stack_docs|https://github.com/lsst-sqre/lsst_stack_docs] into two doc projects - LSST DM Developer Guide that will be published to {{developer.lsst.io}}, and - LSST Science Pipelines that will be published to {{pipelines.lsst.io}}",3 +"DM-4794","01/12/2016 18:04:15","Write Zoom Options Popup","Write the simple zoom options popup that is show when the user clicks zoom too fast or the zoom level exceeds the maximum size. activate this popup from visualize/ui/ZoomButton.jsx",2 +"DM-4798","01/13/2016 10:03:31","DetectCoaddSourcesTask.scaleVariance gets wrong result","DetectCoaddSourcesTask.scaleVariance is used to adjust the variance plane in the coadd to match the observed variance in the image plane (necessary after warping because we've lost variance into covariance). The current implementation produces the wrong scaling in cases where the image has strongly variable variance (e.g., 10 inputs contributed to half the image, but only 1 input contributed to the other half) because it calculates the variance of the image and the mean of the variance separately so that clipping can affect different pixels. Getting this scaling very wrong can make us dig into the dirt when detecting objects, with drastic implications for the resultant catalog. This is a port of [HSC-1357|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1357] and [HSC-1383|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1383].",1 +"DM-4801","01/13/2016 14:48:50","Update the ground truth values in the lsst_dm_demo to reflect new defaults in deblending","In DM-4410 default configuration options were changed such that footprints are now grown in the detection task, and the deblender is run by default. This breaks the lsst_dm_demo, as now the results of processing are slightly different. The short term solution as part of DM-4410 was to run the demo with the defaults overridden to be what they were prior to DM-4410. In the long term the values used in the compare script should be updated to reflect what would be generated with running processCcd with the stack defaults. ",0.5 +"DM-4806","01/14/2016 13:27:18","Test stack with mariadbclient","Now that we switched Qserv to mariadb, it'd be good to switch the rest of the stack. This story involves trying out if things still work if we switch mysqlclient to mariadbclient.",2 +"DM-4820","01/15/2016 11:33:49","Improvement of raw data handling in DecamMapper","Two minor improvements with better coding practice: - Be more specific copying FITS header keywords. 
Avoid potential problems if unwelcome keywords appear in the header in the future. Suggested in the discussions in DM-4133. - Reuse {{isr.getDefectListFromMask}} for converting defects. A more efficient method that uses the FootprintSet constructor with a Mask and a threshold has just been adopted in DM-4800. Processing is effectively unchanged. ",1 +"DM-4821","01/15/2016 15:58:30","HSC backport: Remove interpolated background before detection to reduce junk sources","This is a port of [HSC-1353|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1353] and [HSC-1360|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1360]. Descriptions from HSC: {panel:title=HSC-1353} We typically get a large number of junk detections around bright objects due to noise fluctuations in the elevated background. We can try to reduce the number of junk detections by adding an additional local background subtraction before object detection. We can then add this back in after detection of footprints and peaks. {panel} {panel:title=HSC-1360} I forgot to set the useApprox=True for the background subtraction that runs before footprint and peak detection. This will then use the Chebyshev instead of the spline. {panel}",1 +"DM-4823","01/15/2016 17:45:01","Add Dropdowns to Vis toolbar","Add the dropdowns to the vis tool bar",2 +"DM-4824","01/15/2016 18:02:30","Clean up div and css layout on FitsDownloadDialog","The FitsDownloadDialog's HTML and CSS are not quite right and need some clean up.",1 +"DM-4825","01/15/2016 18:35:53","makeDiscreteSkymap has a default dataset of 'raw'","The default dataset type for command line tasks is raw. In this case MakeDiscreteSkyMapTask is asking the butler for calexp images. This shouldn't be a problem, but in my case I have calexp images, but no raw images. This causes the task to think there is no data to work on, so it exits.",1 +"DM-4831","01/18/2016 10:45:46","Add bright object masks to pipeline outputs","Given per-patch inputs providing {code} id, B, V, R, ra, dec, radius {code} for each star to be masked, use this information to set: * A bit in the mask plane for each affected pixel * A flag in the source catalogues for each object that has a centroid lying within this mask area This is a port of [HSC-1342|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1342] and [HSC-1381|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1381].",3 +"DM-4833","01/18/2016 11:08:13","Update configuration for Suprime-Cam","The {{obs_subaru}} configuration for Suprime-Cam needs updating to match recent changes in the stack. Port of [HSC-1372|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1372].",1 +"DM-4834","01/18/2016 11:11:55","Preliminaries for LSST vs HSC pipeline comparison through coadd processing","This is the equivalent of DM-3942 but through coadd processing. Relevant HSC tickets include: * [HSC-1371|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1371]",1 +"DM-4835","01/18/2016 11:17:39","Allow slurm to request total CPUs rather than nodes*processors.","On some systems, we are asked to request a total number of tasks, rather than specify a combination of nodes and processors per node. It also makes sense to use the SMP option this way. 
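As a sketch of the idea (the option names are slurm's own; the surrounding config attributes are hypothetical, not the actual batch-submission code):
{code:python}
# Sketch only: request a flat task count instead of nodes x processors per node.
# config.useTotalTasks, config.numNodes and config.procsPerNode are hypothetical names.
if config.useTotalTasks:
    submitArgs = ['--ntasks=%d' % (config.numNodes * config.procsPerNode)]
else:
    submitArgs = ['--nodes=%d' % config.numNodes,
                  '--ntasks-per-node=%d' % config.procsPerNode]
{code}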
This is a port of [HSC-1369|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1369].",2 +"DM-4841","01/18/2016 15:07:01","Use high S/N band as reference for multiband forced photometry","We currently choose the priority band as the reference band for forced photometry as long as the source has a peak in that band, regardless of the S/N. Please change this to pick the highest S/N band as the reference band when the priority band S/N is sufficiently low. This is a port of [HSC-1349|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1349].",1 +"DM-4842","01/18/2016 15:17:15","Don't write HeavyFootprints in forced photometry","There's no need to persist {{HeavyFootprint}}s while performing forced photometry since retrieving them is as simple as loading the _meas catalog. This is a port of [HSC-1345|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1345].",0.5 +"DM-4847","01/18/2016 17:08:07","Add new blendedness metric","[HSC-1316|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1316] shifts the calculation of blendedness from {{meas_deblender}} to {{meas_algorithms}} and defines a new blendedness metric in the process. Please port it.",3 +"DM-4849","01/18/2016 21:26:22","LDM-151 - comments from Jacek","I am reading your https://github.com/lsst/LDM-151/blob/draft/DM_Applications_Design.tex, and I have some minor comments and suggestions. I am going to add comments to this story to capture them. Feel free to apply or ignore :)",1 +"DM-4850","01/19/2016 06:20:23","Factor out duplicate setIsPrimaryFlag from MeasureMergedCoaddSourcesTask and ProcessCoaddTask","{{MeasureMergedCoaddSourcesTask.setIsPrimaryFlag()}} and {{ProcessCoaddTask.setIsPrimaryFlag()}} are effectively the same code. Please split this out into a separate task which both of the above can call. This is a (partial) port of [HSC-1112|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1112] and should include fixes from [HSC-1297|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1297].",2 +"DM-4856","01/19/2016 11:19:54","Add __setitem__ for columns in afw.table","It's confusing to have to use an extra {{[:]}} to set a column in afw.table, and we can make that unnecessary if we override {{\_\_setitem\_\_}} as well as {{\_\_getitem\_\_}}.",2 +"DM-4858","01/19/2016 15:13:06","imagesDiffer doesn't handle overflow for unsigned integers","I'm seeing a test failure in afw's testTestMethods.py, apparently due to my numpy (1.8.2) treating images that differ by -1 as differing by 65535 in both {{numpy.allclose}} and array subtraction (which doesn't promote to a signed type). Does this still cause problems in more recent versions of {{numpy}}? If not, I imagine it's up to me to find a workaround for older versions if I want it fixed? (assigning to [~rowen] for now, just because I know he originally wrote this test and I hope he might know more)",1 +"DM-4862","01/20/2016 10:08:08","Add point selection","Click and highlight a point. This is on when the mouse readout ""Lock by Click"" is on. However, it can be turned on externally by adding toolbar context menu options.",2 +"DM-4867","01/20/2016 15:25:37","scisql build scripts are buggy ","The scisql build script logic for MySQL/MariaDB version checking is broken on all platforms. 
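A more robust approach is to compare parsed version tuples rather than hard-coded version strings; a minimal sketch of the idea (not the actual waf tool code):
{code:python}
# Sketch only: lenient MySQL/MariaDB version comparison.
def parseVersion(versionString):
    # Keep only the leading numeric dotted part, e.g. '10.1.10-MariaDB' -> (10, 1, 10)
    numeric = versionString.split('-')[0]
    return tuple(int(part) for part in numeric.split('.') if part.isdigit())

def versionAtLeast(versionString, minimum):
    return parseVersion(versionString) >= parseVersion(minimum)

assert versionAtLeast('10.1.10-MariaDB', '5.1.0')
{code}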
There are also assumptions about shared library naming that do not hold on OS/X, which means that the deployment scripts are likely broken on all platforms other than Linux.",2 +"DM-4873","01/20/2016 17:02:45","Test the matchOptimisticB astrometric matcher","The matchOptimisticB matcher fails on many visits of the bulge verification dataset. This prompted a deeper investigation of the performance of the matcher. Angelo and David developed a test script and discovered that the matcher works well with offsets of the two source catalogs of up to 80 arcsec, but fails beyond that. This should be robust enough for nearly all datasets that the LSST stack will be used on.",3 +"DM-4876","01/20/2016 17:09:10","Compile list of DM simulation needs for Andy Connolly","Compile a list of DM simulation needs over the next ~6 months to give to Andy Connolly (simulations lead).",3 +"DM-4878","01/20/2016 17:18:51","Propagate flags from individual visit measurements to coadd measurements","It is useful to be able to identify suitable PSF stars from a coadd catalogue. However, the PSF is not determined on the coadd, but from all the inputs. Add a mechanism for propagating flags from the input catalogues to the coadd catalogue indicating stars that were used for measuring the PSF. Make the inclusion fraction threshold configurable so we can tweak it (so we only get stars that were consistently used for the PSF model; the threshold might be set to 0 for ""or"", 1 for ""all"" and something in between for ""some""). Make the task sufficiently general that it can be used for propagating arbitrary flags. This is a port of work carried out on [HSC-1052|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1052] and (part of) [HSC-1293|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1293].",2 +"DM-4882","01/20/2016 20:41:57","base_Variance plugin generates errors in lsst_dm_stack_demo","Since DM-4235 was merged, we see a bunch of messages along the lines of: {code} processCcd.measurement WARNING: Error in base_Variance.measure on record 427969358631797076: The center is outside the Footprint of the source record {code} in the output from {{lsst_dm_stack_demo}}. (See e.g. [here|https://ci.lsst.codes/job/stack-os-matrix/label=centos-6/7482/console#console-section-3]). It's not fatal, but the warnings are disconcerting and could be indicative of a deeper problem.",2 +"DM-4885","01/20/2016 21:44:41","Improve/simplify multi-worker tests","The idea is that our current integration test is using the ""mono"" configuration, which is only useful for the integration test and is not used anywhere else. It would be more useful to have an integration test which is closer to a real setup, e.g. uses more than one worker. It should still be possible to run the whole shebang on a single node though, to keep it usable for regular development tasks.",5 +"DM-4887","01/21/2016 09:48:04","Refactor measurement afterburners into a new plugin system","Some of the operations we currently run as part of measurement (or would like to) share some features that make them a bit different from most plugin algorithms: - They must be run after at least some other high-level plugins, and may be run after all of them. - They do not require access to pixel data, as they derive their outputs entirely from other plugins' catalog outputs. - They may require an aggregation stage of some sort to be run on the regular plugin output before they can be run. 
Some examples include: - Star/Galaxy classification (with training done after measurement and before classification). - Applying aperture corrections (estimating the correction must be done first). - BFD's P, Q, R statistics (requires a prior estimated from deep data). We should move these algorithms to a new plugin system that's run by a new subtask, allowing these plugins to be run entirely separately from {{SingleFrameMeasurementTask}}. This will simplify some of the currently contorted logic required to make S/G classification happen after aperture correction, while making room for hierarchical inference algorithms like BFD and Bayesian S/G classification in the future. (We will not be able to support BFD immediately, as this will also require changes to our parallelization approach, but this will be a step in the right direction). This work should *probably* be delayed until after the HSC merge and [~rowen]'s rewrite of {{ProcessCcdTask}} are complete, but it's conceivable that this refactoring could solve emergent problems there and be worth doing earlier as a result.",8 +"DM-4893","01/22/2016 10:00:42","Write tutorial describing remote IPython + ds9 on lsst-dev","[~mfisherlevine] recently figured out how to set up his system to run a remote IPython kernel on {{lsst-dev}} and interact with it from his laptop, including streaming image display from the remote system to a local instance of {{ds9}}. He will write all this up so that others in the community can easily do the same.",2 +"DM-4894","01/22/2016 10:07:13","Ingest DECam/CBP data into LSST stack","[~mfisherlevine] will ingest the data taken in DM-4892 into the LSST stack. Initial experiments indicate problems with: * Bias subtraction * Flat fielding * Bad pixel masks These may already be remedied by work on {{obs_decam}}; if not, he will file stories and fix them.",3 +"DM-4904","01/22/2016 20:59:03","Buffer overrun in wcslib causes stack corruption","The buffer 'msg' in wcsfix.c is used to report attempts by wcslib to re-format units found in fits files. It is allocated on the stack (in function 'unitfix') using a pre-processor macro defined size of 160 chars (set in wcserr.h). When attempting to run the function 'unitfix' in wcsfix, this buffer can overflow on some fits files (the raw files generated by HSC seem particularly prone to triggering this behavior) and results in the session being terminated on Ubuntu 14.04, as stack protection is turned on by default, i.e. the process crashes with a 'stack smashing detected' error. We have reported the bug to the creators of wcslib. As a temporary workaround, users affected by the bug should increase the default size of 'msg' by increasing WCSERR_MSG_LENGTH defined in wcserr.h. We are providing a small python example that demonstrates the problem. Run it as python test.py /raw/ We are also providing a simple c program to demonstrate the bug. Compile it as cc -fsanitize=address -g -I$WCSLIB_DIR/include/wcslib -o test test.c -L$WCSLIB_DIR/lib -lwcs (on Linux) cc -fsanitize=address -g -L$WCSLIB_DIR/lib -lwcs -I$WCSLIB_DIR/include/wcslib -o test test.c (on Mac OS X)",2 +"DM-4916","01/26/2016 09:18:58","Test obs_decam with processed data","Sometimes DECam-specific bugs only reveal themselves in, or only affect, the processed data. For example, the DM-4859 bug shows up in the {{postISRCCD}} products. If the bugs are DECam-specific, some changes in {{obs_decam}} are likely needed. It would be useful to have a more convenient way to test those changes. 
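For example, once the test data can be processed, a unit test can drive the full task directly (a sketch only; the repo path and dataId below are hypothetical, not the actual test code described next):
{code:python}
# Sketch only: run processCcd on a testdata_decam-style repo from a test.
from lsst.pipe.tasks.processCcd import ProcessCcdTask

args = ['testdata_decam/rawData', '--output', 'test_output',
        '--id', 'visit=229388', 'ccdnum=1', '--doraise']
result = ProcessCcdTask.parseAndRun(args=args, doReturnResults=True)
assert len(result.resultList) == 1
{code}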
In this ticket I modify {{testdata_decam}} so that those data can be processed, and then allow wider options in the {{obs_decam}} unit tests. I add {{testProcessCcd.py}} in {{obs_decam}} that runs {{processCcd.py}} with raw and calibration data in {{testdata_decam}}. Besides a short sanity check, I add a test (testWcsPostIsr) that tests DM-4859. {{testWcsPostIsr}} fails without the DM-4859 fix, and passes with it. ",3 +"DM-4917","01/26/2016 11:37:24","Porting encodeURL of the java FitsDownloadDialog code to javascript ","When downloading an image, the proper name needs to be resolved based on the URL and the information about the image. In Java code, it has the following three methods: {code} encodeUrl makeFileName makeTitleFileName {code} These methods should be ported to javascript. Thus, the javascript version of the FitsDownloadDialog will save the file in the same manner. ",2 +"DM-4921","01/26/2016 14:23:55","Make obs_subaru build with OS X SIP","Because of OS X SIP, {{obs_subaru}} fails to build on os x 10.11. In the {{hsc/SConscript}} file, the library environment variables need to be properly set, and scripts need to be delayed until the shebang rewriting occurs. ",0.5 +"DM-4926","01/27/2016 07:02:07","Centroids fall outside Footprints","In DM-4882, we observed a number of centroids measured while running the {{lsst_dm_stack_demo}} routines fall outside their associated {{Footprints}}. This was seen with both the {{NaiveCentroid}} and the {{SdssCentroid}} centroiders. For the purposes of DM-4882 we quieted the warnings arising from this, but we should investigate why this is happening and, if necessary, weed out small {{Footprints}} entirely.",8 +"DM-4929","01/27/2016 11:32:50","Fix build of MariaDB on OS X El Capitan","The current MariaDB EUPS package does not build on OS X El Capitan because OS X no longer ships with OpenSSL developer files. MariaDB has a build option to use a bundled SSL library in preference to OpenSSL but the logic for automatically switching to this version breaks when the Anaconda OpenSSL libraries are present.",1 +"DM-4931","01/27/2016 11:48:11","Qserv build fails on El Capitan with missing OpenSSL","Qserv does not build on OS X El Capitan due to the absence of OpenSSL include files. Apple now only ships the OpenSSL library (for backwards compatibility reasons). Qserv only uses SSL in two places to calculate digests (MD5 and SHA). This functionality is available in the Apple CommonCrypto library. Qserv digest code needs to be taught how to use CommonCrypto.",2 +"DM-4933","01/27/2016 13:50:01","Create a utility function to do spherical geometry averaging","I would like to calculate a correct average and RMS for a set of RA, Dec positions. Neither [~jbosch] nor [~price] knew of an easy, simple function to do that that existed in the stack. [~price] suggested: {code} mean = sum((afwGeom.Extent3D(coord.toVector()) for coord in coordList), afwGeom.Point3D(0, 0, 0)) mean /= len(coordList) mean = afwCoord.IcrsCoord(mean) {code} That makes sense, but it's a bit unobvious (it's obvious how it works, but would likely never occur to someone that they should do it that way in the stack). Pedantically it's also not the best way to do a mean while preserving precision, but I don't anticipate that to be an issue in practice. Creating a function that did this would provide clarity. I don't know where that function should live. Note: I know how to do this in Astropy. I'm intentionally not using astropy here. 
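To make the intent concrete, the helper could be as small as this (a sketch based on the snippet above; the name {{averageCoord}} and where it lives are hypothetical):
{code:python}
# Sketch only: average a list of afw Coords via their unit vectors.
import lsst.afw.coord as afwCoord
import lsst.afw.geom as afwGeom

def averageCoord(coordList):
    if not coordList:
        raise ValueError('coordList must not be empty')
    total = afwGeom.Extent3D(0.0, 0.0, 0.0)
    for coord in coordList:
        total += afwGeom.Extent3D(coord.toVector())
    mean = total / float(len(coordList))
    # Constructing the coordinate renormalizes the (non-unit) mean vector.
    return afwCoord.IcrsCoord(afwGeom.Point3D(mean.getX(), mean.getY(), mean.getZ()))
{code}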
But part of the astropy dependency discussion is likely ""how much are we otherwise rewriting in the LSST stack"".",1 +"DM-4934","01/27/2016 14:57:01","on-going support to Camera team in visualization at UIUC","Attend the weekly meeting and answer questions as needed",2 +"DM-4936","01/28/2016 09:11:22","Enable validateMatches in ci_hsc","{{python/lsst/ci/hsc/validate.py}} in {{ci_hsc}} [says|https://github.com/lsst/ci_hsc/blob/69c7a62f675b8fb4164065d2c8c1621e296e40ad/python/lsst/ci/hsc/validate.py#L78]: {code:python} def validateMatches(self, dataId): # XXX lsst.meas.astrom.readMatches is gone! return {code} {{readMatches}} (or its successor) should be back in place as of DM-3633. Please enable this test.",2 +"DM-4937","01/28/2016 09:12:05","multiple CVEs relevant to mariadb 10.1.9 and mysql","Multiple CVEs have been released this week for mysql & mariadb. The current eups product for mariadb is bundling 10.1.9, which is affected. Several of the CVEs do not yet provide details, which typically means they are ""really bad"". https://github.com/lsst/mariadb/blob/master/upstream/mariadb-10.1.9.tar.gz https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0505 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0546 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0596 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0597 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0598 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0600 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0606 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0608 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0609 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0616 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-2047",0.5 +"DM-4938","01/28/2016 10:33:52","Update scisql to v0.3.5","In order to update MariaDB to v10.1.10 {{scisql}} needs to also be updated to deal with the hard-coded version checking. For the current version we get this error with the latest MariaDB: {code} ::::: [2016-01-28T16:51:40.539306Z] user_function(self) ::::: [2016-01-28T16:51:40.539334Z] File ""/home/build0/lsstsw/build/scisql/wscript"", line 63, in configure ::::: [2016-01-28T16:51:40.539346Z] ctx.check_mysql() ::::: [2016-01-28T16:51:40.539392Z] File ""/home/build0/lsstsw/build/scisql/.waf-1.6.11-30618c54883417962c38f5d395f83584/waflib/Configure.py"", line 221, in fun ::::: [2016-01-28T16:51:40.539410Z] return f(*k,**kw) ::::: [2016-01-28T16:51:40.539432Z] File ""tools/mysql_waf.py"", line 85, in check_mysql ::::: [2016-01-28T16:51:40.539451Z] (ok, msg) = mysqlversion.check(version) ::::: [2016-01-28T16:51:40.539473Z] File ""tools/mysqlversion.py"", line 74, in check ::::: [2016-01-28T16:51:40.539514Z] if not comparison_op(version_nums, constraint_nums): ::::: [2016-01-28T16:51:40.539547Z] UnboundLocalError: local variable 'constraint_nums' referenced before assignment Failed during rebuild of DM stack. {code}",0.5 +"DM-4939","01/28/2016 11:51:18","IRSA developer mentoring effort","IRSA is contributing to the Firefly package development. We need to put in time to mentor the developers. ",2 +"DM-4940","01/28/2016 11:56:35","IRSA developer mentoring effort","IRSA is contributing to Firefly development. 
We need to mentor the new developers.",2 +"DM-4952","01/29/2016 09:44:52","delegate argument parsing to CmdLineTask instances","Command-line argument parsing of data IDs for {{CmdLineTask}}s is currently defined at the class level, which means that we cannot make data ID definitions dependent on task configuration. That in turn requires custom {{processCcd}} scripts for cameras that start processing at a level other than ""raw"" (SDSS, DECam with community pipeline ISR, possibly CFHT). Instead, we should let {{CmdLineTask}} *instances* setup command-line parsing; after a {{CmdLineTask}} is constructed, it will have access to its final configuration tree, and can better choose how to parse its ID arguments. I've assigned this to Process Middleware for now, since that's where it lives in the codebase, but it may make more sense to give this to [~rowen], [~price], or [~jbosch], just because we've already got enough familiarity with the code in question that we could do it quickly. I'll leave that up to [~swinbank], [~krughoff], and [~mgelman2] to decide.",2 +"DM-4955","01/29/2016 12:44:30","Update pyfits","The final version of {{pyfits}} has just been released. This ticket covers updating to that version. This will be helpful in determining whether the migration to {{astropy.io.fits}} will be straightforward or complicated.",1 +"DM-4957","01/29/2016 13:42:33","Generate JSON output from validate_drp for inclusion in a test harness","Generate JSON output from validate_drp for inclusion in a test harness. Generate a file that summarizes the key metrics calculated by `validate_drp`. Develop naming conventions that will make it easy to plug into the eventual harness being developed as part of DM-2050.",2 +"DM-4959","01/30/2016 19:50:52","ci_hsc fails to execute tasks from within SCons on OSX 10.11/SIP","The {{ci_hsc}} package executes a number of command line tasks directly from SCons based on {{Command}} directives in a {{SConstruct}} file. On an OSX 10.11 system with SIP enabled, there are two distinct problems which prevent the necessary environment being propagated to the tasks: * -The {{scons}} executable starts with a {{#!/usr/bin/env python}}. Running through {{/usr/bin/env}} strips {{DYLD_LIBRARY_PATH}} from the environment.- (duplicates DM-4954) * SCons executes commands using the [{{sh}} shell on posix systems|https://bitbucket.org/scons/scons/src/09e1f0326b7678d1248dab88b28b456fd7d6fb54/src/engine/SCons/Platform/posix.py?at=default&fileviewer=file-view-default#posix.py-105]. By default, that means {{/bin/sh}} on a Mac, which, again, will strip {{DYLD_LIBRARY_PATH}}. Please make it possible to run {{ci_hsc}} on such a system.",0.5 +"DM-4961","01/31/2016 15:37:38","Obs_Subaru camera mapper has wrong deep_assembleCoadd_config","When lsst switched to using SafeClipAssembleCoaddTask, the camera mapper for hsc was not updated to match. This causes ci_hsc to fail when it attempts to verify the config class type for the deep_coadd. The camera mapper should be updated accordingly.",0.5 +"DM-4983","02/01/2016 11:39:26","upstream patches/deps from conda-lsst","Wherever possible, missing dep information and patches from conda-lsst should be upstreamed. 
The patches have already been observed to cause builds to fail due to upstream changes.",3 +"DM-18241","02/01/2016 14:19:59","Create initial M1M3, M2 simulators","Initial simulator support",3 +"DM-17268","02/01/2016 14:34:10","SAL release 4 build and distribute","Release new version",5 +"DM-4991","02/01/2016 14:45:18","Save algorithm metadata in multiband.py","The various {{Tasks}} in {{multiband.py}} do not attach the {{self.algMetadata}} instance attribute to their output tables before writing them out, so we aren't actually saving information like which radii were used for apertures. We should also make sure this feature is maintained in the processCcd.py rewrite.",3 +"DM-4993","02/01/2016 20:37:35","review of dependency on the third party packages","We need to periodically review the status of the third party software packages that Firefly depends on. Make a plan to upgrade if needed. package.json lists out the dependencies Firefly has on the third party software. The attached file was last modified 2016-02-09. package.json_version lists the current versions of the third party packages; major changes are indicated by (M). The attached file was created on 2016-02-29. bq. ""babel"" : ""5.8.34"", 6.5.2 (M) ""history"" : ""1.17.0"", 2.0.0 (M) ""icepick"" : ""0.2.0"", 1.1.0 (M) ""react-highcharts"": ""5.0.6"", 7.0.0 (M) ""react-redux"": ""3.1.2"", 4.4.0 (M) ""react-split-pane"": ""0.1.22"", 2.0.1 (M) ""redux-thunk"": ""0.1.0"", 1.0.3 (M) ""redux-logger"": ""1.0.9"", 2.6.1 (M) ""validator"" : ""4.5.0"", 5.1.0 (M) ""chai"": ""^2.3.0"", 3.5.0 (M) ""esprima-fb"": ""^14001.1.0-dev-harmony-fb"", 15001.1001.0-dev-harmony-fb (M) ""babel-eslint"" : ""^4.1.3"", 5.0.0 (M) ""babel-loader"" : ""^5.3.2"", 6.2.4 (M) ""babel-plugin-react-transform"": ""^1.1.0"", 2.0.0 (M) ""babel-runtime"" : ""^5.8.20"", 6.6.0 (M) ""eslint"" : ""^1.10.3"", 2.2.0 (M) ""eslint-config-airbnb"": ""0.1.0"", 6.0.2 (M) works with eslint 2.2.0 ""eslint-plugin-react"": ""^3.5.1"", 4.1.0 (M) works with eslint 2.2.0 ""extract-text-webpack-plugin"": ""^0.8.0"", 1.0.1 (M) ""html-webpack-plugin"": ""^1.6.1"", 2.9.0 (M) ""karma-sinon-chai"": ""^0.3.0"", 1.2.0 (M) ""redux-devtools"" : ""^2.1.2"", 3.3.1 (M) ""webpack"": ""^1.8.2"" 1.12.14, 2.1.0 beta4 (M) ",2 +"DM-4995","02/02/2016 01:07:10","Extend webserv API to pass security tokens","Extend the [API|https://confluence.lsstcorp.org/display/DM/AP] to pass security tokens.",8 +"DM-4996","02/02/2016 09:34:33","Update validate_drp for El Capitan","validate_drp does not work on El Capitan due to SIP (System Integrity Protection) stripping DYLD_LIBRARY_PATH from shell scripts. The simple fix is to add {code} export DYLD_LIBRARY_PATH=${LSST_LIBRARY_PATH} {code} near the top of the scripts.",1 +"DM-4998","02/02/2016 12:54:03","Fix rotation for isr in obs_subaru","Approximately half of the HSC CCDs are rotated 180 deg with respect to the others. Two others have 90 deg rotations and another two have 270 deg rotations (see [HSC CCD layout|http://www.naoj.org/Observing/Instruments/HSC/CCDPosition_20150804.png]). The raw images for the rotated CCDs thus need to be rotated to match the rotation of their associated calibration frames prior to applying the corrections. This is accomplished by rotating the exposure using the *rotated* context manager function in {{obs_subaru}}'s *isr.py* and the *nQuarter* specification in the policy file for each CCD. 
Currently, *rotated* uses {{afw}}'s *rotateImageBy90* (which apparently rotates in a counter-clockwise direction) to rotate the exposure by 4 - nQuarter turns. This turns out to be the wrong rotation for the odd nQuarter CCDs as shown here: !ccd100_nQuarter3.png|width=200! top left = raw exposure as read in top right = flatfield exposure as read in bottom left = _incorrectly_ rotated raw exposure prior to flatfield correction",2 +"DM-5002","02/02/2016 16:02:48","Make ci_hsc resumable","If ci_hsc fails for any reason (or is cancelled), it must start from the beginning of processing again. This is because of the use of functools.partial to generate dynamic functions. These differ enough in their byte code that scons thinks each build has a new function definition passed to the env.command function. Using lambda would suffer from the same problem. This ticket should change how the function signature is calculated such that scons can be resumed. This work does not prevent this from being used as a ci tool, as the .scons directory can be deleted, which will force the whole SConstruct file to run again.",2 +"DM-5005","02/02/2016 16:11:36","Please trim config overrides in validate_drp","validate_drp will test more of our code if it uses default config parameters wherever possible. To that effect I would like to ask you to eliminate all config overrides that are not essential and document the reasons for the remaining overrides. For DECam there are no overrides that are different than the defaults, so the file can simply be emptied (for now). For CFHT there are many overrides that are different, and an important question is whether the overrides in this package are better for CFHT data than the overrides in obs_cfht; if so, please move them to obs_cfht. As a heads up: the default star selector is changing from ""secondMoment"" to ""objectSize"" in DM-4692 and I hope to allow that in validate_drp, since it works better and is better supported. Sorry for the incorrect component, but validate_drp is not yet a supported component in JIRA (see DM-5004)",0.5 +"DM-5013","02/03/2016 09:34:31","Convert Confluence DM Developer Guide to Sphinx (hack day) ","This is a hack day sprint to convert all remaining content on https://confluence.lsstcorp.org/display/LDMDG to reStructuredText content in the Sphinx project at https://github.com/lsst-sqre/dm_dev_guide and published at http://developer.lsst.io. The top priority for this sprint is to port all content into reST and have it tracked by Git. h2. Sprint ground rules # Before the sprint, clone {{https://github.com/lsst-sqre/dm_dev_guide.git}} and {{pip install -r requirements.txt}} in a Python 2.7 environment so that you can locally build the docs ({{make html}}). # Claim a page from the list below by putting your name on it. Put a checkmark on the page when you’ve merged it to the ticket branch (see below). # See http://developer.lsst.io/en/latest/docs/rst_styleguide.html for guidance on writing our style of reStructuredText. Pay attention to the [heading hierarchy|http://developer.lsst.io/en/latest/docs/rst_styleguide.html#sections] and [labelling for internal links|http://developer.lsst.io/en/latest/docs/rst_styleguide.html#internal-links-to-labels]. # If you use Pandoc to do an initial content conversion, you still need to go through the content line-by-line to standardize the reStructuredText. I personally recommend copy-and-pasting-and-formatting instead of using Pandoc. 
# Your Git commit messages should include the URL of the original content from Confluence. # Merge your work onto the {{tickets/DM-5013}} ticket branch. Rebase your personal work branch before merging. JSick is responsible for merging this ticket branch to {{master}}. # Put a note at the top of the confluence page with the new URL; root is {{http://developer.lsst.io/en/latest/}}. h2. Planned Developer Guide Table of Contents We’re improving the organization of DM’s Developer Guide; there isn’t a 1:1 mapping of Confluence pages to developer.lsst.io pages. Below is a proposed section organization and page structure. These sections can still be refactored based on discussion during the hack day. h3. Getting Started — /getting-started/ * ✅ *Onboarding Checklist* (Confluence: [Getting Started in DM|https://confluence.lsstcorp.org/display/LDMDG/Getting+Started+in+DM]). I’d like this to eventually be a quick checklist of things a new developer should do. It should be both a list of accounts the dev needs to have created, and a list of important developer guide pages to read next. The NCSA-specific material should be spun out. [[~jsick]] * *Communication Tools* (new + DM Confluence [Communication and Links|https://confluence.lsstcorp.org/display/DM/Communication+and+Links]). I see this as being an overview of what methods DM uses to communicate, and what method should be chosen for any circumstance. * *Finding Code on GitHub* (new). This should point out all of the GitHub organizations that a developer might come across (DM and LSST-wide), and point out important repositories within each organization. Replaces the confluence page [LSST Code Repositories|https://confluence.lsstcorp.org/display/LDMDG/LSST+Code+Repositories] h3. Processes — /processes/ * ✅ *Team Culture and Conduct Standards* (confluence) * ✅ *DM Development Workflow with Git, GitHub, JIRA and Jenkins* (new & Confluence: [git development guidelines for LSST|https://confluence.lsstcorp.org/display/LDMDG/git+development+guidelines+for+LSST] + [Git Commit Best Practices|https://confluence.lsstcorp.org/display/LDMDG/Git+Commit+Best+Practices] + [DM Branching Policy|https://confluence.lsstcorp.org/display/LDMDG/DM+Branching+Policy]) * ✅ *Discussion and Decision Making Process* (new & [confluence|https://confluence.lsstcorp.org/display/LDMDG/Discussion+and+Decision+Making+Process]) * ✅ *DM Wiki Use* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/DM+Wiki+Use]) [[~swinbank]] * ✅ *Policy on Updating Doxygen* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Policy+on+Updating+Doxygen]); needs to be addressed with TCT. Inter-link with the developer workflow page. [[~jsick]] (we’re just re-pointing the Confluence page to the workflow document) * ✅ *Transferring Code Between Packages* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Transferring+Code+Between+Packages]) [[~swinbank]] * -*Policy on Changing a Baseline Requirement*- ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Policy+on+Changing+a+Baseline+Requirement]) * ✅ *Project Planning for Software Development* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Project+Planning+for+Software+Development]) [[~swinbank]] * ✅ *JIRA Agile Usage* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/JIRA+Agile+Usage]) [[~swinbank]] * -*Technical/Control Account Manager Guide*- ([confluence|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=21397653]) (Do not port; see discussion below.) 
* *Licensing* (new) Need a centralized page to discuss license and copyright policies; include boilerplate statements. h3. Coding Guides — /coding/ * ✅ *Introduction* and note on stringency language (confluence: [DM Coding Style Policy|https://confluence.lsstcorp.org/display/LDMDG/DM+Coding+Style+Policy]) * ✅ *DM Python Style Guide* (confluence: [Python Coding Standard|https://confluence.lsstcorp.org/display/LDMDG/Python+Coding+Standard]) * ✅ *DM C++ Style Guide* (confluence pages: [C++ Coding Standard|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908666] + [C++ General Recommendations|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908756] + [C++ Naming Conventions|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908685] + [C++ Files|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908674] + [C++ Statements|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908706] + [C++ Layout and Comments|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908737] + [Policy on use of C++11/14|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283399] + [On Using ‘Using’|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283856]) * Coding Style Linters (new; draft from confluence [C++ Coding Standards Compliance|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283861] and [Python Coding Standards Compliance|https://confluence.lsstcorp.org/display/LDMDG/Python+Coding+Standards+Compliance]) * ✅ *Using C++ Templates* ([confluence|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20284190]); this page needs to be severely edited or re-written, however. * ✅ *Profiling* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Profiling]). Also add a section ‘Using Valgrind with Python' (new) [[~jsick]] * ✅ *Boost Usage* ([TRAC|https://dev.lsstcorp.org/trac/wiki/TCT/BoostUsageProposal]) [[~tjenness]] * ✅ *Software Unit Test Policy* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Software+Unit+Test+Policy]) [[~swinbank]] * ✅ *Unit Test Coverage Analysis* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Coverage+Analysis]) [[~swinbank]] * ✅ *Unit Testing Private C++ Functions* ([trac|https://dev.lsstcorp.org/trac/wiki/UnitTestingPrivateFunctions]) [[~swinbank]] h3. Writing Docs — /docs/ * *Introduction* (new): Overview of DM’s documentation needs; links resources on technical writing. * *English Style Guide* (new): Supplement the [LSST Style Manual|https://www.lsstcorp.org/docushare/dsweb/Get/Document-13016/LSSTStyleManual.pdf] and provide English style guidance specific to DM. Capitalization of different heading levels; use of Chicago Manual of Style; a ‘this, not that’ table of spelling and word choices. * ✅ *ReStructuredText Style Guide* (new) * ✅ *Documenting Stack Packages* (new) * ✅ *Documenting Python Code* (new) * ✅ *Documenting C++ Code* (confluence, adapted from [Documentation Standards|https://confluence.lsstcorp.org/display/LDMDG/Documentation+Standards]); needs improvement * ✅ *Writing Technotes* (new; port README from [lsst-technote-bootstrap|https://github.com/lsst-sqre/lsst-technote-bootstrap/blob/master/README.rst]) h3. Developer Tools — /tools/ * ✅ *Git Setup and Best Practices* (new) * ✅ *Using Git Large File Storage (LFS) for Data Repositories* (new) * ✅ *JIRA Work Management Recipes* (new) * ✅ *Emacs Configuration* ([Confluence|https://confluence.lsstcorp.org/display/LDMDG/Emacs+Support+for+LSST+Development]). 
See DM-5045 for issue with Emacs config repo - [~jsick] * ✅ *Vim Configuration* ([Confluence|https://confluence.lsstcorp.org/display/LDMDG/Config+for+VIM]) - [~jsick] h3. Developer Services — /services/ * ✅ *NCSA Nebula OpenStack Guide* (Confluence: [User Guide|https://confluence.lsstcorp.org/display/LDMDG/NCSA+Nebula+OpenStack+User+Guide] + [Starting an Instance|https://confluence.lsstcorp.org/display/LDMDG/Introduction+to+Starting+a+Nebula+Instance] + [Using Snapshots|https://confluence.lsstcorp.org/display/LDMDG/Start+an+Instance+using+a+base+snapshot+with+the+LSST+Stack]). Add the [Vagrant instructions from SQR-002|http://sqr-002.lsst.io] too? [[~jsick]] * ✅ *Using lsst-dev* (Confluence: [notes Getting Started|https://confluence.lsstcorp.org/display/LDMDG/Getting+Started+in+DM] + [Developer Tools at NCSA|https://confluence.lsstcorp.org/display/LDMDG/Developer+Tools+at+NCSA]) * ✅ *Using the Bulk Transfer Server at NCSA* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Using+the+Bulk+Transfer+Server+at+NCSA]) [[~jsick]] h3. Build, Test, Release — /build-ci/ * *Eups for LSST Developers* (new) [[~swinbank]] * ✅ *The LSST Software Build Tool* → ‘Using lsstsw and lsst-build' ([confluence|https://confluence.lsstcorp.org/display/LDMDG/The+LSST+Software+Build+Tool]); lsstsw and lsst-build documentation. [[~swinbank]] * *Using DM’s Jenkins for Continuous Integration* (new) [~frossie] * ✅ *Adding a New Package to the Build* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Adding+a+new+package+to+the+build]) [[~swinbank]] * ✅ *Distributing Third-Party Packages with Eups* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Distributing+third-party+packages+with+EUPS]) [[~swinbank]] * ✅ *Triggering a Buildbot Build* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Triggering+a+Buildbot+Build]) [~frossie] * ✅ *Buildbot Errors FAQ* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Buildbot+FAQ+on+Errors]) [~frossie] * *Buildbot Configuration* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Buildbot+Configuration+and+Setup]) [~frossie] * *Creating a new DM Stack Release* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Creating+a+new+DM+Stack+Release]); though this page or a modern equivalent should probably belong with the software docs? [~frossie] _A lot of work should go into this section._ Have something about Scons? Or maybe that belongs in the doc of each relevant software product. h2. Leftover Confluence pages h3. The following pages should be moved to a separate Confluence space run by NCSA: * [NCSA Nebula OpenStack Issues|https://confluence.lsstcorp.org/display/LDMDG/NCSA+Nebula+OpenStack+Issues] * [DM System Announcements|https://confluence.lsstcorp.org/display/LDMDG/DM+System+Announcements] * [NCSA Development Servers|https://confluence.lsstcorp.org/display/LDMDG/DM+Development+Servers] h3. The following pages are either not relevant, generally misplaced, or need to be updated/recalibrated: * [Git Crash Course|https://confluence.lsstcorp.org/display/LDMDG/Git+Crash+Course] * [Basic Git Operations|https://confluence.lsstcorp.org/display/LDMDG/Basic+Git+Operations] * [Handling Git Push Problems|https://confluence.lsstcorp.org/display/LDMDG/Handling+Git+Push+Problems] * [LSST Code Repositories|https://confluence.lsstcorp.org/display/LDMDG/LSST+Code+Repositories]; see the proposed “Finding Code on GitHub” page for a replacement. 
* [Standards and Policies|https://confluence.lsstcorp.org/display/LDMDG/Standards+and+Policies]: this is a good TOC for the Confluence docs, but no longer needed for the new docs. * [Documentation Guidelines|https://confluence.lsstcorp.org/display/LDMDG/Documentation+Guidelines]. Some of this could be re-purposed into an intro to the ‘Writing Documentation’ section; some of this should go in a ‘Processes' page. * [DM Acknowledgements of Use|https://confluence.lsstcorp.org/display/LDMDG/DM+Acknowledgements+of+Use]: this probably belongs in documentation for the software projects that actually used this work.",5 +"DM-5014","02/03/2016 11:35:35","Set doRenorm default to False in AssembleCcdTask","Change the default value of {{AssembleCcdConfig.doRenorm}} to {{False}} for the reasons given in RFC-157 and to implement that RFC.",1 +"DM-5018","02/03/2016 12:17:28","Modernize version check scripts in matplotlib and numpy packages","The version check scripts in the stub {{matplotlib}} and {{numpy}} eups packages use old Python conventions. They should be updated to work with 2.7+.",0.5 +"DM-5022","02/03/2016 13:55:47","Modernize python code in Qserv scons package","The {{site_scons}} Python code is not using current project standards. For example, print is not a function, exceptions are not caught {{as e}}, {{map}} is called without storing the result and {{map/filter/lambda}} are used where list comprehensions would be clearer. Most of these fixes are trivial with {{futurize}}.",0.5 +"DM-5026","02/03/2016 14:45:53","Fix dependencies for eups-packaged sqlalchemy","Eups-packaged sqlalchemy lists {{mysqlclient}} as a required dependency, which is not really right. sqlalchemy does not directly depend on mysql client stuff; instead it determines at run time which python modules it needs to load, depending on what exact driver the client code is requesting (and {{mysqlclient}} does not actually provide a python module, so this dependency does not even do anything useful). The dependency on a specific external package should therefore be declared on the client side and not in sqlalchemy; {{mysqlclient}} should be removed from sqlalchemy.table.",1 +"DM-5030","02/03/2016 16:56:05","Tests fail on Qserv on OS X El Capitan because of SIP","OS X El Capitan introduced System Integrity Protection which leads to dangerous environment variables being stripped when executing trusted binaries. Since {{scons}} is launched using {{/usr/bin/env}} the tests that run do not get to see {{DYLD_LIBRARY_PATH}}. This causes them to fail. The same fix that was applied to {{sconsUtils}} (copying the path information from {{LSST_LIBRARY_PATH}}) needs to be applied to the test execution code used by Qserv's private {{site_scons}} utility code.",2 +"DM-5050","02/04/2016 13:10:42","SingleFrameVariancePlugin takes variance of entire image","{{SingleFrameVariancePlugin}} takes the median variance of the entire image, rather than within an aperture around the source of interest. A {{Footprint}} is constructed with the aperture, but it is unused. 
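Presumably the statistics should be restricted to the aperture, along the lines of this sketch (using the footprint's bounding box; the variable names are illustrative, not the plugin's actual code):
{code:python}
# Sketch only: median variance within the aperture footprint's bounding box
# instead of over the whole variance plane. 'exposure' and 'footprint' stand
# in for the measurement plugin's inputs.
import lsst.afw.image as afwImage
import lsst.afw.math as afwMath

bbox = footprint.getBBox()
variance = exposure.getMaskedImage().getVariance()
subVariance = variance.Factory(variance, bbox, afwImage.PARENT)
medVar = afwMath.makeStatistics(subVariance, afwMath.MEDIAN).getValue()
{code}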
This means that this plugin takes an excessive amount of run time (255/400 sec in a recent run of processCcd on HSC {{visit=1248 ccd=49}} with DM-4692).",1 +"DM-5052","02/04/2016 17:52:19","Design replacement for A.net index files","We need a simple way to hold index files that will be easy to use and simple to set up.",2 +"DM-5084","02/05/2016 13:32:30","PropagateVisitFlags doesn't work with other pipeline components","{{PropagateVisitFlags}}, which was recently ported over from HSC on DM-4878, doesn't work due to some inconsistencies with earlier packages/tasks: - The default fields to transfer have new names: ""calib_psfCandidate"" and ""calib_psfUsed"" - We're not currently transferring these fields from icSrc to src, so those fields aren't present in src anyway. I propose we just match against icSrc for now, since it has all of the fields we're concerned with. - It makes a call to {{afw.table.ExposureCatalog.subsetContaining(Point, Wcs, bool)}}, which apparently exists in C++ but not in Python; I'll look into seeing which HSC commits may have been missed in that port.",1 +"DM-5085","02/05/2016 13:35:55","Please add a package that includes obs_decam, obs_cfht and all validation_data datasets","It would be very helpful to have an lsstsw package that added all supported obs_* packages (certainly including obs_cfht and obs_decam, and I hope obs_subaru) and all validation_data_* packages. This could be something other than lsst_apps, but I'm not sure what to call it.",0.5 +"DM-5086","02/05/2016 13:58:51","Enable aperture correction on coadd processing","Aperture corrections are now coadded, so we can enable aperture corrections in measurements done on coadds.",0.5 +"DM-5094","02/08/2016 14:15:59","HSC backport: Set BAD mask for dead amps instead of SAT","This is a port of [HSC-1095|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1095] and a leftover commit from [HSC-1231|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1231]: [isr: don't perform overscan subtraction on bad amps|https://github.com/HyperSuprime-Cam/obs_subaru/commit/d6fe6cf5c4ecadebd5a344d163e1f1e60137c7e4] (noted in DM-3942).",3 +"DM-5095","02/08/2016 14:51:18","Redirect confluence based pages to new developer guide.","Delete and apply redirects to all migrated pages in old Confluence-based Developer Guide",0.5 +"DM-5100","02/09/2016 10:02:18","Docs for ltd-keeper","Create a documentation project within ltd-keeper that documents the RESTful API while it is being developed. This will allow the [SQR-006|http://sqr-006.lsst.io] technote to have a place to link to for detailed information.",1 +"DM-5107","02/09/2016 17:12:57","Fix effective coordinates for defects in obs_subaru","The defects as defined in {{obs_subaru}} (in the {{hsc/defects/20NN-NN-NN/defects.dat}} files) are defined in a coordinate system where pixel (0, 0) is the lower left pixel. However, the LSST stack does not use this interpretation, preferring to maintain the coordinate system tied to the electronics. As such, the defect positions are being misinterpreted for the rotated CCDs in HSC (see [HSC CCD layout|http://www.naoj.org/Observing/Instruments/HSC/CCDPosition_20150804.png]). This needs to be remedied.",2 +"DM-5120","02/10/2016 12:46:51","Add intelligence to `validate_drp` so it does ""A Reasonable Thing"" on an unknown output repo","validate_drp current takes as input both a repository and a configuration file. The configuration file contains information to construct the list of dataIds to analyze. 
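A sketch of enumerating those dataIds directly from the repository instead (assuming the Gen2 butler's {{queryMetadata}}; the dataset type, key names, and path are illustrative and camera-dependent):
{code}
from lsst.daf.persistence import Butler

butler = Butler('/path/to/repo')
# Ask the registry for every (visit, ccd) combination it knows about.
dataIds = [dict(visit=visit, ccd=ccd)
           for visit, ccd in butler.queryMetadata('raw', ['visit', 'ccd'])]
{code}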
However, as the sketch above suggests, these dataIds could be extracted from the repo itself in cases where the desire is to analyze the entire repo. 1. Add a function that loads the set of dataIds from the repo. (/) 2. Select reasonable defaults for the additional parameters specified in the config file. (/) 3. Design how to handle multiple filters. (/)",5 +"DM-5121","02/10/2016 12:50:27","Add multiple-filter capabilities to `validate_drp`","Design and refactor `validate_drp` to produce results for multiple filters. 1. Decide on the syntax for the YAML configuration file that denotes the multiple filters. E.g., which visit goes with what filter? (/) 2. Organize the running of multiple filters in `validate.run` to sequentially generate statistics and plots for each filter. (/) 3. Add a filter designation to the default output prefix. (/) Note: matching objects *across* filters is out-of-scope for this ticket.",1 +"DM-5122","02/10/2016 13:30:01","LOAD DATA LOCAL does not work with mariadb","After we un-messed mariadb-mysqlclient we now see errors when trying to run integration tests: {noformat} File ""/usr/local/home/salnikov/dm-yyy/lib/python/lsst/qserv/wmgr/client.py"", line 683, in _request raise ServerError(exc.response.status_code, exc.response.text) ServerError: Server returned error: 500 (body: ""{""exception"": ""OperationalError"", ""message"": ""(_mysql_exceptions.OperationalError) (1148, 'The used command is not allowed with this MariaDB version') [SQL: 'LOAD DATA LOCAL INFILE %(file)s INTO TABLE qservTest_case01_mysql.LeapSeconds FIELDS TERMINATED BY %(delimiter)s ENCLOSED BY %(enclose)s ESCAPED BY %(escape)s LINES TERMINATED BY %(terminate)s'] [parameters: {'terminate': u'\\n', 'delimiter': u'\\t', 'enclose': u'', 'file': '/home/salnikov/qserv-run/2016_02/tmp/tmpWeAj6u/tabledata.dat', 'escape': u'\\\\'}]""}"") 2016-02-10 14:17:40,836 - lsst.qserv.admin.commons - CRITICAL - Error code returned by command : qserv-data-loader.py -v --config=/usr/local/home/salnikov/testdata-repo/datasets/case01/data/common.cfg --host=127.0.0.1 --port=5012 --secret=/home/salnikov/qserv-run/2016_02/etc/wmgr.secret --delete-tables --chunks-dir=/home/salnikov/qserv-run/2016_02/tmp/qserv_data_loader/LeapSeconds --no-css --skip-partition --one-table qservTest_case01_mysql LeapSeconds /usr/local/home/salnikov/testdata-repo/datasets/case01/data/LeapSeconds.schema /usr/local/home/salnikov/testdata-repo/datasets/case01/data/LeapSeconds.tsv.gz {noformat} It looks like the mariadb client disables the LOCAL option for data loading by default, and it needs to be explicitly enabled. ",1 +"DM-5125","02/10/2016 15:05:54","qserv fails when it mixes mariadb and mariadbclient directories","When I tried to run qserv-configure after installing qserv 2016_01-7-gbd0349f I got this error: {noformat} 2016-02-10 16:03:16,915 - lsst.qserv.admin.commons - CRITICAL - Error code returned by command : /home/salnikov/qserv-run/2016_02/tmp/configure/mysql.sh {noformat} Running script configure/mysql.sh: {noformat} $ sh -x /home/salnikov/qserv-run/2016_02/tmp/configure/mysql.sh + echo '-- Installing mysql database files.' -- Installing mysql database files. 
+ /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/scripts/mysql_install_db --basedir=/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11 --defaults-file=/home/salnikov/qserv-run/2016_02/etc/my.cnf --user=salnikov + echo 'ERROR : mysql_install_db failed, exiting' ERROR : mysql_install_db failed, exiting + exit 1 {noformat} and {noformat} $ /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/scripts/mysql_install_db --basedir=/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11 --defaults-file=/home/salnikov/qserv-run/2016_02/etc/my.cnf --user=salnikov FATAL ERROR: Could not find mysqld The following directories were searched: /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/libexec /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/sbin /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/bin {noformat} So it looks for mysqld in mariadbclient, the same directory as the mysql_install_db script; mysql_install_db should actually be run from mariadb. ",1 +"DM-5129","02/11/2016 10:26:34","Create InputField for generic use cases.","Create a composable, validating InputField so it can be used outside of the form/submit use-case.",2 +"DM-5130","02/11/2016 11:50:35","B-F correction breaks non-HSC custom ISR, ci_hsc","The addition of brighter-fatter correction on DM-4837 breaks obs_cfht's custom ISR, since it slightly changes an internal ISR API by adding an argument that isn't expected by the obs_cfht version. It also breaks ci_hsc, since the B-F kernel file isn't included in the calibrations packaged there. ",0.5 +"DM-5132","02/11/2016 19:27:30","obs_subaru install with eups distrib fails","Thus: {code} $ eups distrib install -t w_2016_06 obs_subaru ... [ 52/52 ] obs_subaru 5.0.0.1-60-ge4efae7+2 ... ***** error: from /Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/obs_subaru-5.0.0.1-60-ge4efae7+2/build.log: ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/hscRepository.py"", line 91, in setUp self.repoPath = createDataRepository(""lsst.obs.hsc.HscMapper"", rawPath) File ""tests/hscRepository.py"", line 63, in createDataRepository check_call([ingest_cmd, repoPath] + glob(os.path.join(inputPath, ""*.fits.gz""))) File ""/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py"", line 540, in check_call raise CalledProcessError(retcode, cmd) CalledProcessError: Command '['/Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/obs_subaru-5.0.0.1-60-ge4efae7+2/obs_subaru-5.0.0.1-60-ge4efae7+2/bin/hscIngestImages.py', '/var/folders/jp/lqz3n0m17nqft7bwtw3b8n380000gp/T/tmptUSKuf', '/Users/jds/Projects/Astronomy/LSST/stack/DarwinX86/testdata_subaru/master-gf9ba9abdbe/hsc/raw/HSCA90402512.fits.gz']' returned non-zero exit status 1 ---------------------------------------------------------------------- Ran 8 tests in 9.928s FAILED (errors=7) The following tests failed: /Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/obs_subaru-5.0.0.1-60-ge4efae7+2/obs_subaru-5.0.0.1-60-ge4efae7+2/tests/.tests/hscRepository.py.failed 1 tests failed scons: *** [checkTestStatus] Error 1 scons: building terminated because of errors. + exit -4 {code} Please fix it.",1 +"DM-5135","02/12/2016 10:01:49","Make ci_hsc buildable by Jenkins","1. Make sure {{ci_hsc}} is buildable by {{lsstsw}} / {{lsst_build}} (/) 2. Add {{ci_hsc}} to lsstsw/etc/repos.yaml so that one can request that Jenkins builds it. (/) 3. Verify that the test in {{ci_hsc}} fails on known broken tags and passes on known successful tags. 
(/) No dependencies will be added to {{lsst_sims}} or {{lsst_distrib}}. This is meant to provide the ability to request that Jenkins do these builds and to fail if something has broken them. This will later be expanded to new packages {{ci_cfht}}, {{ci_decam}}, and {{ci_sim}}. The key goal is to make sure one hasn't broken obs_ packages in their butler interface or in their processCcd. Additional Notes and Thoughts from HipChat Discussion: [~ktl] Sounds good to me; we might have an ""lsst_ci"" top-level metapackage depending on all of them which is what Jenkins would run regularly. If the goal is to test obs_ packages, then my first instinct would be to put that in the obs_ package. Longer term goal to test the stack with different precursor datasets. If this is testing obs_ packages on a slower cadence than the built-in tests, it's OK for that to be a separate package. [~jbosch] Eventually, I think we need to run a CI dataset for each camera, then run some camera-generic tests on each of those, then run some camera-specific tests on each of those. So we don't want to go too far down a road in which all tests are camera-specific, but maybe we don't have a choice until we have some better unifying framework for them. I've certainly been putting some checks in {{ci_hsc}} that would be valid for all other cameras, if we had a CI package for them that went through to coadd processing.",2 +"DM-5139","02/12/2016 11:57:08","Update apr and apr_util","{{apr}} and {{apr-util}} are outdated and lagging behind the versions on RHEL6. They should be updated as agreed in RFC-76.",0.5 +"DM-5140","02/12/2016 12:39:57","Move luaxmlrpc to lsst-dm/legacy-","We no longer need luaxmlrpc because we run czar inside proxy. We should move it to lsst-dm/legacy-, and remove mention of it from the readme.",0.5 +"DM-5147","02/12/2016 21:12:52","Provide usable repos in {{validation_data_*}} packages.","Re-interpreted ticket: 1. Provide already-initialized repositories in the `validation_data_cfht`, `validation_data_decam`, and `validation_data_hsc` packages alongside the raw data. The goal is to allow both easy quick-start analyses as well as comparisons of output steps from processCcd.py and friends at each step of the processing. (/) 2. Add (Cfht,Decam,HSC).list files to provide for easy processing of the available dataIds in the example data. (/) 3. Update README files to explain available data. (/) [Original request:] In validation_drp when I run examples/runXTest.sh I find that any data I had saved in CFHT or DECam is lost, even if I have carefully renamed it. This is very dangerous and I lost a lot of work due to it. At a bare minimum please do NOT touch any directories not named ""input"" or ""output"". Lower priority requests that I hope you will consider: - Have the input repo be entirely contained in the validation_data_X packages, ready to use ""as is"". That would simplify the use of those packages by other code. It would also simplify validate_drp, and it would just leave the output repo to generate (which already has a link back to the input repo). - Have runXTest.sh accept a single argument: the path to the output. (The path to the input is not necessary if you implement the first suggestion).",3 +"DM-5156","02/15/2016 12:43:37","Please document MemoryTestCase","{{lsst.utils.tests.MemoryTestCase}} is used extensively throughout our test suite, but it is lacking in documentation and it's not clear under what circumstances its use is required or encouraged. 
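For context, the boilerplate that most stack test files carry at the moment looks roughly like this (a sketch; the exact conventions vary from package to package):
{code}
import unittest
import lsst.utils.tests as utilsTests

class ExampleTestCase(unittest.TestCase):
    def testSomething(self):
        pass

def suite():
    # MemoryTestCase is appended so that memory/leak checks run after
    # the real tests in this file.
    utilsTests.init()
    suites = [unittest.makeSuite(ExampleTestCase),
              unittest.makeSuite(utilsTests.MemoryTestCase)]
    return unittest.TestSuite(suites)

if __name__ == '__main__':
    utilsTests.run(suite(), False)
{code}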
Please add appropriate documentation to the [Software Unit Test Policy|http://developer.lsst.io/en/latest/coding/unit_test_policy.html]. See also [this thread on clo|https://community.lsst.org/t/what-is-the-policy-for-using-lsst-utils-tests-memorytestcase].",0.5 +"DM-5159","02/15/2016 16:59:16","Please use angle and Coord where possible","validate_drp would be easier to follow and safer if it took advantage of lsst.afw.geom.Angle and lsst.afw.coord.IcrsCoord. For instance {{averageRaDecFromCat}} could return an IcrsCoord, and positionRms could use coord1.angularSeparation(coord2) and handle wraparound and other effects simply and safely.",1 +"DM-5161","02/16/2016 14:53:45","HSC backport: Support a full background model when detecting cosmic rays","This is a port of the following two standalone HSC commits: [Support a full background model when detecting cosmic rays|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/3bae328e0fff4b2a02267e97cc1e53b5bbe431cb] {code} If there are strong gradients (e.g. M31's nucleus) we need to do more than treat the background as a constant. However, this requires making a copy of the data so the background-is-a-constant model is preserved as a special case {code} [Fixed cosmicRay() in RepairTask for the case background is subtracted.|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/2cdb7c606270d84c7a05baf9949ff5724463fa6b] {code} When the background is subtracted with finer binsize, new exposure will be created and cosmic rays will be detected on that exposure. But the image of that exposure was not properly returned back. {code} ",1 +"DM-5164","02/17/2016 12:25:31","Tests in daf_persistence should skip properly","Some of the tests in {{daf_persistence}} have a couple of problems that cause difficulties with modern test frameworks: # unittest is not being used at all in some cases # Skipping is done with a print and a {{sys.exit}} They need to be modernized.",2 +"DM-5169","02/17/2016 16:16:22","Fastly API interactions for LSST the Docs","Using Fastly’s API, have ltd-keeper set up new builds and editions: - Add {{Surrogate-Key}} to headers of objects uploaded to S3 (happens on ltd-mason side) - Configure Varnish to serve specific bucket directories as specific domains (DM-4951 has added Route 53 interactions to ltd-keeper) - Purge content when editions switch or content is deleted. DM-5167 is covering non-API driven work to configure fastly. See https://www.hashicorp.com/blog/serving-static-sites-with-fastly.html for a write-up on serving static sites via fastly. See also http://sqr-006.lsst.io for an overview of LSST the Docs.",8 +"DM-5179","02/18/2016 15:02:48","miniconda2 eups package fails to install on OS X","The {{miniconda2}} eups package attempts to install the relevant conda packages by downloading a list from the {{lsstsw}} repository. This fails for the same reason that {{lsstsw}} fails in DM-5178, in that the list of packages is not OS-specific. This means that {{newinstall.sh}} does not work any more on OS X.",0.5 +"DM-5182","02/18/2016 17:15:33","Hook up help system","We need a help system like the one we have in GWT.",8 +"DM-5187","02/18/2016 19:02:07","Set Qserv master in env variable for Docker containers","This would allow use of a pre-configured container on all clusters; indeed, the only parameter which currently changes in a cluster install is the master fqdn. 
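For instance, the startup code could resolve the master from the container environment rather than from a baked-in config value (a sketch; the variable name {{QSERV_MASTER}} is hypothetical):
{code}
import os

# Fall back to localhost so a mono-node container still works unchanged.
master_fqdn = os.environ.get('QSERV_MASTER', '127.0.0.1')
{code}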
See http://xrootd.org/doc/dev42/Syntax_config.htm and {code} if defined ?~EXPORTPATH set exportpath = $EXPORTPATH else set exportpath = /tmp fi all.export $exportpath {code}",3 +"DM-5196","02/19/2016 11:49:49","swift API availability?","The downtime announcement email for {{Nebula unavailable Feb 9-10}} mentioned a ""roadmap"" for swift. I have checked and post maintenance, there is not a swift endpoint available in the catalog. Is there a time line for availability?",1 +"DM-5197","02/19/2016 12:34:00","Test and robustify shapelet PSF approximations","The CModel code ported from HSC only works as well as the ShapeletPsfApproximation algorithm that runs before it, but we've switched on the LSST side to a more flexible algorithm that isn't as nearly as battle-tested as what's been running on the HSC side, and there are some concerning indications from [~pgee]'s work that it can be catastrophically slow on some reasonable PSFs. On this issue, I'll run it on some real HSC data and try to improve it, even if that means reducing the flexibility back to what was on the HSC side in some ways.",8 +"DM-5198","02/19/2016 13:14:41","FITS Visualizer porting: Statistics - part 2 - drawing overlay & 3 color support","drawing overlay 3 Color Support",8 +"DM-5200","02/19/2016 15:57:33","instance I/O errors","The kernel dmesg for Instance {{bbfd7458-6dd6-4412-a8ba-8d417c3df56b}} has started reporting thousands of block I/O errors and these are starting to trickle up as a filesystem I/O errors. I suspect this is likely a hypervisor I/O issue. {code} [687301.556430] Buffer I/O error on device dm-3, logical block 3768490 [687301.556433] Buffer I/O error on device dm-3, logical block 3768491 [687301.556436] Buffer I/O error on device dm-3, logical block 3768492 {code} {code} $ openstack server show bbfd7458-6dd6-4412-a8ba-8d417c3df56b +--------------------------------------+-----------------------------------------------------------------------+ | Field | Value | +--------------------------------------+-----------------------------------------------------------------------+ | OS-DCF:diskConfig | MANUAL | | OS-EXT-AZ:availability_zone | nova | | OS-EXT-STS:power_state | 1 | | OS-EXT-STS:task_state | None | | OS-EXT-STS:vm_state | active | | OS-SRV-USG:launched_at | 2016-02-11T23:36:25.000000 | | OS-SRV-USG:terminated_at | None | | accessIPv4 | | | accessIPv6 | | | addresses | LSST-net=172.16.1.115, 141.142.209.121 | | config_drive | | | created | 2016-02-11T23:36:12Z | | flavor | m1.xlarge (5) | | hostId | f7fbf308022d02f52e1111c91cf578d852784d290d0e0ddb0d69635c | | id | bbfd7458-6dd6-4412-a8ba-8d417c3df56b | | image | centos-7-docker-20151116230205 (59a2a478-11ab-41c5-affc-29706d38d65a) | | key_name | vagrant-generated-comshorc | | name | el7-docker-jhoblitt | | os-extended-volumes:volumes_attached | [] | | progress | 0 | | project_id | 8c1ba1e0b84d486fbe7a665c30030113 | | properties | | | security_groups | [{'name': 'default'}, {'name': 'remote SSH'}] | | status | ACTIVE | | updated | 2016-02-11T23:36:25Z | | user_id | 83bf259d1f0c4f458e03f9002f9b4008 | +--------------------------------------+-----------------------------------------------------------------------+ {code}",1 +"DM-5202","02/19/2016 17:54:03","Remove LOGF macros from log package","We have removed all uses of LOGF macros from qserv and as far as I know no other clients use those macros. 
It's time to clean up the log package itself and remove those macros.",1 +"DM-5204","02/19/2016 18:13:36","Remove remaining LOGF macros from qserv","There are still a few cases of LOGF macros in qserv; we have to replace them all.",1 +"DM-5206","02/20/2016 11:21:58","Please do not write garbage to the FITS EQUINOX","The equinox is not relevant when dealing with ICRS coordinates. When {{afw}} manipulates {{Wcs}} objects, it simply doesn't bother initializing the {{equinox}} field of its {{_wcsInfo}} struct when dealing with an ICRS system. When {{afw}} persists the {{Wcs}} to FITS, it blindly writes whatever happens to be in that uninitialized field to the FITS header. Thus, we end up with something like: {code} EQUINOX = 9.87654321E+107 / Equinox of coordinates {code} This should be no problem, since, per the [FITS standard|http://fits.gsfc.nasa.gov/standard30/fits_standard30aa.pdf] (page 30), the {{EQUINOX}} is ""not applicable"" if the {{RADESYS}} is {{ICRS}}. The reader should thus ignore this value. However, [SAOimage DS9|http://ds9.si.edu] version 7.4.1 (the latest release at time of writing) does _not_ ignore the {{EQUINOX}}. Rather, it refuses to handle the WCS for the image. Note that version 7.3 of DS9 does not seem to have the same issue. While this does seem to be a bug in DS9, it's easy enough to work around by simply not writing {{EQUINOX}}.",0.5 +"DM-5247","02/26/2016 12:01:14","Segfault in shapeHSM centroid extractor","[~boutigny] reports a segfault in {{meas_extensions_shapeHSM}}. He provides the following backtrace: {code} Program received signal SIGSEGV, Segmentation fault. 0x00007fffe7043156 in lsst::afw::table::BaseRecord::getElement (this=0x21c8d60, key=...) at include/lsst/afw/table/BaseRecord.h:61 61 typename Field::Element * getElement(Key const & key) { (gdb) bt #0 0x00007fffe7043156 in lsst::afw::table::BaseRecord::getElement (this=0x21c8d60, key=...) at include/lsst/afw/table/BaseRecord.h:61 #1 0x00007fffdc8775f2 in set (value=, key=..., this=0x21c8d60) at /home/boutigny/CFHT/lsstsw/stack/Linux64/afw/11.0-8-g38426eb/include/lsst/afw/table/BaseRecord.h:137 #2 setValue (value=true, i=0, record=..., this=0x1da2500) at include/lsst/meas/base/FlagHandler.h:73 #3 lsst::meas::base::SafeCentroidExtractor::operator() (this=, record=..., flags=...) at src/InputUtilities.cc:134 #4 0x00007fffd03655c6 in lsst::meas::extensions::shapeHSM::HsmPsfMomentsAlgorithm::measure (this=0x1da2410, source=..., exposure=...) at src/HsmMoments.cc:115 #5 0x00007fffd06708d5 in _wrap_HsmPsfMomentsAlgorithm_measure (args=0x7fffccc67b90) at python/lsst/meas/extensions/shapeHSM/hsmLib_wrap.cc:14337 #6 0x00007ffff7aee37f in ext_do_call (nk=-859407472, na=, flags=, pp_stack=0x7fffffff7d18, func=0x7fffd0c21878) at Python/ceval.c:4345 #7 PyEval_EvalFrameEx (f=, throwflag=) at Python/ceval.c:2720 #8 0x00007ffff7aefdbe in PyEval_EvalCodeEx (co=0x7fffd0a9ceb0, globals=, locals=, args=, argcount=3, kws=0x7fffccd43b08, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:3267 {code} See the discussion at DM-4780.",2 +"DM-5251","02/26/2016 12:35:45","lsstsw breakage with spaces in paths","There are still some issues relating to using {{lsstsw}} to build the stack when spaces are in the path to the {{$LSSTSW}} location. 
This is a fine thing to sort out on Rodeo Day...",1 +"DM-5252","02/26/2016 13:21:52","Base ""bright star"" cut on S/N instead of magnitudes","The astrometry histogram generated by validateDrp.py conflates astrometric and photometric calibration because it uses magnitude for brightness, and this relies on the accuracy of the photometric calibration. [~ctslater] suggests (and I agree) that brightness should be based on signal to noise ratio, thus making the astrometry histogram independent of photometric calibration. ",2 +"DM-5264","02/26/2016 15:41:01","Modernize python in lsst_build","The python in {{lsst_build}} uses old-style print and exception handling. These should be updated to the current standard.",1 +"DM-5265","02/26/2016 15:47:11","Turn on bias-jump fix for all CCDs ","The overscan fix to handle bias jump in an amplifier done in DM-4366 introduced a new config parameter {{overscanBiasJumpBKP}}, and the fix is applied for CCDs on the backplanes specified in {{overscanBiasJumpBKP}}. Currently, the default is to only fix CCDs on backplanes next to the focus chips. But [~mfisherlevine] also sees the bias jump features in other CCDs. It would make more sense to turn it on for all CCDs by default. ",0.5 +"DM-5275","02/29/2016 17:17:05","make floating point exception handling cross-platform (or remove it)","jointcal currently has a couple of trapfpe() functions that wrap feenableexcept, which doesn't exist on OSX. Were these an important part of error handling in meas_simastrom, or can I just remove them?",2 +"DM-5277","03/01/2016 08:30:39","replace buildbot with jenkins job(s)","Removing buildbot and replacing it with jenkins would provide a number of benefits: * one less dashboard for developers to know about / interact with * one less system for SQRE to maintain * lessening the cost of refactoring the CI driver scripts, as synchronized updates to two CI system configurations would no longer be necessary It should also be easy to go one step further and try to eliminate the need for developers to manually log into the {{lsstsw}} account on {{lsst-dev}} to publish eups distrib packages. ",3 +"DM-5279","03/01/2016 10:09:57","arrays not properly transmitted","Sending a property set with an array as one of the entries only passes the last element of the array.",1 +"DM-5281","03/01/2016 11:20:16","Port HSC skymap, shapelet changesets to LSST","We identified in DM-5162 some changesets that still need to be ported from HSC to LSST: * skymap: ** f83f71718eac5307d575d3113ee3757a63a16de2: Set vertex list in ExplicitTractInfo. 
* shapelet: ** bb928df3fc2fafe5183e0d075da19994f0af4fc7: Let the value to normalize to be specified in [Multi]ShapeletFunction ",0.5 +"DM-5283","03/01/2016 11:24:33","Port HSC daf_butlerUtils changesets to LSST","We identified in DM-5162 several changesets that still need to be ported from HSC to LSST: * daee24edba01b01a0412df7f9b4cf70be5b10860: CameraMapper: allow a default filter name to be provided * e3fee95d6a1850dd2309d3ebe4e3ef3ffe38eef0: CameraMapper: normalize path names, and remove leading double slash * 476b6ddccd9d0cceb2b89ca34bee7d0fdcd70694: preserve timestamps in cameraMapper.backup() * b2491ef60e5e23afa7d9f0297f257e694aa1af35: Only attempt to update Wcs if it's available * 9f62bcce588fa9abc8e1e44ff2f0275e5230f629: Registry: hold registry cache for a single thread only (HSC-1035) * 412f03b95b7a5e82003ab33a61bd43adbf465188: Registry: use a pool of registries to avoid having too many open files",2 +"DM-5284","03/01/2016 11:27:32","Port HSC meas_extensions_simpleShape package to LSST","HSC uses a package, meas_extensions_simpleShape, which needs to be ported to LSST. The package is used for basic shape measurements for determining focus, and also serves as a simple guide for writing measurement plugins.",3 +"DM-5286","03/01/2016 11:32:39","Port HSC meas_deblender changesets to LSST","We identified in DM-5162 a few changesets that still need to be ported from HSC to LSST: * a8cf6c22df14494d6dcf2d7354c695cba9506301: Clarify tiny footprint limit * 624790aa63a38fb7a328ebc21abfd1b10503aa26: config: change default strayFluxRule * db7d705de93b43a5f32f771c716b1c5c7368d124: consolidate failed peak logic and downgrade warning We also identified a few differences that should be resolved: * clipStrayFluxFraction defaults to 0.01 for LSST, 0.001 for HSC * Stray file, src/Baseline.cc.orig, on LSST side ",1 +"DM-5287","03/01/2016 11:34:32","Port HSC ip_isr changesets to LSST","We identified in DM-5162 some changesets that still need to be ported from HSC to LSST: * f1cee734998f1faf86c02af42ea599b077847eeb: IsrTask: allow fallback to a different filter when loading calibrations * 89cd629bb8e1a72a545176311b1ef659358d95af: saturationDetection: apply to overscan as well as image ",1 +"DM-5288","03/01/2016 11:38:22","Port HSC pipe_tasks changesets to LSST","We identified in DM-5162 some changesets that still need to be ported from HSC to LSST: * 31ab5f02f7722650ad0a0eb4e2f7f8b3e0073366, 0c9a4a06bfb34ed26c72109131ef9f4a8c8f237a: multiBand: save background-subtracted coadd as deepCoadd_calexp * e99e140feafe28e6f034143e8ee2ae58e9a9358d: Rejig interface for DetectCoaddSourcesTask to provide non-dataRef-centric API * 829ee0cdd605ed027af1fada4446b715d9a5180d: multiband: activate sky objects * MeasureMergedCoaddSources.doMatchSources defaults to False * ProcessImageConfig.doWriteHeavyFootprintsInSources defaults to False ? 
* 56666e8feba6893ac95fd4982d3e0daf6baf2d34: WcsSelectImagesTask: catch imagePoly is None. We also noticed some differences: * CalibrateConfig.setDefaults doesn't call parent * CalibrateTask.run isn't returning apCorrMap * reserveFraction=-1 instead of 0.2 ",3 +"DM-5289","03/01/2016 11:45:57","Port HSC obs_subaru changesets to LSST","We identified in DM-5162 several changesets that still need to be ported from HSC to LSST: * 8948917de4579e032c7bbb2c8316014446e3841b: config: add astrometry filter map for HSC narrow-band filters * 69d35a890234e37c1142ddbeff43e62fe36e6c45: Set radius for flux.naive, adjust comment for flux.sinc * 8ea54d10f5ae56f8b6f244bca76d5796ae015216: config: disable sigma clipping in coadd assembly * 8d2f4a02d0d668fc82e853b633444d8e0fe80010: config: reduce coadd subregionSize * e36bd1b4410812ca314f50c01f899d92acc0e7a5: config: set pixelScale for jacobian correction * Remove processCcdOnsiteDb.py, processStack.py * Rename stacker.py to coaddDriver.py or whatever Nate chooses in DM-3369 * 49e9f5dcf16490f6be6438b89b17911a0cd35fb2: Fixed obvious errors caused by introducing VignetteConfig * daa43eeac46e8708de6f37feeb5d5d16a3caca11: HscMapper: set unit exposure time for dark * 77ff7c89d56bed94bca4f320f839dbd20fbab641: Set BAD mask for dead amps instead of SAT We also noticed the following need to be done: * Forced photometry configuration (CCDs and Coadds) * Sanitize config of OBS_SUBARU_DIR (use getPackageDir) * multiband config files need ""root"" --> ""config"" * No astrometry in measureCoaddSources * Narrow bands missing from priority list * detectCoaddSources removed from multiband * Move filterMap from config/processCcd.py into own file",5 +"DM-5290","03/01/2016 13:37:42","Add z-index for dialogs components","Some of the outside modules that we have brought in have a z-index. We need to make sure that our dialog components stay on top of them.",2 +"DM-5291","03/01/2016 13:44:37","Docker-ready configuration system for LTD Keeper","To deploy LTD Keeper in a Docker container (DM-5194), it’s best practice to handle all configurations through environment variables. In DM-4950, LTD Keeper was configured through files for test and dev/deployment profiles. What we should do is continue to allow hard-coded configurations for test and dev environments, but have a third fully fledged configuration environment that’s driven entirely by environment variables. The environment variables should allow fine grained configuration (for example, to turn off calls to individual external services for testing). This should also resolve how the Google Container Engine/Kubernetes auth flow works with environment variables, config files, and profiles.",1 +"DM-5298","03/01/2016 16:54:28","Document simple simulator","Document the simple simulator produced in DM-4899. This will also involve some refactoring and adding unit tests to make it usable by others in the group.",8 +"DM-5302","03/02/2016 11:03:46","manage jenkins core + plugin versions","There have been a couple of issues that have arisen when deploying test instances vs updating an existing instance due to slight differences between plugin versions. This would be avoided by putting all plugin versions under change control. 
Including: * The versions of all jenkins components need to be explicitly specified * The stored job {{config.xml}}'s should be updated to reflect plugin version changes * The hipchat notification configuration should be updated to fix breakage caused by the production core/plugin update earlier this week ",5 +"DM-5312","03/02/2016 15:12:36","Additional vertical partitioning tests","Test potential improvements in many-vertical-shards test (20,50) run-times with query optimizer settings.",5 +"DM-5319","03/03/2016 07:45:00","Fix mariadb CI","The patch package is missing in the docker container used by Travis CI.",1 +"DM-5320","03/03/2016 08:04:34","Make Bright Object Masks compatible with all cameras","Currently all of the logic that goes into using bright object masks falls into obs_subaru and pipe_tasks. This ticket should move parts (such as the bright object mask class) out of obs_subaru, into a camera-agnostic location. The work should also duplicate relevant camera configurations and parameter overrides in the other camera packages. Bright object masks were originally introduced in DM-4831",2 +"DM-5321","03/03/2016 09:22:08","MeasureApCorrTask should use slot_CalibFlux as default ref flux","{{MeasureApCorrTask}} uses ""base_CircularApertureFlux_17_0"" as its default reference flux. It should use ""slot_CalibFlux"" instead. Also check obs_sdss packages for overrides that can be removed; obs_sdss certainly has one in {{config/processCcdTask.py}}.",0.5 +"DM-5324","03/03/2016 13:13:09","Convert GWT code to pure JavaScript (X16, part2, basic)","Continue to work on the GWT code conversion to JavaScript.",100 +"DM-5336","03/04/2016 08:25:53","Fix minor issues in docker procedure","- params.sh was missing at configuration - startup.py wasn't correctly importing the ""utils"" module - remove unused parameters in params.sh",1 +"DM-5348","03/04/2016 12:07:31","Get rid of ProcessCcdSdssTask and ProcessCcdDecamTask","Update {{ProcessCcdTask}} so that it can be used with different dataset types as appropriate for the ISR task. This will allow us to get rid of obs-specific variants {{ProcessCcdSdssTask}} and {{ProcessCcdDecamTask}}. The plan is to change {{ProcessCcdTask}} as follows: - set {{doMakeDataRefList=False}} in the call to {{add_id_argument}} - get the dataset type from the ISR task (default to ""raw"") and set it in the data container - make the dataRef list by calling {{makeDataRefList}} on the data container. Question for DECam folks: do you want two executable scripts for DECam (one that processes data from the community pipeline and one that performs ISR)? Or do you prefer one executable (in which case you switch between performing ISR and reading the output of the community pipeline by retargeting the ISR task)? 
If you prefer one binary, then which should be the default: perform ISR or read the output of the community pipeline?",2 +"DM-5349","03/04/2016 12:15:18","Revise LSE-140 to account for recent changes to calibration instrumentation","Produce a revision of LSE-140, the DM - to - auxiliary instrumentation ICD, taking into account recent changes to the calibration instrumentation.",5 +"DM-5350","03/04/2016 12:20:25","Establish goals and create EA framework for LSE-140 update","Deliverable: together with [~pingraham], identify the changes needed and develop initial content in EA.",2 +"DM-5351","03/04/2016 12:27:24","Create change request for LSE-140","Deliverable: change request and document diffs for LSE-140",1 +"DM-5355","03/04/2016 16:45:18","meas_algorithms uses packages that are not listed in table file","{{meas_algorithms}} directly uses the following packages not expressed in the table file: * Minuit2 * daf_persistence * daf_base * pex_config * pex_exceptions * pex_policy ",0.5 +"DM-5356","03/05/2016 10:37:51","Test consistency of Shear Measurements with different Psfs","DM-1136 was done with a single Psf, partly to avoid some of the problems we found with PsfShapeletApprox. In this issue, I will look at consistency of the measurement for different Psfs.",8 +"DM-5359","03/05/2016 22:53:37","Update DMTN-002 to reflect last changes","Need to update documentation with latest changes on {{pipe_base}}, {{pipe_supertask}} and {{pipe_flow}}",1 +"DM-5364","03/06/2016 17:53:15","Image Select Panel: Support add or modify of plot","previously the image select panel would only modify a plot. Now give it the ability to add a plot.",8 +"DM-5370","03/07/2016 13:56:37","Create lsst_ci package as a continuous integration build target","Create an {{lsst_ci}} package to be built for the continuous integration testing. Plan: 1. Create empty package that has dependencies on {{obs_cfht}}, {{obs_decam}}, {{obs_subaru}}, {{testdata_cfht}}, {{testdata_decam}}, {{testdata_subaru}}. (/) 2. Ensure above builds. (/) 3. Add {{obs_lsstSim}} and ensure that it builds. (/) The following were moved to DM-5381: [ [~tjenness] : How can I get strikethrough to work in the following list?] 3. Add dependencies on {{validation_data_cfht}} and {{validation_data_decam}}, and {{validate_drp}}. 4. Run CFHT, DECam quick examples in {{validate_drp}}. 5. Test for successful running of the above examples. Fail and trigger Jenkins FAILURE message if these examples fail. 6. Check performance of CFHT, DECam runs against reference numbers. Fail if there is a significant regression. 7. Decide how to include {{ci_hsc}}, which currently can take at least 30 minutes to process the image data.--",1 +"DM-5372","03/07/2016 18:06:30","Fix obs_* packages and ci tests broken by DM-4683","The butler changes in DM-4683, in particular the removal of {{.mapper}} from the interface exposed by a {{Butler}} object, broken {{obs_cfht}}, {{obs_decam}}, and {{ci_hsc}}. This issue will fix those changes, and search for additional broken things. This work is proceeding in conjunction with DM-5370 to test that the CI system, e.g. {{lsst_ci}}, is sensitive to these breakages and fixes.",1 +"DM-5384","03/08/2016 12:28:38","Port SdssShape changes from HSC meas_algorithms to LSST meas_base","In porting {{meas_algorithm}} changes from HSC to LSST, modifications to the {{SdssShape}} algorithm were discovered. 
These changes should be transferred to LSST.",3 +"DM-5385","03/08/2016 13:55:47","calib_psfReserved is only defined when candidate reservation is activated","The schema should in general not be a function of whether particular features are enabled or disabled, so that users can look for columns with confidence. However, {{MeasurePsfTask}} only creates the {{calib_psfReserved}} column when {{reserveFraction > 0}}. This causes warnings when attempting to propagate flags from calibration catalogs to deep catalogs.",1 +"DM-5390","03/08/2016 21:22:32","JavaScript loading/caching plan","We need to ensure that the latest version of the application (JavaScript) is loaded. Conditions: 1. Once loaded, it should be cached by the browser. 2. The name of the script has to be static, so it can be referenced by API users. 3. It also has to load dependencies (GWT scripts) after the main script is loaded. To do this, we created a tiny firefly_loader.js script whose role is to load the main script and then its dependencies. firefly_loader.js is configured to never cache so that the latest main script is always picked up. The main script is appended with a unique hash on every build. This ensures that the browser will pick up the new script the very first time, and then cache it for future use. ",2 +"DM-5392","03/09/2016 09:53:03","Please stop leaving repoCfg.yaml files around","After a recent change to {{daf_persistence}} and possibly other packages I'm finding that many packages leave {{repoCfg.yaml}} files lying around after they run unit tests. I'm not sure what is best to do about these files. If they are temporary, as I am guessing, then I think we need some way to clean them up when the tests that generated them have run. If they are intended to be permanent (which would be surprising for auto-generated files) then they should probably be committed? 
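If they are indeed temporary, one stopgap is for the test harness to remove them itself (a sketch, assuming the file is written into a known test output directory; the names are illustrative):
{code}
import os
import unittest

class RepoTestCase(unittest.TestCase):
    outputDir = '.'  # wherever this test writes its repository

    def tearDown(self):
        # Remove the auto-generated repository config if the butler
        # left one behind in this test's output directory.
        cfg = os.path.join(self.outputDir, 'repoCfg.yaml')
        if os.path.exists(cfg):
            os.remove(cfg)
{code}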
I hope we can do better than adding them to .gitignore.",1 +"DM-5394","03/09/2016 11:41:15","Investigate boost compiler warnings and update boost to v1.60","As reported in comments in DM-1304 clang now triggers many warnings with Boost v1.59: {code} /Users/rowen/UW/LSST/lsstsw/stack/DarwinX86/boost/1.59.lsst5/include/boost/archive/detail/check.hpp:148:5: warning: unused typedef 'STATIC_WARNING_LINE148' [-Wunused-local-typedef] BOOST_STATIC_WARNING(typex::value); ^ /Users/rowen/UW/LSST/lsstsw/stack/DarwinX86/boost/1.59.lsst5/include/boost/serialization/static_warning.hpp:100:33: note: expanded from macro 'BOOST_STATIC_WARNING' #define BOOST_STATIC_WARNING(B) BOOST_SERIALIZATION_BSW(B, __LINE__) ^ /Users/rowen/UW/LSST/lsstsw/stack/DarwinX86/boost/1.59.lsst5/include/boost/serialization/static_warning.hpp:99:7: note: expanded from macro 'BOOST_SERIALIZATION_BSW' > BOOST_JOIN(STATIC_WARNING_LINE, L) BOOST_STATIC_ASSERT_UNUSED_ATTRIBUTE; ^ /Users/rowen/UW/LSST/lsstsw/stack/DarwinX86/boost/1.59.lsst5/include/boost/config/suffix.hpp:544:28: note: expanded from macro 'BOOST_JOIN' #define BOOST_JOIN( X, Y ) BOOST_DO_JOIN( X, Y ) ^ /Users/rowen/UW/LSST/lsstsw/stack/DarwinX86/boost/1.59.lsst5/include/boost/config/suffix.hpp:545:31: note: expanded from macro 'BOOST_DO_JOIN' #define BOOST_DO_JOIN( X, Y ) BOOST_DO_JOIN2(X,Y) ^ /Users/rowen/UW/LSST/lsstsw/stack/DarwinX86/boost/1.59.lsst5/include/boost/config/suffix.hpp:546:32: note: expanded from macro 'BOOST_DO_JOIN2' #define BOOST_DO_JOIN2( X, Y ) X##Y ^ :25:1: note: expanded from here STATIC_WARNING_LINE148 ^ {code} v1.60 is the current version so we should see if these warnings have been fixed in that version.",2 +"DM-5402","03/09/2016 14:25:39","Make cluster deployment scripts more generic and enable ccqserv100...124","These scripts will be improved (i.e. more genericity) and integrated inside Qserv code. Qserv will be deployed on ccqserv100 to ccqserv125",3 +"DM-5406","03/09/2016 18:13:18","Require fields listed in icSourceFieldsToCopy to be present","{{CalibrateTask}} presently treats config field {{icSourceFieldsToCopy}} as a list of fields to copy *if present*. This was required because one of the standard fields to copy was usually missing. However, [~price] fixed that problem in DM-5385. Now we can raise an exception if any field listed is missing (though I propose to continue ignoring {{icSourceFieldsToCopy}} if isSourceCatalog is not provided).",1 +"DM-5410","03/10/2016 10:18:51","DecamIngestTask is mis-calling openRegistry","`DecamIngestTask` is mis-calling `lsst.pipe.tasks.RegistryTask`. Line 59: {code} with self.register.openRegistry(args.butler, create=args.create, dryrun=args.dryrun) as registry: {code} {{openRegistry}} is expecting a directory name, not a butler object for the first argument Thanks to [~wmwood-vasey] for diagnosing this.",1 +"DM-5416","03/10/2016 15:27:11","Ci Deploy and Distribution Improvements part IV","This is a bucket epic for ongoing improvements to the CI system",8 +"DM-5419","03/10/2016 15:45:31","ci_hsc fails test requiring >95% of PSF stars to be stars on the coadd","Since the first week of March 2016, ci_hsc fails its test that requires that >95% of the PSF stars be identified as stars in the coadd. I suspect this is related to the DM-4692 merge. 
Here is a sample job that fails: https://ci.lsst.codes/job/stack-os-matrix/9084/label=centos-6/console The relevant snippet of the failure is: {code} [2016-03-10T17:12:06.667778Z] : Validating dataset measureCoaddSources_config for {'filter': 'HSC-R', 'tract': 0, 'patch': '5,4'} [2016-03-10T17:12:06.697383Z] CameraMapper: Loading registry registry from /home/build0/lsstsw/build/ci_hsc/DATA/registry.sqlite3 [2016-03-10T17:12:06.697615Z] CameraMapper: Loading calibRegistry registry from /home/build0/lsstsw/build/ci_hsc/DATA/CALIB/calibRegistry.sqlite3 [2016-03-10T17:12:07.716310Z] CameraMapper: Loading registry registry from /home/build0/lsstsw/build/ci_hsc/DATA/registry.sqlite3 [2016-03-10T17:12:07.716443Z] CameraMapper: Loading calibRegistry registry from /home/build0/lsstsw/build/ci_hsc/DATA/CALIB/calibRegistry.sqlite3 [2016-03-10T17:12:08.663566Z] : measureCoaddSources_config exists: PASS [2016-03-10T17:12:08.721051Z] : measureCoaddSources_config readable (): PASS [2016-03-10T17:12:08.721077Z] : Validating dataset measureCoaddSources_metadata for {'filter': 'HSC-R', 'tract': 0, 'patch': '5,4'} [2016-03-10T17:12:08.721249Z] : measureCoaddSources_metadata exists: PASS [2016-03-10T17:12:08.721663Z] : measureCoaddSources_metadata readable (): PASS [2016-03-10T17:12:08.721715Z] : Validating dataset deepCoadd_meas_schema for {'filter': 'HSC-R', 'tract': 0, 'patch': '5,4'} [2016-03-10T17:12:08.721878Z] : deepCoadd_meas_schema exists: PASS [2016-03-10T17:12:08.726703Z] : deepCoadd_meas_schema readable (): PASS [2016-03-10T17:12:08.726834Z] : Validating source output for {'filter': 'HSC-R', 'tract': 0, 'patch': '5,4'} [2016-03-10T17:12:10.203469Z] : Number of sources (7595 > 100): PASS [2016-03-10T17:12:10.204166Z] : calib_psfCandidate field exists in deepCoadd_meas catalog: PASS [2016-03-10T17:12:10.204772Z] : calib_psfUsed field exists in deepCoadd_meas catalog: PASS [2016-03-10T17:12:10.205468Z] : Aperture correction fields for base_PsfFlux are present.: PASS [2016-03-10T17:12:10.206159Z] : Aperture correction fields for base_GaussianFlux are present.: PASS [2016-03-10T17:12:10.207193Z] FATAL: 95% of sources used to build the PSF are classified as stars on the coadd (0 > 0): FAIL [2016-03-10T17:12:10.207455Z] scons: *** [.scons/measure-HSC-R] AssertionError : Failed test: 95% of sources used to build the PSF are classified as stars on the coadd (0 > 0) [2016-03-10T17:12:10.207481Z] Traceback (most recent call last): [2016-03-10T17:12:10.207525Z] File ""/home/build0/lsstsw/stack/Linux64/scons/2.3.5/lib/scons/SCons/Action.py"", line 1063, in execute [2016-03-10T17:12:10.207556Z] result = self.execfunction(target=target, source=rsources, env=env) [2016-03-10T17:12:10.207593Z] File ""/home/build0/lsstsw/build/ci_hsc/python/lsst/ci/hsc/validate.py"", line 133, in scons [2016-03-10T17:12:10.207611Z] return self.run(*args, **kwargs) [2016-03-10T17:12:10.207646Z] File ""/home/build0/lsstsw/build/ci_hsc/python/lsst/ci/hsc/validate.py"", line 122, in run [2016-03-10T17:12:10.207663Z] self.validateSources(dataId) [2016-03-10T17:12:10.207732Z] File ""/home/build0/lsstsw/build/ci_hsc/python/lsst/ci/hsc/validate.py"", line 191, in validateSources [2016-03-10T17:12:10.207749Z] 0.95*psfStars.sum() [2016-03-10T17:12:10.207786Z] File ""/home/build0/lsstsw/build/ci_hsc/python/lsst/ci/hsc/validate.py"", line 52, in assertGreater [2016-03-10T17:12:10.207816Z] self.assertTrue(description + "" (%d > %d)"" % (num1, num2), num1 > num2) [2016-03-10T17:12:10.207853Z] File 
""/home/build0/lsstsw/build/ci_hsc/python/lsst/ci/hsc/validate.py"", line 43, in assertTrue [2016-03-10T17:12:10.207877Z] raise AssertionError(""Failed test: %s"" % description) [2016-03-10T17:12:10.207919Z] AssertionError: Failed test: 95% of sources used to build the PSF are classified as stars on the coadd (0 > 0) [2016-03-10T17:12:10.209935Z] scons: building terminated because of errors. {code} This is the test that fails https://github.com/lsst/ci_hsc/blob/74303a818eb5049a2015b5e885df2781053748c9/python/lsst/ci/hsc/validate.py#L169 {code} class MeasureValidation(Validation): _datasets = [""measureCoaddSources_config"", ""measureCoaddSources_metadata"", ""deepCoadd_meas_schema""] _sourceDataset = ""deepCoadd_meas"" _matchDataset = ""deepCoadd_srcMatch"" def validateSources(self, dataId): catalog = Validation.validateSources(self, dataId) self.assertTrue(""calib_psfCandidate field exists in deepCoadd_meas catalog"", ""calib_psfCandidate"" in catalog.schema) self.assertTrue(""calib_psfUsed field exists in deepCoadd_meas catalog"", ""calib_psfUsed"" in catalog.schema) self.checkApertureCorrections(catalog) # Check that at least 95% of the stars we used to model the PSF end up classified as stars # on the coadd. We certainly need much more purity than that to build good PSF models, but # this should verify that flag propagation, aperture correction, and extendendess are all # running and configured reasonably (but it may not be sensitive enough to detect subtle # bugs). psfStars = catalog.get(""calib_psfUsed"") extStars = catalog.get(""base_ClassificationExtendedness_value"") < 0.5 self.assertGreater( ""95% of sources used to build the PSF are classified as stars on the coadd"", numpy.logical_and(extStars, psfStars).sum(), 0.95*psfStars.sum() ) {code} Note that the assertion failure messages is a bit confusing. It should say ""Fewer than 95% of the sources used to build the PSF are classified as stars on the coadd.""",1 +"DM-5421","03/10/2016 17:57:52","Add --show history option to cmdLineTask","{{pex_config}} is able to report where a config parameter is set. Please add a command line option {{--show history=config.parameter.name}} to the cmdLineTask parser. The implementation will probably want to use something like: {code:python} import lsst.pex.config.history as pch pch.Color.colorize(False) print pch.format(config.calibrate.astrometry.solver, ""matchingRadius"") {code} ",2 +"DM-5424","03/11/2016 10:11:50","Switch PropagateVisitFlags to use src instead of icSrc","On DM-5084 [~jbosch] switched PropagateVisitFlags to match against icSrc instead of src because we weren't yet matching `icSrc` to `src` in ProcessCcdTask. That's now been done on DM-4692, so we can revert this. After doing so, please verify with ci_hsc that this is working, as that's where the only test of this feature lives.",2 +"DM-5427","03/11/2016 13:11:00","SingleFrameVariancePlugin can give numpy warnings","SingleFrameVariancePlugin can produce the following numpy warning, with no hint as to where the problem is coming from: {code} /Users/rowen/UW/LSST/lsstsw/miniconda/lib/python2.7/site-packages/numpy/core/_methods.py:59: RuntimeWarning: Mean of empty slice. 
warnings.warn(""Mean of empty slice."", RuntimeWarning) {code} I tracked it down by adding the following code to the calling code: {code} import warnings with warnings.catch_warnings(): warnings.filterwarnings('error') {code} It would be nice if the measurement plugin handled this situation more gracefully, such as turning the warning into an exception or testing for it and handling it. One way to reproduce this problem is to run {{tests/testProcessCcd.py}} in {{pipe_tasks}}. However, it is commonly seen when running {{processCcd}} on other data, as well.",2 +"DM-5428","03/11/2016 13:16:32","ObjectSizeStarSelector can produce numpy warnings","`ObjectSizeStarSelector` can produce the following numpy warning: {code} RuntimeWarning: invalid value encountered in less {code} This occurs at the following point in the code: {code} for i in range(nCluster): # Only compute func if some points are available; otherwise, default to NaN. pointsInCluster = (clusterId == i) if numpy.any(pointsInCluster): centers[i] = func(yvec[pointsInCluster]) {code} where `func` has been assigned to `numpy.mean`. When I have seen this occur I have found that `dist` is an array of `nan` I suggest that the star selector handle this situation more gracefully, e.g. by reporting an appropriate exception or handling the data in an appropriate way. If logging a message would be helpful, then please do that (and if RFC-154 is adopted, a log will be available). One way to reproduce this is to run `tests/testProcessCcd.py` in `pipe_tasks`. However, I often see it when running `processCcd.py` on other data, as well.",2 +"DM-5431","03/11/2016 14:51:34","Changes to galaxy_shear_experiments Python code","This ticket describes changes which were made to the test runner and analysis scripts during the Dec 2015 - Feb 2016 period. Most of these changes were made as a part of moving to a large computing cluster, where both the units of work and the output file organization had to be changed to make parallelization possible. The large number of tests run during this period and the need to more efficiently analyze and compare also introduced some changed to the analysis and plot modules. Since these changes do not pertain to any single test (though many were done during Dm-1136), I have put them on a separate ticket.",5 +"DM-5435","03/13/2016 14:27:25","Provide a shared stack on lsst-dev & other relevant systems","Following the discussion in RFC-156, ensure that a documented, fast, easy to initialize shared stack is available for developers to use on shared systems, certainly to include {{lsst-dev}}.",3 +"DM-5447","03/14/2016 16:15:19","Write technical note describing galaxy shear fitting experiments","Through S15 (DM-1108) and W16 (DM-3561), [~pgee] has conducted a large-scale investigation into galaxy shear fitting. Please summarize the motivation, methodology and results of this study as a [technical note|http://sqr-000.lsst.io/en/master/].",8 +"DM-5448","03/14/2016 16:22:50","Familiarization with ngmix codebase","Download the ngmix codebase from https://github.com/esheldon/ngmix. Install it and its dependencies in the same environment as the LSST stack. 
Experiment with using it and understanding how it works",3 +"DM-5449","03/14/2016 16:34:37","Convert GWT code to pure JavaScript (F16)","The remaining work for converting GWT code to pure JavaScript",100 +"DM-5463","03/15/2016 15:39:43","Don't restore the mask in CharacterizeImageTask.characterize","CharacterizeImageTask.characterize presently restores the mask from a deep copy for each iteration of the loop to compute PSF. This is unnecessary because repair and detection both clear the relevant mask planes before setting new values.",1 +"DM-5472","03/16/2016 05:47:26","Update meas_mosaic for compatibility with new single frame processing","Following [recent changes to single frame processing|https://community.lsst.org/t/backward-incompatible-changes-to-processccdtask-and-subtasks/581], {{icSrc}} no longer includes celestial coordinates and {{icMatch}} is no longer being written. {{meas_mosaic}} requires this information. Provide a work-around.",3 +"DM-5473","03/16/2016 07:30:59","Jenkins/ci_hsc failure: 'base_PixelFlags_flag_clipped' already present in schema","Since 15 March, the {{ci_hsc}} build in Jenkins has been failing as follows: {code} [2016-03-16T14:23:13.548928Z] Traceback (most recent call last): [2016-03-16T14:23:13.548956Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_tasks/2016_01.0-23-gcf99090/bin/measureCoaddSources.py"", line 3, in [2016-03-16T14:23:13.548969Z] MeasureMergedCoaddSourcesTask.parseAndRun() [2016-03-16T14:23:13.548999Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869/python/lsst/pipe/base/cmdLineTask.py"", line 450, in parseAndRun [2016-03-16T14:23:13.549011Z] resultList = taskRunner.run(parsedCmd) [2016-03-16T14:23:13.549040Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869/python/lsst/pipe/base/cmdLineTask.py"", line 192, in run [2016-03-16T14:23:13.549048Z] if self.precall(parsedCmd): [2016-03-16T14:23:13.549076Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869/python/lsst/pipe/base/cmdLineTask.py"", line 279, in precall [2016-03-16T14:23:13.549087Z] task = self.makeTask(parsedCmd=parsedCmd) [2016-03-16T14:23:13.549115Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869/python/lsst/pipe/base/cmdLineTask.py"", line 369, in makeTask [2016-03-16T14:23:13.549132Z] return self.TaskClass(config=self.config, log=self.log, butler=butler) [2016-03-16T14:23:13.549160Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_tasks/2016_01.0-23-gcf99090/python/lsst/pipe/tasks/multiBand.py"", line 1008, in __init__ [2016-03-16T14:23:13.549179Z] self.makeSubtask(""measurement"", schema=self.schema, algMetadata=self.algMetadata) [2016-03-16T14:23:13.549206Z] File ""/home/build0/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869/python/lsst/pipe/base/task.py"", line 226, in makeSubtask [2016-03-16T14:23:13.549846Z] subtask = configurableField.apply(name=name, parentTask=self, **keyArgs) [2016-03-16T14:23:13.549901Z] File ""/home/build0/lsstsw/stack/Linux64/pex_config/2016_01.0+1/python/lsst/pex/config/configurableField.py"", line 77, in apply [2016-03-16T14:23:13.549915Z] return self.target(*args, config=self.value, **kw) [2016-03-16T14:23:13.549943Z] File ""/home/build0/lsstsw/stack/Linux64/meas_base/2016_01.0-12-gf26bc28+1/python/lsst/meas/base/sfm.py"", line 248, in __init__ [2016-03-16T14:23:13.549954Z] self.initializePlugins(schema=self.schema) [2016-03-16T14:23:13.549985Z] File 
""/home/build0/lsstsw/stack/Linux64/meas_base/2016_01.0-12-gf26bc28+1/python/lsst/meas/base/baseMeasurement.py"", line 298, in initializePlugins [2016-03-16T14:23:13.550004Z] self.plugins[name] = PluginClass(config, name, metadata=self.algMetadata, **kwds) [2016-03-16T14:23:13.550032Z] File ""/home/build0/lsstsw/stack/Linux64/meas_base/2016_01.0-12-gf26bc28+1/python/lsst/meas/base/wrappers.py"", line 15, in __init__ [2016-03-16T14:23:13.550616Z] self.cpp = self.factory(config, name, schema, metadata) [2016-03-16T14:23:13.550647Z] File ""/home/build0/lsstsw/stack/Linux64/meas_base/2016_01.0-12-gf26bc28+1/python/lsst/meas/base/wrappers.py"", line 223, in factory [2016-03-16T14:23:13.550660Z] return AlgClass(config.makeControl(), name, schema) [2016-03-16T14:23:13.550688Z] File ""/home/build0/lsstsw/stack/Linux64/meas_base/2016_01.0-12-gf26bc28+1/python/lsst/meas/base/baseLib.py"", line 3401, in __init__ [2016-03-16T14:23:13.552891Z] this = _baseLib.new_PixelFlagsAlgorithm(*args) [2016-03-16T14:23:13.552924Z] lsst.pex.exceptions.wrappers.InvalidParameterError: [2016-03-16T14:23:13.552967Z] File ""src/table/Schema.cc"", line 563, in lsst::afw::table::Key lsst::afw::table::detail::SchemaImpl::addField(const lsst::afw::table::Field&, bool) [2016-03-16T14:23:13.552986Z] Field with name 'base_PixelFlags_flag_clipped' already present in schema. {0} [2016-03-16T14:23:13.553012Z] lsst::pex::exceptions::InvalidParameterError: 'Field with name 'base_PixelFlags_flag_clipped' already present in schema.' [2016-03-16T14:23:13.553014Z] [2016-03-16T14:23:13.613484Z] scons: *** [.scons/measure] Error 1 [2016-03-16T14:23:13.617577Z] scons: building terminated because of errors. {code} Please fix it.",1 +"DM-5474","03/16/2016 10:37:43","Bugs in obs_subaru found by PyFlakes","I ran pyflakes on the code in obs_subaru and found a few bugs (beyond a few trivial ones that I am fixing as part of DM-5462) {{ingest.py}} has undefined name {{day0}} {{ccdTesting.py}} has at least three undefined variables: {{x}}, {{y}} and {{vig}} in the following: {code} ngood += pupilImage[y[good], x[good]].sum() vig[i] = float(ngood) {code} {{crosstalkYagi.py}} has many undefined names, starting with {{makeList}}, {{estimateCoeffs}}",1 +"DM-5478","03/16/2016 12:35:13","Write script to derive and collate QA metrics from data repository of processed data","I wrote a python script using stack components to derive QA metrics and collate other QA-relevant information for a data repository of processed data. This is currently output to a CSV file that can be loaded into a SQL database.",20 +"DM-5479","03/16/2016 12:37:07","Wrote script to print the names of all visits that overlap a patch","In order to finish the IDL workflow module for makeCoaddTempExp I needed a program to say which visits overlap a given path. That's what this script does.",5 +"DM-5480","03/16/2016 12:40:18","Processing of COSMOS data - Part II","Continued work on processing and QA work on the COSMOS verification dataset. Running processCcDecam, making diagnostic plots, and nvestigating the results. 
Most recently I've reprocessed the COSMOS data through processCcdDecam using SDSS as the astrometric and photometric reference catalog and am redoing the QA work on those results.",20 +"DM-5482","03/16/2016 12:45:20","Write presentation on verification datasets for AAS","Prepared and gave a talk at the NSF booth at the Florida AAS meeting on the progress of the verification datasets effort.",5 +"DM-5483","03/16/2016 12:47:11","Work on script to test the astrometric matcher","We encountered astrometric matching problems for the Bulge verification dataset. Therefore, I wrote a script that tests the matcher by systematically shifting the coordinates of one set of the data to see if the matcher still works. It worked well up to ~80 arcsec.",5 +"DM-5484","03/16/2016 12:48:05","SdssMapper.paf has wrong python type for processCcd_config","[~npease] reports that {{SdssMapper.paf}} has the wrong Python data type for the dataset {{processCcd_config}}: it is {{lsst.obs.sdss.processCcdSdss.ProcessCcdSdssConfig}} instead of {{lsst.pipe.tasks.processCcd.ProcessCcdConfig}}",0.5 +"DM-5485","03/16/2016 12:48:48","Work on plan to test specific algorithmic components of the stack","After working on a script to test the astrometric matcher, I decided to put together a plan to run similar tests on our algorithmic code. The rough plan is here: https://confluence.lsstcorp.org/display/SQRE/Stack+Testing+Plan",2 +"DM-5486","03/16/2016 12:51:15","Work on putting together page of ""tips and tricks for using the stack""","Due to the incomplete state of the stack documentation and tutorials, I decided to write down various ""tips and tricks"" for using the stack as I learn them. https://confluence.lsstcorp.org/display/SQRE/Tips+and+Tricks+for+using+the+Stack",2 +"DM-5487","03/16/2016 14:34:54","Revise operations concept for Observation Processing System","Turn the L1 ConOps document into appropriate sections of LDM-230, specifying automated operations sequences, how human intervention can occur, and processes to handle changes and updates. (Story points are for KTL drafting and initial contributions)",2 +"DM-5488","03/16/2016 14:36:42","Field group updates","After some work we have realized that the following needs to be done to field groups: * Tabs group should have a field group smart wrapper component * field group needs to reinit on id change * remove mixin, use Higher-Order Components instead * support a function for a value, this function will return a value or a promise * hidden fields - init field group with key/value object * Sub-field groups? study only, unless it is easy to implement. * maintain an option to keep unmount field value available * determine if InitValue needs to be passed around * passing fieldState around too much * find reason for react warning every time popup is raised * look at promise code make sure it is working the way we think * if practical, remove all export default FieldGroupConnector.  It is the high order component that replaces the mixin. FieldGroupUtils.js: (~line 33): The field value would be a function in the file upload case. Therefore the upload does not activate until validation. In the upload case the function would return a promise. However, it could return a value or an object with a value and a valid status. Now the value key of a field can contain a promise or function or primitive. The function can return a primitive, a promise, or an object with primitive and status. In fftools.js lines 102-158 you can see my experimenting with taking out the connector. 
It works fine and does eliminate one of the warning messages. ",8 +"DM-5489","03/16/2016 14:37:23","improvement of the north/east arrow on image","make the compass sticky when scrolling the image",1 +"DM-5490","03/16/2016 14:38:54","Develop operations concept for Batch Processing System","Develop a ConOps document that can be included as appropriate sections of LDM-230 describing the batch processing environment, specifying automated operations sequences, how human intervention can occur, and processes to handle changes and updates. (Story points are for KTL drafting and initial contributions)",3 +"DM-5491","03/16/2016 14:39:35","Develop operations concept for Data Backbone","Develop a ConOps document that can be included as appropriate sections of LDM-230 describing the Data Backbone that contains, manages, and provides access to the Science Data Archive, specifying automated operations sequences, how human intervention can occur, and processes to handle changes and updates. (Story points are for KTL drafting and initial contributions)",3 +"DM-5492","03/16/2016 14:40:35","Develop operations concept for Data Access Processing System","Develop a ConOps document that can be included as appropriate sections of LDM-230 describing the Data Access Processing System that manages L3 computing in and interfaces to the Data Access Center, specifying automated operations sequences, how human intervention can occur, and processes to handle changes and updates. (Story points are for KTL drafting and initial contributions)",3 +"DM-5493","03/16/2016 14:47:26","Develop functional breakdown for Observation Processing System","Write sections that can be incorporated into LDM-148 describing the functional breakdown of the Observation Processing System, including, for each major element: * overall function * inputs, outputs, and control interfaces * components used * descriptions of functions to be performed (Story points are for KTL drafting and initial contributions)",3 +"DM-5494","03/16/2016 14:47:51","Develop functional breakdown for Batch Processing System","Write sections that can be incorporated into LDM-148 describing the functional breakdown of the Batch Processing System, including, for each major element: * overall function * inputs, outputs, and control interfaces * components used * descriptions of functions to be performed (Story points are for KTL drafting and initial contributions)",2 +"DM-5495","03/16/2016 14:48:13","Develop functional breakdown for Data Backbone","Write sections that can be incorporated into LDM-148 describing the functional breakdown of the Data Backbone, including, for each major element: * overall function * inputs, outputs, and control interfaces * components used * descriptions of functions to be performed (Story points are for KTL drafting and initial contributions)",2 +"DM-5496","03/16/2016 14:48:41","Develop functional breakdown for Data Access Center Processing System","Write sections that can be incorporated into LDM-148 describing the functional breakdown of the Data Access Center Processing System, including, for each major element: * overall function * inputs, outputs, and control interfaces * components used * descriptions of functions to be performed (Story points are for KTL drafting and initial contributions)",2 +"DM-5498","03/16/2016 15:24:44","Coordinate completion of operations concepts","Coordinate the creation of a new version of LDM-230 incorporating DPS-WG-generated operations concepts.",2 +"DM-5499","03/16/2016 15:25:35","Coordinate completion of 
functional breakdowns","Coordinate the creation of a new version of LDM-148 incorporating DPS-WG-generated functional breakdowns.",2 +"DM-5501","03/16/2016 15:42:57","Solve the metadata sanitization problem","Applications need access to visit-specific metadata: e.g. pointing, airmass, exposure length. This information is typically carried around in a FITS header, but there are no conventions on spelling or even necessarily units of these metadata key, value pairs. There needs to be an easy-to-use metadata sanitization process that allows data from many different systems to present a standardized interface to observation metadata to the algorithm code.",100 +"DM-5502","03/16/2016 15:47:20","Collect usage of header metadata","Collect a comprehensive set of exposure-oriented metadata used by science code. This should also include metadata that is not currently needed but that could be utilized in the future. In practice, I suspect this will involve looking for all calls to PropertySet.get since that is how FITS header metadata is currently passed around.",5 +"DM-5515","03/17/2016 00:48:50","prepare Slack RFC"," https://jira.lsstcorp.org/browse/RFC-140",1 +"DM-5530","03/17/2016 11:48:24","Documentation of Firefly functions and API (F16)","We are concentrating on the coding in X16. This epic will capture the effort to write the documentation for using Firefly functions and API. ",40 +"DM-5542","03/17/2016 14:33:15","AFW rgb.py has undefined variable that breaks a test in some situations","The {{rgb.py}} test is failing for me with current AFW master: {code} tests/rgb.py .E............ ====================================================================== ERROR: testMakeRGBResize (__main__.RgbTestCase) Test the function that does it all, including rescaling ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/rgb.py"", line 313, in testMakeRGBResize with Tempfile(fileName, remove=True): NameError: global name 'Tempfile' is not defined ---------------------------------------------------------------------- Ran 16 tests in 7.296s FAILED (errors=1) {code} {{Tempfile}} is definitely only used in line 313. It was introduced with commit c9864f49. I'm not entirely sure how this is not picked up by Jenkins, as the test will run if matplotlib and scipy are installed and Jenkins does have those. ",0.5 +"DM-5552","03/17/2016 15:18:14","Add renderer option to js table","TablePanel and BasicTable now accept optional renderers. For each column, you can set a custom renderer for the header, cell, or both. Also, created several commonly used renderers for images, links, and input fields.",2 +"DM-5553","03/17/2016 15:36:05","Z-scale stretch for image display","The z-scale stretch in the current system is different from the one in OPS.",8 +"DM-5560","03/18/2016 13:29:55","Participate in October 2015 OCS-subsystems teleconference","Prepare for, attend, and follow up on the OCS-subsystems teleconference on October 8, 2015.",2 +"DM-5563","03/18/2016 13:43:09","Participate in November 2015 OCS-subsystems teleconference (LSE-74)","Prepare for, attend, and follow up on the OCS-subsystems teleconference on November 11, 2015. This story covers work related to LSE-74; LSE-70 and LSE-209 work was also done under a separate epic.",1 +"DM-5564","03/18/2016 13:57:43","Participate in December 2015 OCS-subsystems teleconference (LSE-70, LSE-209)","Prepare for, attend, and follow up on the OCS-subsystems teleconference on December 9, 2015. 
This story covers work related to LSE-70 and LSE-209; LSE-74 work was also done under a separate epic.",2 +"DM-5565","03/18/2016 14:01:43","Participate in December 2015 OCS-subsystems teleconference (LSE-74)","Prepare for, attend, and follow up on the OCS-subsystems teleconference on December 9, 2015. This story covers work related to LSE-74; LSE-70 and LSE-209 work was also done under a separate epic.",1 +"DM-5566","03/18/2016 14:16:47","Review of LSE-70 and LSE-209 drafts, September 2015","Arrange, prepare for, and attend a joint call with the Camera team to review the end-of-summer-2015 drafts of LSE-70 and LSE-209 from the OCS group.",3 +"DM-5567","03/18/2016 14:25:26","CCB review of LCR-567 (LSE-70) and LCR-568 (LSE-209)","Review the LSE-70 and LSE-209 drafts submitted with change requests LCR-567 and LCR-568 in January 2016.",2 +"DM-5568","03/18/2016 14:41:59","CCB review of LCR-603 (LSE-74)","Review LCR-603, ""LSE-74 document revision""",2 +"DM-5580","03/21/2016 01:48:20","Docgen draft from EA content for LSE-140","Create a docgen from the LSE-140 content in Enterprise Architect.",2 +"DM-5582","03/21/2016 08:10:03","Support LCR-385","Support getting LCR-385 against LSE-78 through the CCB.",3 +"DM-5585","03/22/2016 08:52:17","SQuaRE Communication and Publication Platforms Document and Presentation - Clone","This is a clone of DM-5581 tracking [~frossie]'s SPs",5 +"DM-5586","03/22/2016 09:28:44","Fix obs_decam butler level","There is a bug in {{obs_decam/policy/DecamMapper.paf}}, causing some butler features for the ""visit"" level or above to work incorrectly. The {{hdu}} key is irrelevant for the visit level or above, but wasn't included in the policy file. Because of this bug, the {{DemoTask}} in {{ctrl_pool}} (ctrlPoolDemo.py) runs incorrectly with DECam data. It incorrectly treats dataRefs with different {{hdu}}s as if they are from different visits, hence reads each ccd image multiple times (61 times for one visit with 61 hdus). Instead, each ccd image should be read once. Besides fixing the policy file, I also added an optional test that only runs if {{testdata_decam}} is set up. The part with level=""visit"" in the test fails without the ticket changes in the policy. (p.s. The raw data file in {{testdata_decam}} is modified and has only 2 hdus.) ",3 +"DM-5590","03/22/2016 14:13:55","Fix afw build issues with recent clang","{{afw}} fails to build with recent versions of clang: {code} include/lsst/afw/image/MaskedImage.h:553:65: error: '_loc' is a protected member of 'lsst::afw::image::MaskedImage::MaskedImageLocatorBase, boost::mpl::range_c > > *> >,       boost::gil::memory_based_2d_locator, boost::mpl::range_c > > *> >,       boost::gil::memory_based_2d_locator, boost::mpl::range_c > > *> >, Reference>'                                      const_VarianceLocator(iter._loc.template get<2>()) {code} and issues with statistics.i so far; more errors may turn up as these are cleared. These problems are apparent with {{Apple LLVM version 7.3.0 (clang-703.0.29)}} (as shipped with the latest release of XCode, hence this now becoming an issue) and {{clang version 3.8.0 (branches/release_38 262722)}} (a recent release from LLVM; note that Apple uses its own versioning scheme). {{clang version 3.7.1 (tags/RELEASE_371/final)}} is not affected.",1 +"DM-5591","03/22/2016 16:45:54","Archive in a box v1 (F16)","Several times we were asked a question about Firefly: Great software. How could I use it for my data now? 
This epic captures the preparation work in Firefly for its final version of ""Archive in a box"". The work is also needed to improve the user experience in using Firefly. 8/1/2017 Future related work will be captured in stories of epic DM-10853. ",40 +"DM-5593","03/23/2016 12:55:36","fix issue where butler repository search returns list for single item","Backwards compatible behavior is that when butler returns a single item, it is NOT in a list. A recent change (when the Repository class was added) broke this behavior. Change it back so that if an operation in repository would return a list with a single item, it pulls it from the list. Note this is only related to the case where a repository's parentJoin field is set to 'outer'; since no one is using this yet (they should not be, anyway), the point is moot. ",1 +"DM-5594","03/23/2016 14:32:28","Fix qserv service timeout issue","After Qserv services have been running for over a couple of days, new queries fail and can also lead to a crash. Investigate and implement a solution.",5 +"DM-5595","03/23/2016 20:31:14","daf_persistence build failure on OSX","I see the following build failure in {{daf_persistence}} on OSX 10.11: {code} c++ -o python/lsst/daf/persistence/_persistenceLib.so -bundle -F/ -undefined suppress -flat_namespace -headerpad_max_install_names python/lsst/daf/persistence/persistenceLib_wrap.os -Llib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/mariadbclient/10.1.11-2-gd04d8b7/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/pex_policy/2016_01.0+4/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/pex_logging/2016_01.0+4/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/daf_base/2016_01.0+4/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/utils/2016_01.0+4/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/pex_exceptions/2016_01.0+3/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/base/2016_01.0+3/lib -L/private/tmp/ssd/swinbank/shared_stack/DarwinX86/boost/1.59.lsst5/lib -L/tmp/ssd/swinbank/shared_stack/DarwinX86/miniconda2/3.19.0.lsst4/lib/python2.7/config -ldaf_persistence -lboost_serialization -lmysqlclient_r -lpex_policy -lpex_logging -lboost_filesystem -lboost_system -ldaf_base -lutils -lpex_exceptions -lbase -lboost_regex -lpthread -ldl -lpython2.7 ld: file not found: libz.1.dylib for architecture x86_64 clang: error: linker command failed with exit code 1 (use -v to see invocation) scons: *** [python/lsst/daf/persistence/_persistenceLib.so] Error 1 scons: building terminated because of errors. {code} This happens with the current master ({{3484020}} at time of writing), but also with a recent weekly ({{3878625}}). ",1 +"DM-5607","03/24/2016 11:35:18","check & correct comparison operators in daf_persistence and daf_butlerUtils","per comments in DM-5593, an incorrect comparison operator was found that used {{is}} instead of {{==}} in a string comparison (e.g. {{var is 'left'}}, which is incorrect; it should be {{var == 'left'}}). This needs to be corrected in {{Repository}} (see DM-5593 for details), and the rest of daf_persistence and daf_butlerUtils should be checked for correct use of is vs. ==.",1 +"DM-5633","03/24/2016 16:22:00","Add data products and config in obs_decam for multi-band processing","Add necessary data products and default config in order to run forcedPhotCcd, coaddDriverTask, and multiBandDriverTask with DECam data. 
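Once the dataset types are in place, a quick butler-level sanity check could look like the sketch below (the repository path and data id values are assumptions for illustration): {code}
# check that the new dataset types resolve; path and ids are placeholders
from lsst.daf.persistence import Butler

butler = Butler('/path/to/decam/rerun')  # hypothetical output repository
# these should no longer fail on unknown dataset types once the policy is updated
print(butler.datasetExists('deepCoadd', filter='g', tract=0, patch='1,1'))
print(butler.datasetExists('deepCoadd_forced_src', filter='g', tract=0, patch='1,1'))
{code} 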
",3 +"DM-5635","03/25/2016 07:14:37","not flagged NaN in sdssCentroid","In some rare cases base_SdssCentroid_x(y)Sigma can be NaN while base_SdssCentroid_flag is False",1 +"DM-5641","03/26/2016 20:31:53","finish up afw.table to astropy.table view support","At an LSST/AstroPy summit hack session, we've put together a functional system for viewing afw.table objects as astropy.table objects on branch u/jbosch/astropy-tables of afw and https://github.com/astropy/astropy/pull/4740. Before merging, we should add support for ""object"" columns for subclasses to hold e.g. Footprints in SourceCatalog, and add some documentation. We may also want to add a convenience method to return an astropy.table.Table directly.",1 +"DM-5643","03/26/2016 20:40:39","add method to convert Property[Set,List] to nested dict","In interfacing with AstroPy it'd be useful to easily convert PropertySet and PropertyList to nested dict and OrderedDict (respectively), converting elements with multiple values to lists in the process.",2 +"DM-5659","03/30/2016 21:58:47","multiple dialog are not working well together","When several dialogs are up together. The most recently click one should be one top. When table are in the dialogs such a fits header view. The scroll bars will go over other dialogs. This needs some though and work. Another thing- when a message dialog is show because of a dialog error. It should center on the dialog. Update- I don't think I will do the error centering now. I am going to leave that and see if it is a real problem.",3 +"DM-5660","03/31/2016 00:20:57","Add motivated model fits to validate_drp photometric and astrometric scatter/repeatability analysis and plots","Implement well-motivated theoretical fits to the astrometric and photometric performance measurements based on derivations from LSST Overview paper. http://arxiv.org/pdf/0805.2366v4.pdf Photometric errors described by Eq. 5 sigma_rand^2 = (0.039 - gamma) * x + gamma * x^2 [mag^2] where x = 10^(0.4*(m-m_5)) Eq. 4 sigma_1^2 = sigma_sys^2 + sigma_rand^2 Astrometric Errors error = C * theta / SNR Based on helpful comments from [~zivezic] {quote} I think eq. 5 from the overview paper (with gamma = 0.039 and m5 = 24.35; the former I assumed and the latter I got from the value of your analytic fit that gives err=0.2 mag) would be a much better fit than the adopted function for mag < 21 (and it is derived from first principles). Actually, if you fit for the systematic term (eq. 4) and gamma and m5, it would be a nice check whether there is any “weird” behavior in analyzed data (and you get the limiting depth, m5, even if you don’t go all the way to the faint end). Similarly, for the astrometric random errors, we’d expect error = C * theta / SNR, where theta is the seeing (or a fit parameter), SNR is the photometric SNR (i.e. 1/err in mag), and C ~ 1 (empirically, and 0.6 for the idealized maximum likelihood solution and gaussian seeing). {quote}",5 +"DM-5663","03/31/2016 10:49:33","Config override fixes needed due to new star selector","As of DM-5532 a few config files need updating to not refer to star selector config fields as registries (not ones run by our normal CI, which is how I missed this).",2 +"DM-5664","03/31/2016 11:40:20","Delete or document and test config/psfex.py","The file {{config/psfex.py}} has no documentation and is bit rotting. If you feel it should be kept then please document it and add a simple unit test that loads it and runs data using it. 
If it is not needed, then please get rid of it.",1 +"DM-5675","03/31/2016 13:33:30","Cannot enable shapeHSM because RegistryField fails validation","When running ci_hsc after setting up the meas_extensions_shapeHSM, meas_extensions_photometryKron and dependencies using setup -v -r . in the respective cloned folders, I get {code} Cannot enable shapeHSM (RegistryField 'calibrate.detectAndMeasure.measurement.plugins' failed validation: Unknown key 'ext_shapeHSM_HsmMoments' in Registry/ConfigChoiceField For more information read the Field definition at: File ""/home/vish/code/lsst/lsstsw/stack/Linux64/pex_config/2016_01.0+3/python/lsst/pex/config/registry.py"", line 179, in __init__ ConfigChoiceField.__init__(self, doc, types, default, optional, multi) And the Config definition at: File ""/home/vish/code/lsst/lsstsw/stack/Linux64/meas_base/2016_01.0-13-g779ee14/python/lsst/meas/base/sfm.py"", line 109, in class SingleFrameMeasurementConfig(BaseMeasurementConfig): ): disabling HSM shape measurements {code} Find out why this is happening and find a fix. ",0.5 +"DM-5681","03/31/2016 14:08:24","Provide single-visit processing capability as required by HSC","In DM-3368, we provided a means of running multiple processCcd tasks across an exposure, but without performing global calibration etc as provided by HSC's ProcessExposureTask. Please augment this with whatever additional capability is required to enable HSC data release processing.",2 +"DM-5686","03/31/2016 17:41:18","Accommodate pixel padding when unpersisting reference catalog matches","The reference object loader in {{meas_algorithms}}'s *loadReferenceObjects.py* grows the bbox by the config parameter pixelMargin: doc = ""Padding to add to 4 all edges of the bounding box (pixels)"". This is set to 50 by default but is not reflected by the radius parameter set in the metadata, so some matches may reside outside the circle searched within this radius. This increase needs to be reflected in the radius set in the metadata fed into {{joinMatchListWithCatalog()}}. ",2 +"DM-5689","03/31/2016 21:31:00","Table needs to fire another action when data completely loaded","When the data for a table is completely loaded, fire another action such as TABLE_NEW_LOADED_DONE. This way the xyplots and the image overlays know to go fetch the data. 4/22/2016 from the pull request: added new action TABLE_NEW_LOADED to table; fired when table is completely loaded. added table error handling. fix active table not updating after an active tab is removed.",2 +"DM-5694","04/01/2016 12:11:29","Run StarFast simulated images through diffim","Determine the metadata and dependencies needed to fully process two images simulated with StarFast through diffim. ",2 +"DM-5695","04/01/2016 12:23:32","Implement simple 1D DCR correction on simulated data","Nate Lust wrote a simple DCR correction recipe that runs in 1D in an ipython notebook. This ticket is to re-write the notebook in python modules that can be run on StarFast simulated images prior to image differencing. For this ticket, the simulated images will be 2D, but DCR will be purely along the x or y pixel grid, allowing columns or rows of pixels to be treated separately in 1D.",8 +"DM-5702","04/01/2016 15:40:55","Create a new model in AST/GWCS to represent a complex distortion","Using lessons learned from DM-5701, create a more complex distortion model that cannot be represented using the basic models in GWCS or AST. 
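One candidate, sketched below on the GWCS side with {{astropy.modeling}} (the model name, parameters, and defaults are illustrative assumptions), is a radial displacement that oscillates rapidly with distance from a center, mimicking tree rings: {code}
import numpy as np
from astropy.modeling import custom_model


@custom_model
def tree_ring_distortion(x, y, amplitude=0.1, period=50.0, x_0=2048.0, y_0=2048.0):
    '''Radial pixel displacement oscillating with distance from (x_0, y_0).'''
    r = np.hypot(x - x_0, y - y_0)
    return amplitude * np.sin(2.0 * np.pi * r / period)


model = tree_ring_distortion()  # instantiate with the default parameters
print(model(100.0, 200.0))      # displacement at a single pixel position
{code} 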
Such a rapidly varying sinusoidal, tree-ring-like function is not well represented by the standard polynomial basis functions. This will test our ability to extend each framework with new models that have not yet been decided on. Once completed, we could plug this back into the composite model in DM-5701.",8 +"DM-5703","04/01/2016 15:54:41","Evaluate performance of AST/GWCS over a range of numbers of pixels","Once we have a composite distortion model from DM-5701, evaluate the performance of AST and GWCS over a range of numbers of pixels, likely from ~100 through full-CCD (4k^2). As part of this process, we will try to determine whether there is a way to efficiently warp images/postage stamps using python-only models in GWCS and whether bottlenecks could be worked around via optimizations in cython.",8 +"DM-5720","04/05/2016 11:47:24","JIRA fixes","This tracks SPs spent on JIRA requests. ",2 +"DM-5723","04/05/2016 18:34:56","make sure table can be resized properly","Test table to make sure it can be resized under a variety of layouts.",2 +"DM-5726","04/05/2016 22:19:00","attend the weekly meeting with UIUC camera team (May 2016)","Tatiana will attend the weekly meeting. Xiuqin and Gregory also attend when needed. ",2 +"DM-5729","04/06/2016 10:24:38","Config.loadFromStream suppresses NameError","Within a config override file being executed via {{Config.load}} or {{Config.loadFromStream}}, using a variable that hasn't been defined results in a {{NameError}} exception, but this is silently suppressed and the user has no idea the following overrides have not been executed.",0.5 +"DM-5734","04/06/2016 17:55:09","Fix the issues in the server side and the client side introduced by FitsHeaderViewer's work","* The testing data ""table_data.tbl"" in the testing tree was accidentally moved. It should be added back so that IpactTableTest.java can run. * The request in JsontableUtil was mistakenly moved out from the tableModel by the line * {code} * if (request != null && request.getMeta().keySet().size()>1) { tableModel.put(""request"", toJsonTableRequest(request)); } {code}. The meta can be null while the request is not; in that case the request should still be put into the TableModel. ",1 +"DM-5748","04/07/2016 17:12:10","Upgrade mpi4py to latest upstream","[mpi4py|https://bitbucket.org/mpi4py/] version 2.0 was released in October 2015 with a number of changes. We should upgrade. When upgrading, we should check whether it contains a proper fix for DM-5409 and, if not, file a bug report upstream. This issue should not be addressed until we have proper test coverage on code which uses mpi4py (DM-3845).",1 +"DM-5756","04/11/2016 10:04:49","Update Scons to v2.5.0","Scons 2.5.0 came out over the weekend. There were many fixes to the dependency determination code. The next version of Scons is intended to be 3.0, which will be the first version to support Python 3. Since we fully intend to switch to Python 3 in the summer, it is prudent for us to ensure that 2.5.0 works fine before switching to 3.0.0, so that we do not get confused as to why there is breakage in jumping straight to 3.0.0.",2 +"DM-5757","04/11/2016 10:34:47","FitsHeader's resize and sorting","DM-4494 has been merged to dev. 
However, there are still two issues remaining: * Resizing the popup with tabs does not work * Sorting depends on the BasicTable's sorting",1 +"DM-5760","04/11/2016 11:42:23","XYPlot needs to be expandable","Make XYPlot expandable",2 +"DM-5763","04/11/2016 12:27:25","XYPlot: decimation options","User needs to be able to control number of bins and bin size.",3 +"DM-5765","04/11/2016 12:47:49","Remove unneeded imports in SConstruct","There's an outstanding pull request from an external contributor (Miguel de Val-Borro) [here|https://github.com/lsst/sconsUtils/pull/9] that makes some minor improvements to sconsUtils by cleaning up the imports. Somebody should review and (if appropriate) merge it. (Or, at least, reply to our community!)",1 +"DM-5767","04/11/2016 14:45:35","Create custom basic coaddition code","Create a script to do the following: * Takes a list of DECam exposure numbers * for each CCD, loads the corresponding calexps * creates a naive pixel-by-pixel coadd of the underlying images * Possibly either ANDs or ORs the masks (though perhaps not necessary) * Either sums the exposure time info from the headers, or averages them, depending on whether the images were normalised to exposure times or not * write the corresponding images out as coadded fits",1 +"DM-5769","04/11/2016 15:00:27","Write spot visualisation snippets","Write some snippets to aid in the processing and visualisation of the CBP data/analysis. Essentially, write some helper functions that you can throw sections of images at to help look at the shape of the CBP spots, as ds9 isn't ideal for this. Some nice features would be: A function that takes a list of images or arrays, and plots them side-by-side, which provides some intelligent options for the stretches, and optionally stretches each image as is best for it, or ties them all to be the same. This would be as 2D colour plots. A function that takes part of an image and displays it as a colour-graded surface. A function that takes part of an image and displays it as a 3D bar-chart (as in ROOT, but without using ROOT because there is already enough evil in the world)",2 +"DM-5770","04/11/2016 15:05:01","Investigate image processing for feature enhancement","Whilst looking at an individual spot from the CBP on DECam I noticed a weird feature, and upon further investigation, several more, though these were very hard to see. This ticket is to investigate what image processing techniques will make these hard-to-see features pop out so that they can be examined more closely.",2 +"DM-5771","04/11/2016 15:12:09","Update config files","DM-46921 and DM-5348 changed ProcessCcd to the point where past config files are no longer valid, as stuff has moved a lot (see https://community.lsst.org/t/backward-incompatible-changes-to-processccdtask-and-subtasks/581). This ticket is to go through past configs and create a new config file to reproduce the reductions done, or at least make something sensible come out of the end of processCcd.",2 +"DM-5773","04/11/2016 17:38:58","Firefly API plan and decision","We need a plan for all Firefly API development in the new React/Redux based JS framework, including the JS API and Python API. 
- Backward compatibility - Syntax format for JS API - Syntax format for Python API - Schedule - convert the existing API first - list of new ones to be added, when ",2 +"DM-5775","04/12/2016 10:59:09","Change the TabPanel.jsx and TabPanel.css's properties so that its children can be resizable","When an outside container is resizable (using css properties: resize: 'both', overflow: 'auto'...), in order for the child inside the container to be resizable, the child has to specify its height and width properties using percentage format (height: 90%, width:100%). When the TabPanel is used, the table is put on the TabPanel. The table needs to access the size information of the outside container, i.e., the grandparent's width and height. The TabPanel has to pass the height and width to its child component. Without specifying the height and width in the TabPanel, by default auto is used. When the width (height) is auto, it defers to the child's width (height). However, the child relies on the parent to provide such information. When this circular relation occurs, the default size of the child is used. That is why the table component always ends up 75px high when it is put in the TabPanel. To be able to resize with the outside (root) container, all the ancestors of the component have to specify the width and height explicitly. ",0.5 +"DM-5782","04/14/2016 07:36:52","Include obs_cfht, obs_decam in lsst-dev shared stack","The shared stack on {{lsst-dev}} provided in DM-5435 does not contain the {{obs_cfht}} or {{obs_decam}} camera packages. Please add them.",1 +"DM-5784","04/14/2016 10:02:50","Port region serializer and data structures from GWT","The region serializer in: firefly/src/firefly/java/edu/caltech/ipac/util * RegionFactory.java Region container data structures files in : firefly/src/firefly/java/edu/caltech/ipac/util/dd * ContainsOptions.java * Global.java * RegionFileElement.java * RegParseException.java * Region.java * RegionAnnulus.java * RegionBox.java * RegionBoxAnnulus.java * RegionCsys.java * RegionDimension.java * RegionEllipse.java * RegionEllipseAnnulus.java * RegionFont.java * RegionLines.java * RegionOptions.java * RegionPoint.java * RegionText.java * RegionValue.java Note - do not port CoordException, there are other ways to do this.",8 +"DM-5791","04/15/2016 17:17:44","Why is doSelectUnresolved an argument?","The {{run}} method in the {{PhotoCalTask}} has an argument that selects whether to use the extendedness parameter to select objects for photometric calibration. This is a good idea, but it should be configurable, I think. ",1 +"DM-5793","04/16/2016 10:08:34","FITS Visualizer porting: Convert Mask support","* convert the mask support from GWT * Make a temporary dialog to control it * Add python/JS API support",8 +"DM-5794","04/16/2016 10:19:00","Image Visualizer: Support image and drawing layer subgrouping","This will give users finer control of turning on/off the catalog overlays on one image, a group of images, or all the displayed images. 
",8 +"DM-5797","04/18/2016 14:50:56","Using 'CONSTANT' for background subtraction fails","Running processCcd (on a DECam file) with the following in the config file: {code} config.charImage.repair.cosmicray.background.algorithm='AKIMA_SPLINE' config.charImage.background.algorithm='CONSTANT' config.charImage.detectAndMeasure.detection.tempLocalBackground.algorithm='CONSTANT' config.charImage.detectAndMeasure.detection.background.algorithm='CONSTANT' config.calibrate.detectAndMeasure.detection.tempLocalBackground.algorithm='CONSTANT' config.calibrate.detectAndMeasure.detection.background.algorithm='CONSTANT' {code} fails, and throws the following: {code} Traceback (most recent call last): File ""/home/mfisherlevine/lsst/pipe_tasks/bin/processCcd.py"", line 25, in ProcessCcdTask.parseAndRun() File ""/ssd/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869+8/python/lsst/pipe/base/cmdLineTask.py"", line 450, in parseAndRun resultList = taskRunner.run(parsedCmd) File ""/ssd/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869+8/python/lsst/pipe/base/cmdLineTask.py"", line 199, in run resultList = mapFunc(self, targetList) File ""/ssd/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869+8/python/lsst/pipe/base/cmdLineTask.py"", line 324, in __call__ result = task.run(dataRef, **kwargs) File ""/ssd/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869+8/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/mfisherlevine/lsst/pipe_tasks/python/lsst/pipe/tasks/processCcd.py"", line 170, in run doUnpersist = False, File ""/ssd/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869+8/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/mfisherlevine/lsst/pipe_tasks/python/lsst/pipe/tasks/characterizeImage.py"", line 298, in run background = background, File ""/ssd/lsstsw/stack/Linux64/pipe_base/2016_01.0-6-g7751869+8/python/lsst/pipe/base/timer.py"", line 118, in wrapper res = func(self, *args, **keyArgs) File ""/home/mfisherlevine/lsst/pipe_tasks/python/lsst/pipe/tasks/characterizeImage.py"", line 356, in characterize image -= estBg.getImageF() File ""/home/mfisherlevine/lsst/afw/python/lsst/afw/math/mathLib.py"", line 5788, in getImageF return _mathLib.Background_getImageF(self, *args) lsst.pex.exceptions.wrappers.InvalidParameterError: File ""src/math/Interpolate.cc"", line 61, in std::pair, std::vector > lsst::afw::math::{anonymous}::recenter(const std::vector&, const std::vector&) You must provide at least 1 point {0} File ""src/math/BackgroundMI.cc"", line 196, in void lsst::afw::math::BackgroundMI::_setGridColumns(lsst::afw::math::Interpolate::Style, lsst::afw::math::UndersampleStyle, int, const std::vector&) const setting _gridcolumns {1} lsst::pex::exceptions::InvalidParameterError: 'You must provide at least 1 point {0}; setting _gridcolumns {1} {code}",2 +"DM-5799","04/18/2016 16:03:06","Asinh stretch algorithm corerction","in DM-2634, the Asinh stretch algorithm was implemented, but the behavior was not quite right. We need to figure out the issue and make it right. One possibility is that the understanding the relationship of zero point and black point, maximum point and white point. 
",8 +"DM-5803","04/19/2016 12:35:20","fetchUrl is not handling post requests correctly.","Parameters are not sent to the server when requests are posted via fetchUrl.",2 +"DM-5810","04/19/2016 18:23:32","Update imageDifferenceTask to cast template ids and use ObjectSizeStarSelector","A couple recent changes to the stack break imageDifferenceTask. Requires updates to only a few lines. While I'm updating it to reflect the star selector API, I'm also changing the default star selector from SecondMoment to ObjectSizeStarSelector (which I learned today is what the stack has been using by default for a while). ",1 +"DM-5816","04/19/2016 19:13:00","Investigate behavior of Firefly and stretches for images with negative pixel values","Based on a discussion in the Tea Time HipChat room today, this is a ""note to ourselves"" to take a look at the behavior of Firefly when visualizing images with negative flux values. This is important for difference imaging and is therefore highly relevant to both LSST and ZTF (if ZTF difference images become visible through Firefly at some point). The behavior of the asinh stretch, in particular, should be looked at.",2 +"DM-5819","04/20/2016 07:57:09","Incorporate Price suggestions to make `validate_drp` faster","Increase the loading and processing speed of {{validate_drp}} following suggestions by [~price] 1. Don't read in footprints Pass {{flags=lsst.afw.table.SOURCE_IO_NO_FOOTPRINTS}} to {{butler.get}} 2. Work on speed of calculation of RMS and other expensive quantities. Current suggestions: a. {{calcRmsDistances}} b. {{multiMatch}} c. {{matchVisitComputeDistance}} d. Consider boolean indexing in {{afw}}'s {{multiMatch.py}} {code} objById = {record.get(self.objectKey): record for record in self.reference} to: objById = dict(zip(self.reference[self.objectKey], self.reference)) {code} Note that while this ticket will involve work to reduce the memory footprint of the processing, it will not cover work to re-architect things to enable efficient processing beyond the memory on one node.",2 +"DM-5821","04/20/2016 18:32:03","Intermittent fault building ci_hsc through Jenkins","Occasionally (see e.g. [here|https://ci.lsst.codes/job/stack-os-matrix/label=centos-7/10437//console] and [here|https://ci.lsst.codes/job/stack-os-matrix/label=centos-6/9594//console]) the {{ci_hsc}} job in Jenkins fails, reporting: {code} RuntimeError: dictionary changed size during iteration {code} The fault seems to be intermittent. Please fix it.",3 +"DM-5822","04/21/2016 08:46:25","Afw fails unit test for convolve depending on compiler optimisation level","On OSX 10.11.4 with Apple LLVM version 7.3.0 (clang-703.0.29) afw fails {{test/convolve.py}} with the following error when either {{-O0}} or {{-O1}} is enabled but works fine for {{-O2}} and {{-O3}}. {code:bash} tests/convolve.py .....FF/Users/pschella/Development/lsst/code/afw/python/lsst/afw/image/testUtils.py:283: RuntimeWarning: invalid value encountered in isnan nan0 = np.isnan(filledArr0) /Users/pschella/Development/lsst/lsstsw/miniconda/lib/python2.7/site-packages/numpy/lib/ufunclike.py:113: RuntimeWarning: invalid value encountered in isinf nx.logical_and(nx.isinf(x), ~nx.signbit(x), y) /Users/pschella/Development/lsst/lsstsw/miniconda/lib/python2.7/site-packages/numpy/lib/ufunclike.py:176: RuntimeWarning: invalid value encountered in isinf nx.logical_and(nx.isinf(x), nx.signbit(x), y) F.F... 
====================================================================== FAIL: testSpatiallyVaryingAnalyticConvolve (__main__.ConvolveTestCase) Test in-place convolution with a spatially varying AnalyticKernel ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/convolve.py"", line 437, in testSpatiallyVaryingAnalyticConvolve rtol = rtol) File ""tests/convolve.py"", line 290, in runStdTest self.runBasicConvolveEdgeTest(kernel, kernelDescr) File ""tests/convolve.py"", line 317, in runBasicConvolveEdgeTest doVariance = True, rtol=0, atol=0, msg=msg) File ""/Users/pschella/Development/lsst/code/afw/python/lsst/afw/image/testUtils.py"", line 201, in assertMaskedImagesNearlyEqual testCase.fail(""%s: %s"" % (msg, ""; "".join(errStrList))) AssertionError: basicConvolve(MaskedImage, kernel=Spatially Varying Gaussian Analytic Kernel using brute force) wrote to edge pixels: image planes differ: maxDiff=1.09176e+38 at position (73, 18); value=-1.09176e+38 vs. 2825.0; NaNs differ ====================================================================== FAIL: testSpatiallyVaryingDeltaFunctionLinearCombination (__main__.ConvolveTestCase) Test convolution with a spatially varying LinearCombinationKernel of delta function basis kernels. ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/convolve.py"", line 556, in testSpatiallyVaryingDeltaFunctionLinearCombination rtol = rtol) File ""tests/convolve.py"", line 290, in runStdTest self.runBasicConvolveEdgeTest(kernel, kernelDescr) File ""tests/convolve.py"", line 317, in runBasicConvolveEdgeTest doVariance = True, rtol=0, atol=0, msg=msg) File ""/Users/pschella/Development/lsst/code/afw/python/lsst/afw/image/testUtils.py"", line 201, in assertMaskedImagesNearlyEqual testCase.fail(""%s: %s"" % (msg, ""; "".join(errStrList))) AssertionError: basicConvolve(MaskedImage, kernel=Spatially varying LinearCombinationKernel of delta function kernels using brute force) wrote to edge pixels: image planes differ: maxDiff=9.06659e+36 at position (75, 29); value=9.06659e+36 vs. 2865.0 ====================================================================== FAIL: testSpatiallyVaryingGaussianLinerCombination (__main__.ConvolveTestCase) Test convolution with a spatially varying LinearCombinationKernel of two Gaussian basis kernels. ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/convolve.py"", line 523, in testSpatiallyVaryingGaussianLinerCombination rtol = rtol) File ""tests/convolve.py"", line 290, in runStdTest self.runBasicConvolveEdgeTest(kernel, kernelDescr) File ""tests/convolve.py"", line 317, in runBasicConvolveEdgeTest doVariance = True, rtol=0, atol=0, msg=msg) File ""/Users/pschella/Development/lsst/code/afw/python/lsst/afw/image/testUtils.py"", line 201, in assertMaskedImagesNearlyEqual testCase.fail(""%s: %s"" % (msg, ""; "".join(errStrList))) AssertionError: basicConvolve(MaskedImage, kernel=Spatially Varying Gaussian Analytic Kernel with 3 basis kernels convolved using brute force) wrote to edge pixels: image planes differ: maxDiff=1.22472e+38 at position (74, 3); value=-1.22472e+38 vs. 
2878.0; NaNs differ ====================================================================== FAIL: testTicket873 (__main__.ConvolveTestCase) Demonstrate ticket 873: convolution of a MaskedImage with a spatially varying ---------------------------------------------------------------------- Traceback (most recent call last): File ""tests/convolve.py"", line 623, in testTicket873 rtol = rtol) File ""tests/convolve.py"", line 290, in runStdTest self.runBasicConvolveEdgeTest(kernel, kernelDescr) File ""tests/convolve.py"", line 317, in runBasicConvolveEdgeTest doVariance = True, rtol=0, atol=0, msg=msg) File ""/Users/pschella/Development/lsst/code/afw/python/lsst/afw/image/testUtils.py"", line 201, in assertMaskedImagesNearlyEqual testCase.fail(""%s: %s"" % (msg, ""; "".join(errStrList))) AssertionError: basicConvolve(MaskedImage, kernel=Spatially varying LinearCombinationKernel of basis kernels with low covariance, using brute force) wrote to edge pixels: image planes differ: maxDiff=3.19374e+38 at position (1, 46); value=3.19374e+38 vs. 2774.0 ---------------------------------------------------------------------- Ran 13 tests in 43.252s FAILED (failures=4) The following tests failed: /Users/pschella/Development/lsst/code/afw/tests/.tests/convolve.py.failed 1 tests failed scons: *** [checkTestStatus] Error 1 scons: building terminated because of errors. {code}",2 +"DM-5823","04/21/2016 09:28:12","ECL_B1950 coordinate was not defined correctly","CoordSys.js defined ECL_B1950 incorrectly. When I was testing WebGrid, the grid lines for Ecliptic B1950 were not right. Looking further, I found it was caused by a wrong equinox value in its definition.",0.5 +"DM-5829","04/22/2016 11:11:53","Create outline of Level 3 ConOps","Create an outline of the sections of the Level 3 ConOps document",2 +"DM-5830","04/22/2016 11:13:57","Level 3 requirements flowdown","Document the flowdown of Level 3-related requirements from SRD, LSR, OSS, and DMSR.",3 +"DM-5832","04/22/2016 11:25:22","LSE-140 post-CCB implementation","Following CCB approval of LSE-140, perform minor document work required for full implementation (application of standard cover page, change log, etc.).",2 +"DM-5835","04/22/2016 12:18:01","Prepare a draft of the SUIT deployment timeline","Prepare a draft schedule, with some detail for 2016-2017, for deployments of the SUIT into (test) production, including the datasets that will be served.",2 +"DM-5837","04/22/2016 13:14:07","Document pipe_drivers","Please provide a minimal level of documentation for {{pipe_drivers}}, to include: * A {{doc}} directory with the usual content so that docstrings get generated by Doxygen; * A package overview; * All docstrings should be appropriate for parsing by Doxygen (ie, should start with {{""""""!}} where necessary).",2 +"DM-5838","04/22/2016 17:04:44","3-color image label change","When generating a 3-color image, the label for the image display should be 'Project 3-color', i.e. WISE 3-color, 2MASS 3-color ... ",0 +"DM-5839","04/23/2016 13:35:17","horizon console interface broken","It appears that at some point in the last few months the horizon console interface has stopped working. I am still able to access the console log output via the API/CLI.",1 +"DM-5840","04/23/2016 13:53:05","instance limit low vs available cores","The LSST project is currently at 81/100 instances but there are over 200 cores unused. 
Is it possible to increase the instance limit or are we being encouraged to use larger instance flavors?",1 +"DM-5841","04/23/2016 14:19:56","unable to list nebula lsst project users","Currently, [with some difficulty] it is possible to discover the {{user_id}} that created an instance (might be possible for other resources as well) but it is not possible to map this back to a username / person. This can make it difficult to 'self-police' instances. The administrative API endpoints are not publicly accessible and I doubt any end user has the appropriate permission. ",1 +"DM-5847","04/25/2016 15:00:23","libxml build issue with mpich on OS X","On OS X with Xcode installed, {{mpich}} fails to build because it cannot locate the libxml include files: {code} CC topology-xml-libxml.lo topology-xml-libxml.c:17:10: fatal error: 'libxml/parser.h' file not found #include ^ 1 error generated. {code} with {{pkg-config}} 0.29.1 installed. The problem is that {{configure}} determines that {{libxml-2.0}} is available and is installed into {{/usr}} with a CFLAGS of {{-I/usr/include/libxml2}}. {{configure}} does not itself test whether those parameters are reasonable. With Xcode there are no files installed into {{/usr/include}} and {{clang}} knows to look in specific SDK locations. When {{mpich}} builds it assumes that {{libxml2}} can be found but fails to find it. Strangely, {{pkg-config}} v0.28 does not seem to be able to find {{libxml-2.0}} so there is no issue. One solution is to install the Command Line Tools but it might be more portable to attempt to disable {{libxml2}}. ",2 +"DM-5848","04/25/2016 16:34:17","Investigate Jupyter internals, interactive widgets","In preparation for linking Jupyter notebooks with Firefly and other SUIT components, read the Jupyter documentation. Learn how to build a sample widget or interactive dashboard in the Jupyter framework.",2 +"DM-5849","04/25/2016 16:38:09","Investigate Ginga and Glueviz visualization tools","Ginga and Glue (glueviz) are community visualization tools in Python. Become familiar with the capabilities of both, thinking from the point of view of using Firefly for the display but using Python for many other things.",2 +"DM-5854","04/25/2016 17:36:26","Java array index out of bounds error in VisServerCommand.java","The class FileFluxCmdJson in VisServerCommand.java is calling {code} String[] res = VisServerOps.getFileFlux(fahAry, pt); {code} However, when the mouse is outside the image, VisServerOps.getFileFlux(fahAry, pt) returns: {code} new String[]{PlotState.NO_CONTEXT} {code} It is fine for a single band. However, for 2 or 3 bands, the for loop below causes an index-out-of-bounds error because res is an array of length 1 while the expected res is an array of length equal to the number of bands. {code} JSONObject obj= new JSONObject(); obj.put(""JSON"", true); obj.put(""success"", true); int cnt=0; JSONObject data= new JSONObject(); for(Band b : state.getBands()) { data.put(b.toString(), res[cnt++]); } data.put(""success"", true); {code} Thus, res\[cnt++\] causes an array-index-out-of-bounds error. To fix this issue, the for loop is changed as below: {code} int cnt=0; JSONObject data= new JSONObject(); Band[] bands = state.getBands(); for (int i=0; i