2018-2019 TC Meeting Minutes


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Larry, Flavio, Arofan

Summary document: This is just a summary of the discussion that took place, not a fill-in of all the details of what each product is, etc. It is really a general road map.
1 - is it accurate
2 - is anything lacking
3 - sending to Exec

Action item for DDI Alliance:
Name of DDI4-Core needs to be determined in terms of the product line

SDTL support: 3 points  (see TC-212)
1 - do we support this as a DDI product
2 - positioning within the product suite
3 - outline for setting up the group within DDI (what's needed... id persons, linking into the GitHub DDI site, oversight role of TC, etc.; this may change with reorganization of the Scientific Board)

DDILIFE-3679 replace the ability to reference compliance statements that was accidentally removed
Jeremy will create a pull request for that in the next couple of days


ATTENDEES: Wendy, Jon, Dan S., Flavio, Larry, Jeremy

Updated on EDDI event
Reviewed outstanding issues and resolved or closed specific issues related to schemas
The remaining issues have to do with pulling together and getting out 3.3 for final review
Goal is to have everything together for a final approval for review release on Jan 9.
Meeting on Dec 19 dependent upon having any agenda items
No meetings December 26 or Jan 2


ATTENDEES: Wendy, Jon, Oliver, Larry

Reviewed all the remaining schema fixes and closed
All remaining changes are to documentation
Review of inline documentation will be made tomorrow

Documentation is being generated here: https://ddi-lifecycle-documentation.readthedocs.io/en/latest/index.html
editing it here until I move it to Bitbucket or Github

The documentation will be stripped back from the user guide, taking out element specifics, which will be available in inline DocFlex or content generated by Sphinx

Schema work looks done. Jon will prepare draft of announcement regarding Public Review and availability of Schemas
Goal is to have this out the week before EDDI
Public review to start as soon after EDDI as the high level documentation is in shape and continue through the end of January 2020 (minimum of 1 month but holidays mean extending that)

Issue filter for public - Issues addressed in preparation of DDI Lifecycle 3.3
Issues were reviewed during development of DDI Lifecycle 3.3. They provide information on the content, discussion, and disposition of the issue

Next week:
Draft of announcement to DDI-Users list regarding Public Review and availability of schema


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Larry, Oliver, Flavio

3667 some questions in red within the spreadsheet; a draft of a change sheet to accompany it if deemed useful
Please review and comment on any changes
3669 resolved
3637 reopened with some examples - renamed to clarify content and moved to FUTURE (3.4)
3638 in progress
3636 resolved
3660 still to enter
3635 resolved
3670 resolved
3651 reopened and resolved

Verification that no change was overwritten is down to two long lists of changes, plus the new SDI content, to both verify and note where diagrams should be changed.

Oliver updated on Controlled Vocabularies:
CESSDA now produces HTML with the language selection option as well as a show-all-languages option. This is internal at the moment but will be available.
This means all we need is a tool to grab the three outputs for a controlled vocabulary, run the CodeList structure, zip and deliver
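The "grab the outputs, zip, and deliver" step described above can be sketched as a small script. This is a hypothetical sketch: the file names and the assumption of three fixed renderings (HTML, SKOS, DDI CodeList) are illustrative, not an actual CESSDA API.

```python
import io
import zipfile

def bundle_vocabulary(outputs: dict[str, bytes]) -> bytes:
    """Zip a mapping of file name -> rendered content, entirely in memory."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in outputs.items():
            zf.writestr(name, content)
    return buf.getvalue()

# Placeholder content standing in for the three renderings of one vocabulary:
archive = bundle_vocabulary({
    "AnalysisUnit.html": b"<html>...</html>",
    "AnalysisUnit.skos.rdf": b"<rdf:RDF>...</rdf:RDF>",
    "AnalysisUnit.ddi.xml": b"<CodeList>...</CodeList>",
})
```

The real tool would fetch the renderings from wherever CESSDA publishes them before bundling; only the packaging step is shown here.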

GOAL: complete entry of all resolved items and corrections found in verification work for next week's meeting
NO MEETING: November 28 (US Thanksgiving) and December 5 (EDDI)


ATTENDEES: Wendy, Jon, Jeremy, Dan S., Oliver, Larry

Comment on summary document
    Let Wendy know if there are issues (for those who haven't read yet)
    Post and send to Jared when confirmed
TC-201 CV update
    updated others - we have an agreed solution with CV and ICPSR
    Oliver has talked to CESSDA about language bar on HTML
DDILIFE-3664 in review
DDILIFE-3660 information classification statement (if in review)
    not entered yet
DDILIFE-3663 Best Practice Doc update (if in review)
    on hold
DDILIFE-3665 package content agreement
    Jon has put the ddilifecycle documentation on GitHub - 3.2 and 3.3 together
DDILIFE-3662 readme.txt specify changes
    changes reviewed
    in progress assigned to Jeremy
DDILIFE-3667 change log - specify content
    in progress - assigned to Wendy
    Options listed
DDILIFE-3659 developers documentation
    added notes - post 3.3 release
DDILIFE-3639 create branch for 3.2
    added notes
    post 3.3 release

added new label 3.3_publication

 2019-10-21 to 2019-10-25

ATTENDEES: Wendy Thomas, Jon Johnson, Larry Hoyle, Jeremy Iverson, Dan Smith, Johan Fihn Marberg, Oliver Hopt, Dan Gillman (as available), Barry Radler (Monday-Tuesday), Flavio Rizzolo (remote, as available), Jay Greenfield (remote, as available)

See meeting page

Summary of Monday/Tuesday discussion

Production Framework information

TC Meeting Summary

ATTENDEES: Wendy, Jon, Dan S.
Small meeting today so we just checked for any outstanding questions regarding the October meeting. I linked in the following page with the start of the meeting agenda. This may move due to technical issues but it's here for now.
Next week will be 3.3 entry for review and approval


ATTENDEES: Wendy, Dan S., Oliver

Small group. No major agenda items. Discussed organization and prep for COGS work; Oliver and Dan S. will look at it to see if any change in the organization of activities is needed.

Updated on meeting facilities.

Next week: 3.3 pull content review


ATTENDEES: Wendy, Oliver, Johan, Jeremy, Barry, Dan S., Flavio

The content and organization of the meeting were discussed, in particular focusing on what needed to be done prior to the meeting in terms of individual preparation; the content of the first day's discussion; and what the outputs of that discussion needed to be in order to inform DDI members and to specify activities for the rest of the week.

TO DO:  wlt do asap

 find and summarize documents on design approaches over time (to inform overall production approach discussion)

 post Roadmap original and updates for individual review

 prepare a schedule for the week using the information below

Individual review of COGS using

Strategic direction:
What does it mean for the direction of DDI Lifecycle and DDI 4 work?
Take into account any issues raised in TC-189 and TC-203 particularly in terms of relationships between versions
How does DDI Codebook fit in? Does it?
What needs to be completed to move along the road map
Mapping between versions
Mapping to other standards (DDI Lifecycle) - approach rather than specific mappings; priority mappings to sanity-check the approach
Mapping to other standards (DDI4/Core) - look at output from Dagstuhl sprint
Best Practices document - what else should be included, how can they be made more 'sticky'
A "technical meets marketing" genius session over a couple of days would identify and describe the coolest things to focus on and how to do that at a higher level
Develop some general consensus for how things need to cohere or align
Guidelines for the work to take place in terms of mapping and integration over the week - specific work to take place
Strategically where are we going with the production process - What is the starting point for design? UML? Who is doing the design? What are the tools? Who can be involved?
[pull together the history of approaches to DDI development - original documents, original DDI4 purpose, development rules for different versions- share prior to meeting - wlt]

2-Basic run through of COGS to answer questions and frame issues
Clarification of what is being done during meeting
Go through the outputs CSV to XSD to validate approach
Are we generating content the way we want to
--Break this out
--Do the deep dive into the transform process
--Verify that everything gets transferred correctly
--Have the current schemas been dissected to capture everything we want (all the content in a more OO style of presentation)?
--This is the custom tool that needs the most input
We have an existing schema to compare to in 3.3 and 2.5
DDI4 is a different issue and Oliver has been working on it
Review rules for trimming as currently prepared
Oliver will provide a half-hour review of his XMI to COGS and round-tripping

Production pipeline
Binding relationships -
COGS is similar to Lion except that the interface is on your computer and uses Git to do the change management and iterate proposed changes for testing

Across DDI versions
Between DDI and FHIR plus others as appropriate
How to map - how will we go about this in terms of existing standards and going forward
How to express mapping to support understanding and implementation
What becomes the point of integration/mapping such as a specific version of DDI
What can be merged in terms of content and what it would look like
Where are there conflicts?
Structural - implications for future structure of Lifecycle
Conceptual - options for dealing with
Can we define areas where merger is difficult and general sources of conflict

Clearly define issues and approaches
M) Structured output on content we have
I) what doesn't map
What DDI provides that no other standard has
Where metadata gets passed on a broader level - what content are you passing/sharing for different purposes
Example: what DataVerse uses from DDI to support their system
Interoperability - what needs to be 'sharable' and what

Use the document DDI 3.3 Production Validation Document as basis for work
Expand during first day meeting to determine scope (in terms of versions) and specific work and assignments
Test case for loading GitHub or bitbucket production lines
For each version
Can we input 3.3 and 2.5 and get out the full content?
Make sure we are lossless before we start trimming
Means of comparing input to output
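A minimal sketch of the "means of comparing input to output" above: canonicalize both the schema fed into the pipeline and the schema it regenerates, then compare. A real lossless check would also need to tolerate reordering and annotation differences; this only normalizes away whitespace, attribute order, and namespace-prefix noise via XML C14N.

```python
import xml.etree.ElementTree as ET

def is_lossless(original_xml: str, regenerated_xml: str) -> bool:
    """True if the two documents are identical after C14N canonicalization."""
    return (ET.canonicalize(xml_data=original_xml)
            == ET.canonicalize(xml_data=regenerated_xml))

# Attribute order and whitespace differences are normalized away:
a = '<xs:element name="Citation" xmlns:xs="http://www.w3.org/2001/XMLSchema"/>'
b = '<xs:element  xmlns:xs="http://www.w3.org/2001/XMLSchema" name="Citation" />'
```

In practice this would be run over each schema file in the 3.3 (and later 2.5) round trip before any trimming decisions are made.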
5b-Controlled Vocabulary stuff
--From SKOS to other representations
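One small piece of "from SKOS to other representations" can be sketched as pulling the notation and English prefLabel out of a SKOS RDF/XML file to feed a DDI-style code list. This handles only the simple RDF/XML shape shown; a real converter would use an RDF library rather than plain XML parsing.

```python
import xml.etree.ElementTree as ET

SKOS = "http://www.w3.org/2004/02/skos/core#"
XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def skos_to_codes(rdf_xml: str, lang: str = "en") -> list[tuple[str, str]]:
    """Return (notation, prefLabel) pairs for each skos:Concept."""
    root = ET.fromstring(rdf_xml)
    rows = []
    for concept in root.iter(f"{{{SKOS}}}Concept"):
        notation = concept.findtext(f"{{{SKOS}}}notation", default="")
        label = ""
        for pl in concept.findall(f"{{{SKOS}}}prefLabel"):
            if pl.get(XML_LANG) == lang:
                label = pl.text or ""
        rows.append((notation, label))
    return rows

sample = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                     xmlns:skos="http://www.w3.org/2004/02/skos/core#">
  <skos:Concept rdf:about="#ind">
    <skos:notation>1</skos:notation>
    <skos:prefLabel xml:lang="en">Individual</skos:prefLabel>
  </skos:Concept>
</rdf:RDF>"""
```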
5c-DDI Resolution
--Agency ID compared to an ARK or DOI handle
--Using another resolution system
--DNS resolution
--Alternative resolution (Dan)
--OAI-PMH endpoints and using them as DDI services (embedded DDI-specific versions in OAI-PMH embedding). Need a paper or article about how to do this correctly
--Also applies to API packets and what they look like (we have the fragment instance but no documentation on what to put inside of it - best practice, minimum viable content, etc.)
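The OAI-PMH idea above amounts to requesting DDI as the metadata format of a record: a GetRecord request is just a URL with three query parameters. The endpoint and the "oai_ddi" metadata prefix here are illustrative values, not a specific repository's configuration.

```python
from urllib.parse import urlencode

def getrecord_url(endpoint: str, identifier: str, prefix: str = "oai_ddi") -> str:
    """Build an OAI-PMH GetRecord request URL for one record."""
    params = {"verb": "GetRecord", "identifier": identifier, "metadataPrefix": prefix}
    return f"{endpoint}?{urlencode(params)}"

url = getrecord_url("https://example.org/oai", "oai:example.org:study123")
```

A harvester would follow this with ListRecords for bulk pulls; the "paper or article" mentioned above would pin down which DDI versions to advertise as metadata prefixes.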


ATTENDEES: Wendy, Dan S., Jon, Flavio, Larry, Jeremy

3.3 entry is in progress
No Dan, Jeremy, or Jon next week

Getting 2.5 into COGS will not be a trivial exercise
--3.2 required a lot of exception mapping
--References needed to be clearly defined
--We don't have unique references
--2.5 doesn't differentiate between item types and structure, so when creating documentation or JSON you may need to treat all as structure

One thing we could key off of is the DDILifecycle URN attribute, so we could make sure they are used correctly
--note that everything deriving from BaseType with GLOBALS has a DDILifecycle URN
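Keying off the URN attribute could look like the check below. DDI 3.x URNs follow the pattern urn:ddi:&lt;agency&gt;:&lt;id&gt;:&lt;version&gt; with a dot-separated agency name; the regex is an approximation of the published rules for illustration, not a full validator.

```python
import re

DDI_URN = re.compile(
    r"^urn:ddi:"
    r"(?P<agency>[A-Za-z0-9\-]+(\.[A-Za-z0-9\-]+)*):"  # e.g. us.icpsr
    r"(?P<id>[A-Za-z0-9\-_\.]+):"                      # item identifier
    r"(?P<version>[0-9]+(\.[0-9]+)*)$"                 # e.g. 1.0.0
)

def parse_ddi_urn(urn):
    """Return the agency/id/version parts of a DDI URN, or None if malformed."""
    m = DDI_URN.match(urn)
    return {k: m.group(k) for k in ("agency", "id", "version")} if m else None

parsed = parse_ddi_urn("urn:ddi:us.icpsr:Variable-12345:1.0.0")
```

Run over every element deriving from BaseType with GLOBALS, this would catch malformed or missing URNs before cross-version mapping relies on them.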

May be helpful in cross version mapping or simplification of the Codebook Schema
Increasingly persuaded by the argument that we may need to move things into 2.5 to assist in migration

3.3 COGS

Learning COGS
The biggest issue is that it's mostly CSV files, which we need to get used to
It would be helpful to see examples of CSV files

What other pieces of information need to be in the spreadsheet to capture what is needed at the class level

How do you handle documentation with line endings? See the article in the modeler guide on the use of markdown files
Oliver's transformation of DDI4 xmi to COGS

In the current version of the 3.3 COGS output, choice and substitution are handled differently because of limitations of non-XML bindings
Use a base type and then representation
Review deriving and extending from something - there isn't an explanation of inheritance; it assumes you have some knowledge of object-oriented design work and modeling

Helpful to have some "How we use COGS" material in preparation
Who is this for?
--should people be chucking stuff in?
--differentiation between content and modeling
--what are the odds of getting lots of contributors?

**Half hour session walking through COGS for clarifications
Things on spreadsheet
Go through the outputs CSV to XSD to validate approach
Are we generating content the way we want to
--Break this out
--Do the deep dive into the transform process
--Verify that everything gets transferred correctly
--Have the current schemas been dissected to capture everything we want (all the content in a more OO style of presentation)?
--This is the custom tool that needs the most input
We have an existing schema to compare to in 3.3 and 2.5
DDI4 is a different issue and Oliver has been working on it

**Oliver will provide a half-hour review of his XMI to COGS and round-tripping
**Get information out early to get some understanding of COGS - there are some quick starts on the documentation page and then the modelers guide
**Set up a webinar for walk-through
**Does it cover what we want to do with lifecycle?

Controlled Vocabulary stuff
--From SKOS to other representations

DDI Resolution
--Agency ID compared to an ARK or DOI handle
--Using another resolution system
--DNS resolution
--Alternative resolution (Dan)
--OAI-PMH endpoints and using them as DDI services (embedded DDI-specific versions in OAI-PMH embedding). Need a paper or article about how to do this correctly
--Also applies to API packets and what they look like (we have the fragment instance but no documentation on what to put inside of it - best practice, minimum viable content, etc.)

Roadmap to 3.4 - develop some general consensus for how things need to cohere or align
Best practices guide

TO DO (Wendy):
Send out message regarding COGS review - have Dan S and Jon review before then
Pull together work plan spreadsheets and post


ATTENDEES: Wendy, Oliver, Flavio, Guillaume

Guillaume's last meeting with TC
INSEE has moved its production application to 3.3

For TC meeting - production work:
Test case for loading GitHub or bitbucket production lines
For each version
Discuss with Dan
Can we input 3.3 and 2.5 and get out the full content?
Make sure we are lossless before we start trimming
Review rules for trimming as currently prepared
Everyone needs to get up to speed on COGS
Means of comparing input to output

Wendy will do a first draft of an A to Z set of activities for moving each version to COGS


ATTENDEES: Larry, Dan S., Jon

DDI 3.3 progress

Dan has made a start and completed / nearly completed the classifications work; bits and bobs outstanding. The hurdle is DDI-Lifecycle 3640 (Sampling, etc.); see next item

Jon has made a start on documentation, it will be more user focused, better context and how things fit together

DDI LIFECYCLE 3640 - PowerPoint is a bit confusing to translate into schema changes.

Discussed three options:

a. Not include (discounted as already promised and in Public Release and seen as useful)

b. Only add public review changes and minor updates for missed content (safe option)

c. Update to include rationalisation of process model as per PowerPoint. (concern that major surgery compared to the Public Review would cause problems further down the line)

Jon will attempt to take the PowerPoint and identify changes (not just implied changes) to inform the choice between options b and c.

TC Meeting (Minneapolis)

Following on from discussion at the previous meeting, items:

a. Discussion of Roadmap to 3.4 serialisations and content update

b. Mapping between DDI versions - approach as a beginning of taking existing mappings and making them more usable

c. Mapping to other standards (DDI-Lifecycle) - approach rather than specific mappings - might be useful to consider some priority mappings to sanity check the approach (e.g. FHIR, schema.org)

d. Mapping to other standards (DDI4/Core) - this would hopefully be an output from the Dagstuhl sprint a few weeks earlier (e.g. GSIM / DCAT / GSBPM are currently on the list)

e. Discussion of Best Practices document- are there more things we should add, can we do something to make them more 'sticky', intersects with discussion on Roadmap above.


ATTENDEES: Wendy, Larry, Dan S., Flavio, Jeremy

See TC-210 for summary of agenda items for cross version commonality and transition

We have a spreadsheet from 2.5 to 3.1 and change logs for each DDI-L version; Larry created a generic DDI 2-to-4 mapping. There is a very generic mapping between 3.2 and the DDI4 prototype content. There is a generic model in Colectica that allows input/export between versions.

What gets missed in going from 2 to 4 are relationships expressed through nesting that need to be made explicit.
Example: in 2 you have a contact with an affiliation and email, but in 3 or 4 you have to create an agent as well (two low-level things apply to the same composite type)

When you go from 3 to 4 there may be more correspondences

We could do this with COGS model
Input of XML schema with a spreadsheet of typing information for all references
For 3.3 we are still doing XML editing

Roadmap was approved (decided in Kansas, discussed in Dagstuhl, and approved by the Exec Board)

DDI 4 doesn't really follow the roadmap in terms of the Core - there seem to be different opinions on the role of Core and a continuing DDI4
The Core is partly political in that DDI4 is taking too long, so it's about getting something out

The idea of the roadmap was to take the goodness of DDI4 and feed it into 3.3 to create a 3.4
How much can we bring into 3.4
Some of it is already in 3.3 (Statistical Classification, measures, agents, etc.)

What can be merged, what can't, and why - can we define these areas?

Where are things stable
DDI 4 could be more RnD work?
Publication in 3.4 etc.

Besides doing comparison of Lifecycle and DDI4
We need some means of recording mappings between standards and/or versions of DDI
- We have a lot of different products and lots of different mappings
- We could use "DDI Integration" model to center the mapping focusing on the portions that are in DDI Integration
- Don't need full mapping if it is seen as a bridge

The MRT workplan was focused on integrating with other standards

Agenda item: How would we go about doing that
How to go about doing and expressing mappings and integrations

Speaking of integration and mapping NIH was looking for integration with FHIR

Defining how we're going to make these mappings is very important

Is there a discussion of "Views" in the MRT?
Latest discussion of views is that they will be documentation only
Haven't made much progress on integration tools although this may be discussed at Dagstuhl sprint


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Flavio, Larry

TC-208 Generic Agency filing - linked to TC-210 as part of cross-version comparability
TC-162, TC-167 follow-up - added xml:space to list of XML attributes - updated guidelines page
TC-163 DocumentInformation - discussion


ATTENDEES: Wendy, Jon, Oliver, Larry, Dan S., Jeremy

TC-167 from https://github.com/ddialliance/DDI-TC-167-Discussion/issues/2
(2) Classes of default where there is no inheritance mean we'd end up with a mixed set of rules. The simplest approach is to not use defaults.
Should not have 2 places it can be defined - it goes where what you are defining is defined
Example of decimal representation within a data file (column) - mixing data sets would still have to deal with this but it is clear what column of data is being defined as what
(3) Enforce the rule of non-usage of defaults, and the problem of tool support for defaults becomes moot.

TC-203, TC-189
We have version to version change sheets
Don't have anything to 4.0
Is this approach sustainable, and how do we make it so? It is useful to developers.

Jon will take a crack at content of TC-203 and TC-189
What info is needed prior to the TC meeting, what in the meeting, and what post-meeting (talk to Barry)

What kinds of DDI expressions do we actually have (document, temporary collection of items, passing an object, etc.)
We need to flesh this out and provide some good common use cases to help sort out what is needed in various cases and then look at solutions

ZOOM notes:
Will continue for up to 24 hours
Repeating meetings work the same as GTM in terms of using the link at other times
Screen sharing: person needs to request screen with OK by organizer (rather than organizer assigning and then accepting)
Local recording of meeting is an option

Contact Barry about attending TC meeting for joining discussion of C-L-4 translation and usage; Barry is involved in this with survey work and marketing
Goal would be to have Barry involved in TC in this area over the next few years
We need to request this inclusion through Exec Co.


ATTENDEES: Wendy, Jon, Larry, Dan S., Oliver, Flavio

RESOLUTION: create XML schema reflecting Classification - Reusable Concept.jpg. Note the following:
Separate exclusion for concept and classification item to support different targets.
Notes in diagram refer to XKOS note relationship not DDI Lifecycle notes (document)
Dan S. will draft up the XML
Change where the level number is
Change some of the property names to reflect reference naming conventions

DDILIFE-3657 I have added recommendations for each case identified. PLEASE look at item (3) as we need to determine whether ConditionalResult needs to support multiple languages.

Additional topics:
TC October meeting in Minneapolis
--go out next week
Creation of a simpler means of filing issues
--web form on DDI Alliance (name, contact, version of DDI, description of issue) - respond to filer with link to JIRA issue
--concern that it disconnects people from issue tracker - check to see if you can track if not a registered user
Encourage people to use JIRA directly but then hook those submitting through form back into JIRA to follow the issue
Verify what link is needed to get an issue
add project number to link to specify - can do &pid=
https://ddi-alliance.atlassian.net/CreateIssue.jspa?pid=10000  DDILIFECYCLE
link to Lifecycle source page has list of all - use the TC as the generic

https://ddi-alliance.atlassian.net/CreateIssue.jspa?pid=11700  Technical Committee TC

Remaining DDI4 Prototype issues (broad discussion issues) - schedule for next week get additional information on DDI4 position and COGS approach for discussion


ATTENDEES: Wendy, Dan S., Flavio, Larry, Oliver

DDILIFE-3657 Review 4 items and comment

DDILIFE-3651 (see issue for more discussion and decisions)
3 versions:
not reusable
ClassificationItem reusable
All reusable

Discussion document describes all use cases - NAICS, Geography

How much reusability is optimal
ClassificationItem has an ItemCode as a property



SideNotes from discussion:
Concepts - have includes and excludes on Concept
Currently this is part of a definition but should be able to be parsed into separate content

using the XKOS vocabulary is helpful in terms of mapping


ATTENDEES: Wendy, Jon, Larry, Flavio, Oliver, Jeremy

1) RDF vocabularies
XKOS published
Disco waiting for Achim to release
2) update on Statistical Classifications (status) see DDILIFE-3651
3) GitHub emails - OK to send (fix typos on wiki's and get issues transferred first)

The Technical Committee (TC) needs to tap into your knowledge and experience working with DDI.

A number of issues came up during the DDI4 Prototype review that we considered as being broader than this one product. Some are technical, some philosophical, some practical. The TC noted that we needed broader input into the discussion than was represented by our membership. So we are bringing the discussion to you.

We will be trying a number of approaches, but to start out we have put two issues on GitHub to let you comment, add new issues, and basically, join the discussion wherever and whenever you wish. Each issue will have a description including the goals, background, and some of the related issues noted in TC. Each discussion will be linked back to an issue in our standard issue tracker to ensure they are brought back into the official tracking system.

The first two issues will be:
Use of "default values" in DDI (cross-version) https://github.com/ddialliance/DDI-TC-167-Discussion
XML style in DDI (cross-version) https://github.com/ddialliance/DDI-TC-162-Discussion

Join the discussion!

The Technical Committee


4) general issue filing
Send it to the people who asked for the information and ask for feedback on instructions and process
We may need to create a form for the Alliance site which would require transfer but would be doable

5) TC face-to-face (planning/agenda)
Issue TC-210 add any additional thoughts as comments
Send out email to TC members regarding meeting


ATTENDEES: Wendy, Jon, Larry

review of expanded documentation for methodologies
add diagram on methodology relationships
Edit current schema documentation FROM:

Metadata regarding the methodologies used concerning data collection, determining the timing and repetition patterns for data collection, and sampling procedures.

Replace with:
Methodology covers approaches used for selecting samples, administering surveys or data collection approaches, timing repeated data collection activities, weighting, and quality control.

In DataCollectionMethodology you have..
"A basic structure for describing the methodology used for collecting data."

Replace documentation in ref=DataCollectionMethodology, element DataCollectionMethodology, and complexType DataCollectionMethodologyType with:
“Methodologies pertaining to the overall data collection such as primary or secondary data collection, qualitative or quantitative methods, mixed method approaches, GPS capturing methods, methods for collecting data from hard to reach communities, etc. Repeat this element if multiple methodologies are used.”

In DataCollection "This section covers the methodologies, events, data sources, collection instruments and processes which comprise the collection/capture and processing of data. Methodology covers approaches used for selecting samples, administering surveys, timing repeated data collection activities."

Replace last line with:
Methodology covers approaches used for selecting samples, administering surveys or data collection approaches, timing repeated data collection activities, weighting, and quality control.

New issue DDILIFE-3654 - Resolved
TC-162, TC-167 setting up github for issues and wiki
Add information on how to sign in to comment or add issue. Make point of where and how to comment on issues on home page


ATTENDEES: Wendy, Flavio, Dan S., Jeremy, Jon, Larry

Decoupling Hierarchies from Items and Levels in DDI 3.3 Statistical Classifications
Some of the problem was input to COGS and output to XMI - file an issue on that
Will continue to try to get information on any changes to GSIM 1.2 and how reuse of CategoryItem may be handled
If this takes longer than one-month we can go with what we have.

DataCollectionMethodologyType seems redundant and disconnected
This is a check off item for documentation

Exposing DDI in more generic search structures like DCAT assumes packaging structures to get at Dublin Core like information
Still a future item. Specified some specific steps we will take.

ResultDetailType's TypeOfResult should be a ref=, not name=. Need a validation test for this general problem
Validation test for 3.3 would be good but is not a game stopper.
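A sketch of the general validation test suggested above: flag locally declared elements (name=) whose names match a global element declaration and should therefore be a ref=. The schema content is a toy example, and a real test would walk the full 3.3 schema set rather than one string.

```python
import xml.etree.ElementTree as ET

XS = "{http://www.w3.org/2001/XMLSchema}"

def find_shadowed_locals(xsd_text):
    """Return names of local xs:element declarations that shadow a global one."""
    root = ET.fromstring(xsd_text)
    top_level = root.findall(f"{XS}element")          # global declarations
    global_names = {e.get("name") for e in top_level}
    flagged = []
    for elem in root.iter(f"{XS}element"):
        if elem not in top_level and elem.get("name") in global_names:
            flagged.append(elem.get("name"))
    return flagged

sample = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="TypeOfResult" type="xs:string"/>
  <xs:complexType name="ResultDetailType">
    <xs:sequence>
      <xs:element name="TypeOfResult" type="xs:string"/>
    </xs:sequence>
  </xs:complexType>
</xs:schema>"""
```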

CLONE - extension of codeBook/stdyDscr/method/dataColl to provide specific code blocks to specify survey settings for data analysis
Resolved: addition of CommandCode (0..n) to UsageGuide is sufficient for now


ATTENDEES: Wendy, Flavio, Dan S., Jeremy, Oliver, Larry

1) Scientific Board Mtg report
2) Prep work for Codebook review
3) DDI4 review issues
4) TC face-to-face
5) COGS move prep

--Achim has informed me that Disco is ready for a public review. He recommended the minimum review followed by a vote as soon as issues are resolved.
--XKOS received only 12 votes all positive. This does not meet the minimum of 2/3rds of the members. There are no rules to cover this situation. Jared will inform the Exec Board that he will approve XKOS for publication pending the approval of the Exec Board.

--Scientific Board is proposing a group to review and propose a structure that will improve member input to decisions regarding the direction of scientific activities
--A small group is proposed to look at the procedures for review and voting to address new products and issues of non-response
----It was noted that the information needed by voters is not PR blurb but a summary of the review process etc. so that they can determine if due diligence took place. I asked that this happen sooner rather than later as we have a minimum of 3 votes coming up in the next year.
--Offered to track process of getting broader input on DDI4 review issues as a potential means of increasing input by members to issues for the Scientific Board. This would mean tracking approach, response, and success of using this approach to increase input. This would be reported back to the group looking at the organization of the Scientific Board.

--Talked to staff from ANU working on Codebook and DataVerse activities. There will be new issues filed from this work.
--Talked to WorldBank staff regarding new tool. There may be one or two issues filed from this. One definite item needed is clear best practices for creating extensions and using current contents to capture locally focused information (similar to Note, User Defined Key Value pairs, etc.)
--Several people noted that how and where to file issues is not clear on the web site or to the users. We need to make sure this is clear. Easiest would be for them to file in TC and we could move as needed.

--Platforms for this activity...suggestions?
--GitHub and GitLab for open source wiki and issue tracking
----work in MarkDown plaintext for revision control - can edit directly on the web which is good so we don't have to deal with push etc.
----DDI Alliance already has an organization there
--XML would be a good place to start as the DDI4 group is already making internal recommendations

--We will know about funding for a face-to-face after the mid-June meeting of the Exec Board. While originally asked to delay meeting to deal with results of survey it appears that we will not be expecting that during the first part of the year so we should plan on a MPLS meeting focusing on COGS and cross-version content issues.
--Sprint is 30 September - 4 October; Interoperability is 7-11 October

--One of last 2 weeks of October starting 21 or 28 (preference for 21st to avoid Halloween)
--Agenda: COGS, finalizing web based agency resolution, cross-version alignment (when to use one and not the other, difference points, frame it out - get some of this done in summer), profiles

----Is there going to be some type of viewer for state of the model out of COGS
----Every time a change is made in the repository or in your own branch an update will be run generating documentation

----clarify a wish list of what it does and what we'd like to be able to do
--Steps to take
----Prior to meeting
----During/post meeting
--Verification of input and output
Create a page on TC confluence to start gathering this information

How to define output formats for DateTime and Numeric Representations - discussed and Dan will file an issue relating to documenting this in 3.3

--3.3 final items
--profile of remaining open items on Lifecycle, Codebook, TC


ATTENDEES: Wendy, Jon, Larry, Oliver, Dan S., Jeremy

DDILIFE-3651 tabled until we get feedback from statistical agencies through Flavio

Went through issues in Prototype Review that require broader input and discussion
--The approach will be raised in the Scientific Board meeting
--Issues reviewed and clarified were: TC-203, TC-189, TC-167, TC-162


ATTENDEES: Wendy, Dan S., Flavio, Larry, Oliver


We either do this by changing the model or by documenting how to work with this model and create new classification items and add mapping to capture the change.

What is the right approach going forward? GSIM is not going to address the management, just remove duplication of attributes (level number; things that are both relationships and attributes will just be relationships)

We should reflect the work of GSIM. GSIM is focusing on the presentation of the Statistical Classification in terms of the rigidity of the hierarchies.

Are we concerned with all the content within ClassificationItem that is sharable across the hierarchies?
If all content changes are the same then it is reusable within that context (concept-related content, itemCode, etc.). If they change in terms of how they are grouped, this should be handled by the StatisticalClassification. What makes sense for the other properties (what is dependent on the classification, what is consistent for multiple uses of the item)?

If it were something other than ClassificationItem that denotes the combination of a code and concept, then the order and hierarchy would be defined by their context. How are the different properties broken up into those that are shared and those that are specific to the classification?

--Definitely presentation, but clean up and then describe how to manage reuse and classification correspondence (standard variance and versions), which could also be used to capture other major reorganizations
----how to create lineage, serious documentation explaining the management part

--Flavio will list the properties/attributes that were found by GSIM to be duplicative so we can correct these now
--Define which are reusable and which are context (classification) dependent and determine where they should be represented (Wendy will make a first pass)
--Statistical classification derived from ....how does this work without an item hierarchy...using same items but different levels (like SIC to NAICS which also brings in the Mexican and Canadian classifications)--ask someone who knows (Flavio will track this down)(StatsCan folks, Franck, Klas)


ATTENDEES: Wendy, Jon, Dan S., Oliver, Larry

Web-based agency id resolution
--typo correction; ok'd send out following meeting
DDILIFE-3644 Status and 3.3 finalization
--3644 - tied to the use for an analysis
--3651 - Dan will get updates done to statistical classification so we have updated content to work from and add a "reference" name rule check to validation tool
Preparation for Members/Scientific Board Meeting
--Disposition of Prototype review issues
--TC-189, TC-167, TC-162, TC-163 (outlining for presentation to the Scientific Board)
---- No overall official mapping between versions, interoperability, migration from 2 to 3 and beyond
---- Migration should be a major issue for TC both retrospective and how to maintain this going forward (capture as we do the work)
---- Problem of use/abuse of elements over time and the impact on translation
---- How to mitigate the problem in future - description, example, validation tools
---- Possible use of an underlying conceptual model
---- Explain DDI-MF needs to go to the Advisory or DMT to look at intended audience and use target. This should reflect current thinking and be clear in the context of DDI overall
---- Documentation targeted to appropriate user groups, implementers, content people, others?
---- How to interact with GIS and other systems/standards we do a handshake with
---- Unofficial extension to official library and content. Guidance on how to extend, how to get new content into DDI, and how to help interoperability. Can we provide places to manage extensions in a structured way...rules, plug-in points, what's supported in different bindings (XML constraint language, RDF full universe), model's custom metadata package, new types as well as extension types (Capture types, pipeline and autogeneration stuff for example). For example, in Lifecycle having an expandable custom type, or in version 4 an annotated identifiable of custom type to ensure correct naming and identification structure.

DDILIFE-3651 Statistical Classifications review
TC-167 Default value approach for DDI overall


ATTENDEES: Wendy, Jeremy, Dan S., Oliver, Flavio

review of TC-156, TC-182, TC-193 for closure;
update on XMI to COGS work;
--bring transformations to canonical XMI
--movement of content within XMI clarified, transformations need to change in terms of source location (attribute address to child node addresses, for example)
--includes reworking XMI to COGS which produces XMI back
--check on EA XMI to canonical XMI to see if we can get canonical XMI from COGS
--Process flow from canonical XMI for 4 is also being done
--set up will rely on Bitbucket running pipelines
DDI 3.3 remaining work;
--Statistical Classification - GSIM 1.1
----New version of GSIM coming out, some changes are resolving problems in 1.1
----Issues of reuse of items in multiple hierarchies (no separate relation objects)
----What we have may not be enough for Classification Management
----Send Flavio list of issues related to Statistical Classification
----Get Guillaume and Franck on a call on this in May
----What do we want the 3.3 model to do
----Determine what we can get in 3.3 in terms of what we want it to do
----Primary goal was to publish them in a machine readable form - look at levels and move them from item to classification
----Need to reference an item as it is the right scope/context - need to reference point of use as well as the item
--Weighting - check Stas and Ben
--schema updates
--documentation- and best practices paper
--prep for vote
DDI 4 review remaining work
--5 broad issues
TC-204, TC-205
--Add an issue on use of common properties such as "typeOfXxx" across versions
--Pull these together as a broader discussion topic in TC in terms of cross-version comparability and traversing across versions

Get Statistical Classification sorted out in May for 3.3
Contact Stas and Ben regarding weighting


ATTENDEES: Wendy, Larry, Dan S., Jeremy, Oliver, Flavio

1) DDILIFE-3616
2) Post_Prototype disposition (14 items) - DONE
3) framing discussions for TC prototype review issues
-- One approach is to outline issues, identify communities/individuals, gather input, organize focused discussions
4) Send out an invitation to discuss web-based resolution options and get use cases. Dan S. will write up introductory information.


ATTENDEES: Wendy, Dan S., Jeremy, Oliver, Flavio (Larry provided comments via email)

TC completed triage of Prototype_Review issues and moved 19 issues to DMT for resolution. Comments were added as needed. Of the other issues not previously moved, 3 issues required no changes as the content already existed, 3 issues were direct fixes (typos, minor addition, etc.), and 5 issues require broader discussion and response. TC will organize these issues and set up a process for obtaining community input and discussion. 
Note that there are still 14 outstanding issues identified by the TC during the technical review process which need attention within the context of the Prototype Review.


ATTENDEES: Wendy, Dan S., Jeremy, Oliver, Larry, Flavio

Prototype Review issues:
--Triage goals
--those that can be handled by TC or that can be made declarative
--Approaches for disposal
--Identify where the fix is clear, state correction to implement
--Identify issues that are DDI4 specific modeling issues - comment where needed, return to DMT
--Identify broader cross-DDI issues - frame issue, organize discussion among stakeholders to gain input and inform decision. Implementation in DDI4 would then be coordinated with MRT group (who would have been part of discussion)
--TC will track action on Prototype_Review issues and report back to SB

Start working through issues
Triage work done on:
TC-156, TC-161 through TC-169


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Oliver, Larry

TC Roadmap - finalize and prepare for passing on to Scientific Board and Exec
--web based resolution should be added - Dan S. will provide text
2019/2020 budget requests
--this should include a discussion of where we are with COGS, what needs to be done in terms of DDILIFE, DDI Codebook, DDI4
--a discussion about things in the road map - a face-to-face on Codebook 2.6 and other roadmap issues / alignment with GSIM / RDF serialization that can mix and match
--draft something about above
--would it be worth suggesting where it would be - who needs to be there and determine where is the most economically feasible
--Current work is all concluding around July/August so that might be good [August bad, Dagstuhl 9/30-10/5, October looks good so far; hold 3rd week 14-18 or 21-25?]
--If 2.5 might want to get Mehmood, good to have Olof and Johan

--Where we are with DDILIFE entry for updates and documentation - entry this weekend
--XKOS update - ready for vote
--Disco - sometime in spring - will it need a public review
--Responses regarding CV usage - SKOS, JSON both mentioned, as well as a language filter for HTML

DDILIFE-3644 read the example and be prepared to discuss

Add command code but then make a future issue to explore specifications for different types of weighting to help generalize to type. Document use of UserAttributes to capture some of this information.
Look at this as we review the related issue in for 2.5 - we want to be sure these align
One concern is that if we focus on one type of weighting, the specification may become too specific in one direction only
What is stated for Stata could be machine processed to other languages
Use example of translation of how to create Key/Value pairs
Resurrect suggested CV from Weighting group and Sampling Group and pass on to CV to create and support new content


ATTENDEES: Wendy, Jon, Oliver, Larry, Flavio, Dan S.

DDI4 Prototype Triage pass
--During full Triage we need to identify which issues have broad implications and make sure we have a list of "types" for triage that allow for more sorting of issues to be addressed. Do this before end of March.
--Move 4 identified issues to MRT
--Assign TC-189 to Jon to start forming a protocol for looking at these broader issues

TC Roadmap document:
Basically the roadmap reiterates that DDI-C and DDI-L should not be forgotten. We have new users of DDI-C as well as existing systems that are invested in DDI-C and DDI-L. This is an updated version of the 2017 document. We feel it would be helpful for the Scientific Board and Executive Committee to see where these products are, how they fit into the mix, and the type of continued support required. This also seems to be a means of re-engaging membership who are concerned with current systems.

There were some questions raised about the second-to-last paragraph in the Lifecycle section in terms of short- and long-term goals of MRT in regard to coverage of DDI4. Larry will provide some clarifying language.

We aren't addressing all things in DDI4. There is an issue of being able to move features from DDI4 development into the 3.x series to make them available to users. This would be done in response to requests from the community.

One specific area is Classification Management. Flavio will look at what has been added to 3.3 (GSIM model) to see how it lines up with DDI4 and XKOS. The goal of this work now will be to prepare a best practices guide re: XKOS and Classification Management section.

In general we need to have mapping information built into the modeling. We need to be aware of interactions and crosswalks in mapping. Overall we need a continuing focus not only on how a version of DDI interacts with other standards but on how it interacts with other versions/products of DDI.

DDILIFE-3630 decision on supplement
DDILIFE-3648 decision
DDILIFE-3615 review new document for discussion
DDILIFE-3616 Dan S.? [no new info]
DDILIFE-3644 example and Universe question
--look at example for next week
--The proposal seemed a bit too specific to one type of sampling (might be good to have Dan G. look at it also)
--There are parameters of the sampling that need to be identified beyond just the weight
--should weights point to variables? (currently variables point to weights, and the weight to its generation and usage)

NEXT WEEK: DDILIFE-3644 (DDI perspective)


ATTENDING: Wendy, Jon, Dan S., Jeremy, Larry, Oliver, Taina, Sanda

NOTE: post call email discussion added at end to clarify these notes

- Does the DDI Alliance want to publish the translations of the CVs as well as the source CVs? That would require changes in the website. If not, I think there should be a reference to the CESSDA Vocabulary Service CV which has the translations. And even if yes, a reference anyway?
There is only one reference to the CV. The SKOS is already merged. The tools had separate versioning for languages. How to deal with versioning. The point is what we want to publish for the alliance. We want to publish multi-language where available. We are publishing different formats transformed from SKOS. We can include a reference to CESSDA Vocabulary Service CV in the documentation.

- On the DDI Alliance website, the different location URIs of different vocabulary formats take too much space. Can this be amended, for example, by presenting a button for each format?
Which page are you referring to? html - language or format

- URN: at present the ‘Canonical URI of this version’ element (=urn of the version) does not have a language tag. The IT people for the CESSDA tool say they would like also the source vocabulary have a language tag. This would mean that the urn of the English CV version would be:
Is this OK?
Good news is that the developer said today that it is no problem for the system not to have language tags in the URNs.

But there are a couple of other perspectives:
· An identifier should not carry any semantics on which a system relies. The fields agency and version are exceptions; they have a very specific definition (according to ISO 11179).
· Common practice to deal with multiple languages of a resource (HTTP headers, HTTP content negotiation, multiple languages in one RDF resource)
· Agreed way in the DDI Alliance context regarding CVs with multiple languages

Display issue in terms of viewing the full list (in single languages) or a term (in multiple languages)
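The "common practice" bullet above (HTTP negotiation over a single multi-language resource, rather than putting the language in the URN) can be sketched as follows. This is purely illustrative: the labels, function names, and fallback rule are assumptions, not part of any DDI or CESSDA tooling.

```python
# Illustrative sketch: one CV term, one identifier, labels in several
# languages; the displayed label is chosen by Accept-Language negotiation
# instead of being baked into the identifier.

def parse_accept_language(header):
    """Parse an Accept-Language header into (tag, q) pairs, best first."""
    prefs = []
    for part in header.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            tag, q = piece.split(";q=", 1)
            prefs.append((tag.strip().lower(), float(q)))
        else:
            prefs.append((piece.lower(), 1.0))
    prefs.sort(key=lambda p: p[1], reverse=True)
    return prefs

def negotiate_label(labels, accept_header, default="en"):
    """Return the label whose language best matches the client's preferences."""
    for tag, _q in parse_accept_language(accept_header):
        if tag in labels:
            return labels[tag]
    return labels[default]

# Hypothetical label set for one term of one CV.
labels = {"en": "Aggregation method", "de": "Aggregationsmethode"}
print(negotiate_label(labels, "de;q=0.9, fi;q=0.8"))  # → Aggregationsmethode
print(negotiate_label(labels, "fr"))                  # → Aggregation method
```

The same resource then carries every translation, and versioning applies to the package as a whole, which is the trade-off discussed later in these notes.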

- The tool produces a citation which does not exist in CVs now published on the DDI Alliance website. What to do with this in data transfer?

The purpose of the html is viewing and browsing, so we have some flexibility in how to approach the display. We are publishing the standard, not a tool to explore the standard, so we need to make it clear to support users, but not necessarily create the ultimate display. The primary aim is to publish the CVs in usable formats (SKOS, Genericode, CodeList, JSON; this should be decided by the user community; is there a codelist layout for Codebook as well as Lifecycle?). Need to come up with some box-and-arrow drawings of how we'd like this displayed.

- Dead link on URI
Which page?
if you mean
http://www.ddialliance.org/Specification/DDI-CV/AggreagationMethod_1.0_Genericode1.0_DDI-CVProfile1.0.xml in AggregationMethod note that there is a typo in the name of the CV. If that is corrected the link is good.

- Usage information may be presented differently, can the publication pipeline just publish whatever is there in the CVs in the CESSDA tool? Module/attribute/element names will be there with a link to the specification, as well as element numbers for DDI2 but no definitions of the element as now in DDI website.
It's useful but it would not be synched with the latest documentation. It should pull the documentation from the DDI publication if possible. This could be done in transforming SKOS to html with full language and content and then select box for display (language or term by term)

- The tool will automatically assign the publication year as the copyright year. For all CVs published from the CESSDA tool, it will be the current year, be the CV old or new.


- Urn resolver planned for DDI?
The current situation: the DDI agency administration tool is ready to resolve to the agency-supported internal resolver. We need to check with ICPSR but don't think there should be a problem there. This still needs to be filed in order to achieve the first step, resolving urn:ddi to the agency administration tool. The current workaround is to recommend use of both the URN AND the URL when using these CVs

The point is that the CVs have a citation, and the citation has the URN, not the URL
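The two-step resolution described above (urn:ddi resolves first to the agency administration tool, which delegates to each agency's own resolver) can be sketched roughly as below. The registry entries, URL patterns, and the exact urn:ddi:&lt;agency&gt;:&lt;id&gt;:&lt;version&gt; shape are assumptions for illustration only.

```python
# Minimal sketch of urn:ddi resolution (hypothetical registry and URLs).

AGENCY_RESOLVERS = {
    # agency id -> base URL of that agency's internal resolver (made-up values)
    "int.ddi.cv": "https://example.org/ddi-cv/resolve",
    "us.example": "https://example.org/us-example/resolve",
}

def resolve_ddi_urn(urn):
    """Map urn:ddi:<agency>:<id>:<version> to the agency resolver's URL."""
    parts = urn.split(":")
    if len(parts) != 5 or parts[0] != "urn" or parts[1] != "ddi":
        raise ValueError("not a urn:ddi identifier: " + urn)
    _, _, agency, obj_id, version = parts
    base = AGENCY_RESOLVERS.get(agency)
    if base is None:
        raise LookupError("agency not registered: " + agency)
    return f"{base}?id={obj_id}&version={version}"

print(resolve_ddi_urn("urn:ddi:int.ddi.cv:AggregationMethod:1.0"))
```

A citation could then carry only the URN, with the URL recoverable through the registry, which is why the workaround of publishing both is only interim.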

TC needs to determine what people are using through the broader user community (ddi-users, ddi-members, ddi-scientific board, ddi-srg) - send out soon

When to publish changes (particularly language additions)
Production process - review, creation of bindings, input to DDI documentation (how to capture intermittent updates to documentation), approval, announcement
Prepared before summer meeting, get it on the agenda

License of translations - will be the same as the source (CC BY 4.0)

Functionality of the CV site on DDI Alliance in a more formal way

In practical terms what triggers the output from the CESSDA system regarding a change in a CV. Harvest system to check for changes and push out a SKOS to the DDI repository for entry to the DDI production pipeline. DDI internal production system, to web. Clarify transfer points and actions and who does what. (Who being an agent - machine, person, etc.) Change logs for individual CVs. Clarify requirements for display, usage, versioning differences, and clarify what needs to change and priorities for those changes.


Q: If I understood correctly, DDI Alliance will want to version the whole package with all language versions? If there is change in any language versions, the whole package version changes. I can see this is useful if, for example, the CV is used as a multilingual filter in a search interface.
RESPONSE: If I understood Oliver correctly we could get a merged language output in SKOS from the CESSDA tool. If not it is simply an additional step in the production pipeline to create a merged version from the individual language versions prior to transformation to other formats and publication. Oliver should have this in his workflow.

Q: The CESSDA tool versions each language separately, though the translations have a three-digit version number linked to the source as the first two digits are the source version number (i.e. 2.0.2 means the second TL version of the source version 2.0). This is useful for users who are using the CV in their metadata. They are only interested in when and what has been changed in their language version to see what needs to be changed in their legacy metadata and in their systems where the CV is implemented.
Q: The issue maybe to discuss is how to produce the whole package versioning, in the merged SKOS of the CESSDA tool, or in the DDI publication pipeline? But this is technical, so out of my expertise.
RESPONSE: Basically we will publish a SKOS instance that contains all languages and therefore have a single version number. How the version number is defined can be up to us. So yes we could say that the major version doesn't change unless the terms change or there is a significant correction in the intellectual content of a definition. The reference could be to the major version and the user would get the most recent sub-version of that. Some people do sub-versions simply using dates and internal change log with a date, change, and reason for change. For example: 2019-03-15, Term X definition (en) Aggregaetion changed to Aggregation, Corrected misspelling OR 2019-03-15 Added labels and definitions in German (de), added translation option. In short, it is up to us to determine the versioning rules and make them clear to users. In making a decision on this the primary issue should be user needs. What changes actually have an impact on the user and the value of stability in the version numbers.
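The versioning rule sketched in this response (bump the version only for intellectual-content changes; record translations and corrections in a dated change log) could be expressed as follows. The class, field names, and the minor-bump rule are hypothetical, not the Alliance's actual rules.

```python
# Illustrative versioning for a merged multi-language CV package: one version
# number for the whole package, with a change log carrying date, version at
# time of change, and description.

from dataclasses import dataclass, field

@dataclass
class CvPackage:
    major: int = 1
    minor: int = 0
    changelog: list = field(default_factory=list)

    @property
    def version(self):
        return f"{self.major}.{self.minor}"

    def record_change(self, date, description, intellectual=False):
        if intellectual:
            # terms changed or a definition was substantively corrected
            self.minor += 1
        # translations, label additions, typo fixes: log only, same version
        self.changelog.append((date, self.version, description))

cv = CvPackage(major=2, minor=0)
cv.record_change("2019-03-15", "Added labels and definitions in German (de)")
cv.record_change("2019-04-02", "Corrected definition of Term X (en)", intellectual=True)
print(cv.version)         # 2.1
print(len(cv.changelog))  # 2
```

Under this sketch a reference to major version 2 would always retrieve the most recent sub-version, matching the stability goal described in the response.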

Q: The link to the Unesco vocabularies was just an example of the one typical html display of multilingual vocabularies. Based on open source publication platform skosmos. It displays the whole language version on the left and term-specific display on the right. In term-specific display all language versions of a specific term are displayed. What is lacking, I think, is the possibility to click on any language term on the right and change language that way.
RESPONSE: will add this to minutes to clarify

Q: Regarding the citation, the citation in the CESSDA tool has both URN and URL. Did we conclude that in CESSDA tool the citation URL is to that tool, and in DDI Alliance website the URL in citation is to the DDI Alliance webpage? That would need to be taken into account in the DDI publication pipeline.
RESPONSE: Yes, the DDI publication will provide DDI URN and URL (probably to the SKOS version as the source)...that can be decided. It should line up with the target of the resolved URN. Actually, since we have multiple formats we will need to consider just WHAT the URN is landing on. This may result in reorganization of the web content.

RESPONSE: The primary concern of the call with TC was to make sure we had identified all the technical things that needed to be decided and completed, in particular to identify interaction points with the CESSDA tool, the production pipeline, CV/TC, and ICPSR/Web page. I will add a line to the minutes that the UNESCO link was simply an example


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Larry, Guillaume

DDILIFE remaining items (x'd items have been dealt with; see issues for comments):
3616 review with 3615
3619 waiting for validator
3647 x
3645 x
3646 x
3625 x
3641 x

Put on later agenda:
documentation production


ATTENDEES: Wendy, Jon, Dan S., Larry, Guillaume
GUESTS: Stas Kolenikov, Ben Hilland (DDILIFE-3644)

Research Organization working on Government Contracts (Federal Agencies)
STATA Code developer
Publication of survey design features in metadata (ICPSR data)
Processors need information on how to process data for accurate use of weights, estimations, etc. based on survey design, sampling design, and variety of weights. Improve content to support use of complex weighting content.

Do you think there is a level of adoption in the community you're working with? Hard to say.
People have become more aware of the need to reflect the impact of sampling approaches on weighting and the use of weights, and therefore the credibility of responses.

Alternatives: list a variable that states strata, cluster, etc. The variable can know about itself by defining its role. Do you expect DDI to be around 15 years from now? Yes...it's been in use for 20+ years already.
Parse an R or Stata statement if available, plus self-identification of variable roles
Are we going to go proprietary around current issues?
Weights, cluster, strata, and calibration variables accounting for point and variance estimates. For some time R could do calibrations properly; now Stata does (in the past year).
Also use of replicate weights.
If you could have command code, that would provide a flexible specification that would address this.
Ben: Would it help if we tried writing an XML structure to describe this?
We should probably start off being ambitious in describing these sorts of issues.

Why isn't this already addressed in 3.3? Role of a variable can be stated, and preferred weights? Should there be a means of fully structuring how to get standard errors, replicate weights, common weights. A clear link between variable and specific survey design and weights.

Use the more complex example to see what could be captured currently or with the additional CommandCode content.
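The "command code plus key/value pairs" idea discussed here could look roughly like this: take a Stata-style svyset declaration and extract the design roles into generic key/value pairs (e.g. for UserAttributes). The parsing below handles only this one illustrative form, not real Stata syntax in full, and the variable names are made up.

```python
# Sketch: turn a Stata-style survey design declaration into key/value pairs.

import re

def svyset_to_pairs(command):
    """Extract design roles from e.g. 'svyset psu [pweight=wt], strata(st)'."""
    pairs = {}
    m = re.match(r"svyset\s+(\w+)", command)
    if m:
        pairs["cluster"] = m.group(1)          # primary sampling unit
    m = re.search(r"\[pweight=(\w+)\]", command)
    if m:
        pairs["weight"] = m.group(1)           # probability weight variable
    m = re.search(r"strata\((\w+)\)", command)
    if m:
        pairs["strata"] = m.group(1)           # stratification variable
    return pairs

print(svyset_to_pairs("svyset psu_id [pweight=wt_final], strata(stratum)"))
```

The point of the sketch is the direction of travel: one authoritative command string, machine-processable into role assignments, rather than hand-maintained parallel metadata.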

Roper poll data (replicate weights)

The specification of the cluster, strata, and weight variables as a set has a purpose, and there are different variables used for different purposes. At the same time, a variable has an innate grouping with other variables in terms of its case (person, household, etc.).

There may be a benefit in the following:

1-Stas and Ben try writing an XML structure to describe this
2-Add weighting information and variable content to issue so they don't have to wade through
3-ANES example and CPS example - see where they interact with the sampling process
4-Follow up with Stas and Ben in 2 weeks to see where we are


ATTENDEES: Wendy, Jon, Dan S., Larry, Guillaume

DDILIFE-3640 Sampling - ok'd for finalization
any remaining 3.3 issues - reviewed to make sure what has not been addressed by 3640 and scheduling
2019 workplan - reviewed and updated

DDI Agency Registry update:
New version is now up and running - more information for approvers
Cross platform (runs on Linux) - Jared is reviewing it
Now users can have their own user accounts that can manage agencies, rather than one account per agency
The TC can think about means of resolving services for agencies over the web in addition to over DNS
--for discussion this year (added to workplan)
Official URN namespace registration (added to workplan)


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Oliver, Guillaume

--Put in link from DevelopmentImplementation to DevelopmentProcess - got lost in shuffle

--Makes sense that it's restricted right now as this is the bounded problem we are trying to solve. Leave at Question, Measurement, Instrument, ControlConstruct reference stack. Does anything else need this within the context of Data Capture development? Seems to cover the major development components.

--Use references to UnitType and Universe where applicable
--Complete example and send out next week following final review by TC

--Contact Stas and Ben about availability for call next week. See if Larry will be available

TC Agenda for 2019 To Do list:
--Codebook work to do (good summer project)
--Prototype dispensation of review comments (end of March)
--INSEE might get something in
--Oliver is checking on GESIS
--Publish 3.3 (end of March)
--Move to COGS (Spring/Summer)
--New system for editing controlled vocabularies - need process to move SKOS to DDI outputs - work with CV group; get a meeting with the CV group to spec this out and get a timeline set up

Wendy will draft work plan and send to Jon for additional discussion and then will bring to TC


ATTENDEES: Wendy, Larry, Oliver, Jon, Dan S., Jeremy

3626 - simplify name to ReviewProcess but "process" seems to imply the

3640 - got through Data Capture Development section. TC group will look in more detail. Need to get feedback from people doing this work to see if it is sufficient and/or overkill. Send out ppt and example after providing the descriptive (non-actionable) implementation.

Look at new issue on weighting for next week


ATTENDEES: Wendy, Jon, Larry, Dan S., Jeremy (Oliver had connection problems)


3616, 3617, 3626 and 3643
Jon will take a bit more of a look at 3626, but I think it's pretty well resolved
Dan S. is providing a short list of items in 3616 that need to be addressed now 
Wendy reviewing the final list of changes for 3617


ATTENDEES Wendy, Jon, Dan S., Jeremy, Larry

Reviewed remaining open issues in DDIL_3.3_review
Remaining Methodology issues in the Master Issue are waiting on examples to be completed - all should be resolvable at that point
DDILIFE-3617 was discussed; members should review the spreadsheet and identify where extension should be replaced by the creation of an element containing the original extension base and the added content
DDILIFE-3616 review Identifiables for consistency of practice and note if they should possibly be Versionable or a property instead of Identifiable
DDILIFE-3614 propose revision of documentation which was written before UnitType was added. Verify with Guillaume concerning alignment with GSIM (Larry will file a related issue for DDI4 in TC)

Note that resolved issues with clear changes listed in issue have been sent to Dan S. for entry
Remaining items need specifications provided and will then be sent to Dan S. for entry


ATTENDEES: Wendy, Dan S., Jeremy, Larry

DDILIFE-3615 DataCollectionMethodologyType seems redundant and disconnected
DDILIFE-3616 Why are all the content types within MethodologyType identifiable?
Revision of Sampling and Development activities to make use of control constructs - what new Acts would this require
DDILIFE-3620 NOTE: Guillaume has added a number of comments to 3620 that were not reviewed by TC prior to the following discussion

(these notes also appear on the MASTER ISSUE: methodology review)

--Having the multiple ways of setting up ordering is much cleaner using existing control constructs creating more acts
--In parameters and bindings are we constrained on what we can pass through processes that have known inputs and outputs
--Is the question that there are other forms of metadata that may be passed through and we don't know such as embargo information on development work
--If we want to do sampling stratification and passing if we need further Acts they can be added in later revisions

TO DO: Write it up and run some examples to see if there are roadblocks

--Change to Data Capture Development
--Do example of both question and measure
--Use CPS and PUMS for sampling plans

Issues to clarify in documentation:
--A Sample is the result of a Sampling activity. Need to document the plan, the execution (allows documentation of specific iteration variance, specific activities) and the result (sample).
--UnitType, Universe, Population difference and which is associated with each (Frame, plan, event, result)
--Specification of Sampling Methodologies. The sampling plan is more of a methodology than the sampling (procedure/event/action).


ATTENDEES: Wendy, Jon, Dan S., Jeremy

updated roadmap for coming year - Jon has updated version of mid-2017 document
- If "DDI4" is not a continuation in terms of coverage then it should be a separate product starting at a version 1.0
- The point is the discussion needs to take place in the broader community and it needs to cover both the definition of DDI4 and support for current usage
- One issue that will come up would be multiple RDF ontologies that will need to be namespaced so that there is not a clash between DISCO, DDI-L and a DDI4 (Research View)
- Circulate to SRG list ask for comments and a deadline (4 weeks)
- List of tasks for the 3.3 release
-- List of issue numbers that have resolutions for Dan

methodology organization and clarification (includes new content)
- Methodology is mostly textual content
- Do we need all the new types and if so should they be complex data types or versionable
- Identifiable is difficult to handle
- The number of types increased in 3.2 when IdentifiedStructuredStrings became identifiable bundles of typeOf and description - schedule for next week

Next week
DDILIFE-3615 DataCollectionMethodologyType seems redundant and disconnected
DDILIFE-3616 Why are all the content types within MethodologyType identifiable?
Revision of Sampling and Development activities to make use of control constructs - what new Acts would this require

Post spreadsheet of issues
Post spreadsheet of methodology etc.
Get Dan list of issue numbers with resolutions
Circulate the roadmap


ATTENDEES: Wendy, Larry, Dan S., Guillaume, Jeremy

DDILIFE-3630 Added comment - find and add examples in JIRA
DDILIFE-3628 discussion needed - assigned to FUTURE
following resolved:
DDILIFE-3634 Background information and recommendation added to issue
DDILIFE-3631 typo - fix it
DDILIFE-3629 typo - fix it

More open discussion of future broader issues:
--Add new issue for DescribableType usage DDILIFE-3641
--Another broad change would be how the groupings are done - another sweeping change in terms of order and specifics etc.
--Switch just to item by item definitions rather than choices (inline or by reference)
--If we are talking about broad changes in future versions these would be main ones

CV related discussion:
We need to get back to the CV group to talk about the publication process etc. The work on the management tool is progressing to a point where we need to restart these discussions. Put on work schedule for coming year.

A broader question for DDI: as we get beyond questionnaires, should the CV group's scope be broader to reflect broader coverage and interest (non-questionnaire/survey things)? How do these get into DDI CVs or become acknowledged by DDI as recommended for use?


ATTENDEES: Wendy, Jon, Dan S, Jeremy, Larry, Johan

DDILIFE-3615 DataCollection/MethodologyType seems redundant and disconnected
DDILIFE-3616 Why are all the content types within MethodologyType identifiable?
DDILIFE-3621 There are many interlinked references in the new sampling and development area
DDILIFE-3605 Questionnaire development should probably use the work flow capabilities
DDILIFE-3620 DevelopmentProcessSteps seem to be reinventing the process model
DDILIFE-3622 How does questionnaire development relate to Instrument

TODO: make a master issue, do background work to visualize what we currently have, intent etc. (see notes on 3622)

DanS: make a list of identifiables (3621, 3616) for next week

Next Week:

January meeting:


ATTENDEES: Wendy, Jon, Oliver, Achim, Johan, Jeremy, Larry, Dan S
3.3 to COGS
     -image from yesterday
3.3 production flow using COGS
    -recommendation to use the same production pipeline for all versions in COGS using a canonical XMI base for production of representations
XSD format/style
    -Pure GardenOfEden
    -Nested XML
    -Imposed order
    -Single/Multiple Schema
XMI output style to support any of the versions
    -Handling Packages in XMI and input process
Where is the best place to validate incoming CSV files
    -Possibility of a simple tool like the one Dan S. created for the schemas
    -Validation within COGS of csv

Current XSD Objects Used

GANTT Chart for COGS

DDI 3.3 Production Validation

Decision for production line of DDI-L and DDI-C does not need to happen at once
Publication of 3.3 is not dependent upon getting everything working in COGS
We now have DocFlex output from 3.3

COGS maintains elements and attributes and supports substitution groups of any object derived from the complex abstract (substitution head)
COGS also retains order information so that publishers who need to specify order will be able to do so
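The substitution-group behavior described above can be sketched in XSD like this (a minimal illustration; the element and type names here are invented for the example, not actual DDI names):

```xml
<!-- Illustrative sketch only: names are hypothetical, not DDI objects -->
<!-- Abstract head element tied to an abstract complex type -->
<xs:element name="AbstractIdentifiable" type="AbstractIdentifiableType" abstract="true"/>
<xs:complexType name="AbstractIdentifiableType" abstract="true"/>

<!-- Any element whose type derives from the abstract type may join the group -->
<xs:element name="Variable" type="VariableType" substitutionGroup="AbstractIdentifiable"/>
<xs:complexType name="VariableType">
  <xs:complexContent>
    <xs:extension base="AbstractIdentifiableType"/>
  </xs:complexContent>
</xs:complexType>

<!-- A container referencing the head then accepts any member of the group -->
<xs:element name="Container">
  <xs:complexType>
    <xs:sequence>
      <xs:element ref="AbstractIdentifiable" minOccurs="0" maxOccurs="unbounded"/>
    </xs:sequence>
  </xs:complexType>
</xs:element>
```

Since COGS also retains order information, a publisher needing an imposed order could tighten the sequence above rather than rely on the open substitution head.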

COGS has a lot of internal validation testing
Validation points for testing
-validation of CSV prior to build validating against model construction rules
Webhooks for new commits run the build script with validation testing
Success or failure is reported with a link to the build run
If needed these can be run on a forked branch but this will require a hook up...directions will be added
DDI only works on pull request to these
Could also run on own machine

Oliver has created an XSL transformation to COGS csv based on canonical XMI
COGS XMI output is currently EA which we should be able to run through Achim's transformation to canonical to see where problems may arise

There are currently abstract classes in the CSVs which must have some substitution class. Item-to-Item and Complex-to-Complex substitution groups are available

13 MTG with most members - Set up review of new SDI content (sampling, development, weighting)
20 Probably will meet
27 Probably will not meet



ATTENDEES: Wendy, Larry, Dan G., Dan S., Oliver, Jeremy, Flavio, Jon, Johan

Primary agenda items:
Triage on Post_Prototype and Prototype_Review issues for EDDI Sprint - DONE
Key items for TC work at EDDI - send recommendations via email and we'll work that way to organize the agenda

No meeting 2018-11-22
Call will be held during sprint 2018-11-29


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Larry, Oliver, Johan, Jay, Guillaume, Olof

Post-Prototype issues and sprint
Triage of Prototype_Review
Suggestion: review and move Post-Prototype to MT - triage next week to move
Review this week and assign to MT next week

Revisit DDILIFE-3625 Dan S. will provide more specifics on issues this change would address and requirements for the change. Issue was reopened for further discussion. The issue discussed last week regarding documentation changes is being moved to a new issue and placed under discussion as some concerns were raised. DDILIFE-3633
Decision made

Decision made

DDILIFE-3626 Discussed where this would link and determined that it might be more useful to have a more generic Administrative Review that would cover IRB (Ethical), OMB question review, NIH variable review, etc. Dan S and Larry will gather more information on these additional review types. The goal of this information is to track that reviews of specific items (project, question, variable, etc.) has taken place and approval obtained. It is not a review management system.
Still collecting background information; approach of a more generic structure approved

DDILIFE-3624 Clarification of original request and some discussion of the suggestion to do all inclusion of versionable and maintainable objects by reference. This may be too large a change for 3.3 but Wendy will look into the level of change required. It may be just a best practice recommendation (use of reference) to prepare for future versions.
Specific issue of Instruction property in questions decided
Broader issue of versionable maintainable inclusion

Next week:
Triage of existing Post_Prototype and Prototype_Review

3626, 3616, 3617, 3629


ATTENDEE: Wendy, Jeremy, Dan S., Larry

Revisit DDILIFE-3625 Dan S. will provide more specifics on issues this change would address and requirements for the change. Issue was reopened for further discussion. The issue discussed last week regarding documentation changes is being moved to a new issue and placed under discussion as some concerns were raised. DDILIFE-3633

DDILIFE-3626 Discussed where this would link and determined that it might be more useful to have a more generic Administrative Review that would cover IRB (Ethical), OMB question review, NIH variable review, etc. Dan S and Larry will gather more information on these additional review types. The goal of this information is to track that reviews of specific items (project, question, variable, etc.) has taken place and approval obtained. It is not a review management system.

DDILIFE-3624 Clarification of original request and some discussion of the suggestion to do all inclusion of versionable and maintainable objects by reference. This may be too large a change for 3.3 but Wendy will look into the level of change required. It may be just a best practice recommendation (use of reference) to prepare for future versions.

Next week:
Come back to 3624, 3623, and 3625 if background work is done
3616, 3617, 3629

Availability for meetings prior to EDDI:
Not Thanksgiving - all Americans
Will have the TC call during sprint week


ATTENDEES: Wendy, Johan, Larry, Oliver, Olof



ATTENDEES: Wendy, Jon, Dan S., Jeremy, Larry, Guillaume

Started work on 3.3 review items


ATTENDEES: Wendy, Jon, Oliver, Dan S., Jeremy, Johan, Jay, Achim, Kelly

Scorecard -tc-8
TC review document tc-8
list for page tc-136

remove "read before reviewing"?


ATTENDEES: Wendy, Jon, Dan G. Flavio, Jay, Jeremy, Oliver, Kelly, Larry

TC-128 Flavio's comments on review of sampling methodology
Read to see if the documentation works to explain the methodology pattern. What wasn't clear was the difference between Algorithm and Process at different levels of abstraction. Is there a clear cut between the two? Will people use them in the same way or as intended?
Documentation could be improved to convey this. Example to explain difference was difficult for someone who is not as mathematically literate to understand.
Did Flavio get it right? Are you questioning the model or the document? Is the algorithm needed? What is its purpose - high level description of theory vs. practice, where the algorithm is the theory.
What actions do we take?
On documentation: the methodology pattern explains the algorithm but not really the process.
Questions on the structure of the model should be filed as issues and labeled post-prototype.
Jay ran into the issue Flavio talked about regarding lack of structure. Used structured text to provide steps.
Kelly will put in some text for Dan to check.

TC-8 DDI Prototype TC Review Comments and Comparison spreadsheet

TC-136 What if any guidance do we want to provide the reviewer? - See issue

Status report card https://docs.google.com/document/d/14k929H0_q9LAlqnee3Sm81XOE9e-FdloUy8CCqG0PG8/edit

Kelly: documentation issues


ATTENDEES: Wendy, Dan S., Jay, Johan, Larry, Jeremy

TC-8 Review of Prototype by TC document items 5-8

Wendy will summarize and draft documents for publication with prototype


ATTENDEES: Wendy, Dan S., Flavio, Jay, Johan, Larry, Dan G., Oliver

Review of documents from TC for Prototype Review:

Relation between Post-Prototype issues and Lessons learned section should be integrated for clarity

Another area we need to look at is what to use as a modeling environment. We need to relook at EA and modeling environments.

We've included the rules that we're using to go to bindings

Some bullets may need further clarifications.

Need to review sub-set used of UML and how they are represented in bindings.

Post-Prototype we need to have a real discussion of what UML features we use

Change ComplexDataTypes to StructuredDataTypes and double check for any other name changes

Went through items 1-4 on Review of Model by TC

NEXT WEEK: Review of Model by TC items 5-8


ATTENDEES: Wendy, Dan S., Flavio, Jay, Jeremy, Johan, Kelly, Larry, Jon, Dan G.

Validation Checks
Modeling Rules
We need to provide a framework for review - what TC has done, what is still open, what is the thinking
Goal is to have some description and where this stands
From a developer's perspective - there are issues
Need to collect TC member perspectives

Move documents to issues (attach or link to google doc)
During the next two weeks enter positions on the google docs for discussion next meeting (2018-09-06)


ATTENDEES: Wendy, Dan S, Flavio, Jay, Jeremy, Dan G, Johan, Jon, Kelly, Larry

Reviewed the list of work, primarily documentation, that needs to be completed between now and 21 September. Kelly has prepared a spreadsheet tracking what documentation is expected and how it is grouped for presentation. Members should review for completeness and provide review where they can. Notify Kelly of anything missing or any work they will be doing.

The list of existing documents is found here. Review especially documents that are older and may need updating.

TC-9 through TC-14 contain comments regarding additional documents that need to be prepared particularly specific TC review documents and best practices. If there is an open one that you can draft, add your name in the appropriate comment.

Wendy will start a set of TC comment drafts in each area to encompass current work. See TC-9 through TC-14 for links as they are added

DDI 3.3  

Documentation production is still a problem. Jon will follow up with Oliver and Achim regarding DocFlex


ATTENDEES: Wendy, Dan S., Jay, Larry, Oliver, Flavio, Jeremy, Dan G.

TC-153 Documentation content

TC-125/TC-152 results of statistics review require changes

TC-151 Need addition to ConceptualComponent to do example

TC-150 InstanceVariable measure (target Population) is required MAKE OPTIONAL
It has not been used in some of the examples. We need to either change measures (target Population) from 1..1 to 0..1 or go back to the example authors for additional content information
Is this the difference between RepresentedVariable and InstanceVariable? Both specialize Concept. Do you always know the Population when you measure? You might know it in a less structured or formal way. You require that formalization to have the Population object. So overwriting the target violates the rule of constraining to a sub-type, and Population is not a sub-type of Universe.
Is this predicated on the fact that the relation is called Measures in each case? Call it something else and InstanceVariable ends up with both Universe and Population. InstanceVariable can't be reused from data set to data set (for example in repeated samples). Make a note that we need to think about how to rename or create an extension of Population from Universe. Recommendation for a change in RepresentedVariable from Measures to DrawsFrom (Universe). Remove narrowsUniverse from Population. DID NOT REMOVE from Population as the inheritance is structural not content

TC-143 Change in Process Pattern and related realizations completed. Specifically this has resulted in some changes to the relationship "executes" in Workflows as this is now inherited from WorkflowControlStep (0..n) and is constrained to 0..1 in ConditionalControlStep. Note that ElseIfAction also now uses executes as the relation name, conforming to consistency between name and target pairs where reasonable.

TC-112 This has been filled in, any changes?

TC-91 Examples have been updated and required content added with the following exceptions:
measure in InstanceVariable until TC-150 is resolved
properties that have default values in the UML which have not yet been programmed for a standard translation into the bindings
Question about incomplete content (see exampleIssues.txt on issue) - are these intended to be examples of best practice or is the information simply not yet known?

Work with Jay to add content

TC-136 Issues to document in Prototype TC evaluation:
XML limitations: transformation of primitives xs:decimal, xs:float, xs:double, xs:date, xs:datetime; transformation of default values and regular expressions; need for secondary validation to enforce cardinality constraints
Relationship of Statistical Classification and XKOS
DDI-C, DDI-L, DDI 4, GSIM, SDMX, ISO-19115, ISO-11179 relationships
Ability of each Functional View to support stated use cases
Use of Functional View (separately published XML schema) to address requirements for focused application guidance
RDF vocabulary - overall statement
Use of UML modeling options and translation rules to bindings
Statistical Summary (TC-125)
Formal documentation limitations, goals
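The secondary validation mentioned in the list above (XSD alone cannot always express the UML cardinality, e.g. an inherited 0..n relationship constrained to 0..1 as in ConditionalControlStep) can be sketched as a post-parse check. This is only an illustration; the element names and bounds are assumptions:

```python
# Hedged sketch: after schema validation, re-check cardinality constraints
# the XSD could not enforce. Element names and bounds are illustrative.
import xml.etree.ElementTree as ET

def check_cardinality(parent: ET.Element, child_tag: str, lo: int, hi: int) -> list[str]:
    """Return a list of violation messages (empty if the constraint holds)."""
    n = len(parent.findall(child_tag))
    if not (lo <= n <= hi):
        return [f"{parent.tag}: expected {lo}..{hi} {child_tag}, found {n}"]
    return []

# ConditionalControlStep constrains the inherited executes (0..n) to 0..1
doc = ET.fromstring(
    "<ConditionalControlStep><executes/><executes/></ConditionalControlStep>"
)
errors = check_cardinality(doc, "executes", 0, 1)
```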

TC-14 Prototype documentation - supplementary documents:
Data Structures - commentary document (TC-98)
Sampling options (TC-128 document)


ATTENDEES: Wendy, Dan S., Jeremy, Johan, Kelly, Larry, Achim

TC-98 Larry will pursue wording with Jay and Dan G. - updated wording, work is being done on a supporting document for review
TC-128 Dan G. will provide draft tomorrow - unable to complete (Dan G. is now on vacation)
TC-144 Wendy will enter - done
TC-143 Wendy will enter - done
TC-112 It's on Kelly's to do list - today

TC-9 - TC-128 (Dan G. working on this), TC-125 statistics packages - wlt is reviewing and writing up
TC-10 - Only outstanding is the document describing use - Achim is working on it
TC-11 - Working on Prototype workarounds - default values, translation of UML primitive Real to XML; paper on need for secondary validation
TC-12 - Achim has completed an XML to RDF translation and is rendering examples
TC-13 - Round trip - see TC-12; statement regarding goals
TC-14 - Documentation - today
Make sure there is a discussion of Packages vs Functional View in the high level

Extend 3.3 review for 2 weeks through the 20th - email to Barry and Jared
Follow up on Oxygen documentation for 3.3 with Jon


ATTENDEES: Wendy, Jeremy, Larry, Dan G., Flavio, Kelly, Achim, Eric

First half of meeting (many of these are just touching base and getting estimate for completion):
TC-144 resolved
TC-143 resolved
TC-128 eta tomorrow
TC-125 updated
TC-113 updated
TC-98 updated
TC-91 updated
TC-43 updated


Translation decisions:
Works with canonical version of XMI
Views and model presentation - captures the XMI inheritance faithfully
Enumerations (value set)
Various views of content (FHIR, ShEx, OWL viewed in Protégé)

Document describes choices made
Transformation rules
Existence content makefile
DDI specific
(goal) Architecture

parse XMI
For each class and property remove realizes
For each package ending with Pattern remove
xsd datatypes were inserted as LION doesn't really export this
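The XMI clean-up steps listed above (remove "realizes" relations from classes and properties, remove packages whose names end in "Pattern") could be sketched roughly as below. The element and attribute names are assumptions for illustration, not the exact vocabulary of the canonical XMI export:

```python
# Hedged sketch of the XMI filtering steps described in the minutes.
# Element/attribute names ("package", "realizes", "name") are illustrative.
import xml.etree.ElementTree as ET

def clean_xmi(tree: ET.ElementTree) -> ET.ElementTree:
    root = tree.getroot()
    # Snapshot the element list first so removals don't disturb iteration
    for parent in list(root.iter()):
        # For each class and property, remove realization relationships
        for realizes in list(parent.findall("realizes")):
            parent.remove(realizes)
        # Remove each package whose name ends with "Pattern"
        for pkg in list(parent.findall("package")):
            if pkg.get("name", "").endswith("Pattern"):
                parent.remove(pkg)
    return tree
```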

Two properties with missing elements because they were in one of the pattern packages - send to Wendy

UML model that is DDI specific transformation
Turns into ShExJ (JSON representation)

Package doesn't have semantics in RDF but is handy for finding stuff; used just for organizational purposes.

Handy for organizational purposes, or that something is an enumeration (semantic information) otherwise navigational.

Who is the audience for the RDF material? Usable by implementors, but the general public needs a more high-level description. The current audience should be fluent in semantic web technologies. More of a review of the technical representation, is it good enough, etc.

Need a statement of what this is a review of. What is the purpose, how it is useful, etc.

An example in RDF of Australian National Election Study (parallel with XML) - can limit to a single instance of different types of variables. Wendy will provide the XML.

See FHIR example

Not worth writing the translation right now. Might work on it following.

There was an example from the last Dagstuhl meeting (Achim will find)

TC-98 Larry will pursue wording with Jay and Dan G.
TC-128 Dan G. will provide draft tomorrow
TC-144 Wendy will enter
TC-143 Wendy will enter
TC-112 It's on Kelly's to do list
- Achim will locate example from Dagstuhl
- Wendy will provide Eric with AES XML and PDF to create RDF example - Larry will provide consultation for questions
- Eric will file errors found (reference to classes in patterns)
- Achim will provide a high level statement on what the use of RDF is and what needs to be reviewed in the Prototype


ATTENDEES: Wendy, Dan S., Jeremy, Larry, Dan G., Jay, Flavio, Oliver, Kelly
TC-138 move to Post-Prototype

TC-128 review documents to be ready for discussion
The model in the prototype follows the SDA work. The model in TC-128 originated in Mpls and was revised in Norway. Sampling was use-tested at BAH and BLS. The starting point was translating that model into UML. The two approaches are not entirely equivalent but are able to describe the same thing. The modeling view was simplified.

SampleOverview gives general approach and stages specify what is done.
How are stages different from a workflow process step - they probably aren't, but there are problems with doing that.

What is in 4 now is a process model. What Jay and Dan G are trying to do was capture the design. Is the Workflow Process a design or run-time only? We have design, algorithm and process. Design tells you the framework, the algorithm says these are the steps I intend to take, Process says these are the steps being taken or having been taken.

Laying out the criteria and design - at the same time the process pattern is similar to what is being described here. Can the process pattern be used to describe the same thing.

The process is more imperative; design is more declarative.

There isn't an exact parallel between sampling and questionnaire - in some methodology/process areas there is a clear division between design and process; in Sampling there is.

How does a statistician think? They don't seem to be very interested in process, and their thinking looks a lot like what is presented in this material. If they then turn around and implement it, it provides a bunch of attributes for them to follow.

Even if people don't think in terms of process is our process pattern generic enough to capture what they are thinking and doing. There seem to be a great number of similarities.

It might be a very high level process and may not be thought of in those terms.

It is intended to be a metadata drivable stack as input to the process to run the sample, but the direct purpose here is to document the design.

What this states are business rules that are declarative. So it would be interesting to see the next step.


Put it in as informative - next steps, discuss how it relates to what is currently in the model and how it fits into the broader picture. This discussion on paper.

Dan G. can write an explanation.

First of perhaps many methodologies created to define specializations of Methodology, differentiation of methodology and process and how they plug together.

Design is a class in the Methodology Pattern. Difference between "design" and design of a process.

TC-14 Documentation
Looking at how what we have is presented (Sphinx, examples, side papers)

Clear rules of transformation from model to XML

updated July 16

TC-12 RDF?
Is the actual RDF XML, how to use it, is there documentation on using it, how it fits in with the binding? Eric will provide the basics. What the transformation process is. OWL, ShEx work, Views. Missing overview of how it fits in. Are there examples? SPARQL queries etc.

TC-13 Round-trip?
Since we have an example of the Australian study it would be good to look at it in RDF, or snippets of that.

TC-9 Model
Modeling rules and how they are being realized


ATTENDEES: Wendy, Dan S., Jeremy, Oliver, Larry, Jay, Dan G., Achim, Kelly

Review examples in TC-43, TC-44
TC-44 Security and modeling difference - take a modeling approach and deal with security in the binding issue. Should document this concern.
The point was not to ensure security but to recognize the issue and inform the user regarding the vulnerability of structured content.
See pdf on issue
Difference between structured and unstructured you are basically saying "you shouldn't use structure here"
needs comprehensive enumeration
Currently can validate and now you'd have to use the escape characters so another layer of translation is needed
Should do it to
Use different scopes of text (internal, externally intended)
It would mean this feature would no longer be available
Scopes are written in markdown using the div in xhtml to differentiate different kinds of texts
A new attribute providing section identifier
Basically uses XHTML to break into divisions and then any form of markup
Add a scope attribute to both LanguageSpecificString and LanguageSpecificStructuredString
enumeration and otherDefined
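A rough sketch of what the proposed scope attribute might look like on the two string types (hedged: the scope values and the otherScope attribute shown are illustrative assumptions; the actual enumeration is still to be decided):

```xml
<!-- Illustrative only: scope values and otherScope are assumptions, not decided -->
<LanguageSpecificStructuredString xml:lang="en" scope="internal">
  <xhtml:div class="reviewerNotes">Notes intended for internal staff only.</xhtml:div>
</LanguageSpecificStructuredString>

<LanguageSpecificString xml:lang="en" scope="otherDefined" otherScope="archiveOnly">
  Text intended for the archive, not external publication.
</LanguageSpecificString>
```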

TC-43 Will test resolution on one usage of LanguageSpecification
TC-73 Moved to Post-Prototype
TC-91 will send out link to examples with questions for TC members concerning review
TC-112 Update on actions to improve content
TC-113 Oliver will test by running build through sphinx
TC-125 Need to determine if there is anything here needing review - Wendy
TC-98 Jay will follow up

Next week:
TC-128 review documents to be ready for discussion
TC-14 Documentation
TC-12 RDF?
TC-13 Round-trip?
TC-9 Model


ATTENDEES: Wendy, Oliver, Larry, Jay, Kelly

TC-43 resolved Oliver will provide example prior to entry
TC-44, TC-135 tentatively resolved; Oliver will provide examples
TC-113 href added to images during transformation; need to verify results in read-the-docs

Sampling Methodology:
Could go in as a change for the Prototype or as a supplementary document to go with review.

Still no RDF

Kelly will look over documentation to come up with a plan of what is needed from TC regarding documentation review/preparation


ATTENDEES: Wendy, Dan S., Jeremy, Oliver, Larry, Achim, Flavio

TC-29 Documentation/Example will be added to package
TC-44, TC-135 - Oliver will look at options for next week
TC-43 next week continue
TC-113 Oliver and Achim will see if they can do this easily - not a show stopper, provide when available
TC-112 Manual workaround for Prototype
TC-98 Larry, Dan G., Jay are asked to review and rewrite - we will go with what they decide
TC-17 did not get to

added issue TC-136 to capture issues to raise in review documentation


ATTENDEES: Wendy, Jon, Dan S., Jeremy, Oliver, Larry, Achim

DDILIFE-3603 Resolved
DDILIFE-3604 Resolved
DDI 3.3 public review
We have schema in Bitbucket, new documentation in COGS, Oxygen documentation
If we do the review package but the readme links to the external documentation
Get the rest of the documentation into COGS and could include the PDF version in the package
Page ok and Oxygen production will be discussed after call

TC-45, TC-129, TC-95, TC-16 in review (only issue raised was with TC-45 and Oliver has resolved)
TC-62 Post-prototype
TC-61 not fixing
TC-55 Post-prototype
TC-44 oh
TC-43 wlt

Next week
TC-44, TC-135

Final sets


ATTENDEES: Wendy, Dan S., Jeremy, Flavio, Larry, Johan, Guillaume

TC-95, TC-129 There is a proposal for resolution on TC-129
The proposal is correct and is in line with XKOS
GSIM is in the process of redoing parts dealing with designations, codes etc. some simplifications
We should review GSIM changes across DDI when it comes out at the end of June
ACTION: Wendy enter it

TC-92 (this was a review item and since there was no response to change I have closed it)
TC-94 review and OK documentation change
revised and closed

TC-16 function of variableRole a) do we need to address in prototype? b) if so, discuss
We have the viewpoint but there is a role for a variableRole. Should we change the description?
Change variableRole to variableFunction "Immutable characteristics of the variable such as geographic designator, weight, temporal designation, etc."
ECVE 0..n

TC-45 organization of packages related to datatypes
change as noted

TC-29 structured collection of logical records
make example

TC-35 relationship between ClassificationIndexEntry and ClassificationItem
close and review as is in prototype


ATTENDEES: Wendy, Dan S., Jeremy, Larry, Johan, Oliver, Dan G.

DDI 3.3 review release wrap-up

Also should generate Oxygen docs to assist in how to represent in XML
This is more how it's modeled than how it's serialized
Any binding we release should have a binding specific
For RDF everyone seems to use the W3C documentation

Generate the Oxygen - talk to Jon; Oliver may be able to help
Update with higher level content
Double check readme and schemas for 3.3 dates, etc.

TC-132 - Oliver will do further checking
TC-130 - closed
TC-131 - appropriately classed as post-prototype
TC-128 - Dan G and Jay will review
TC-36 - closed
TC-92 - resolved

Next week:
TC-129 Wendy will write up proposed change and post prior to next weeks meeting


ATTENDEES: Wendy, Dan S., Flavio, Larry, Oliver, Johan, Dan G.

DDILIFE-3601 - resolved, Dan S. will enter
TC-105 resolved and note added to TC-14 regarding reasoning to be included in documentation
TC-130 resolved based on Wendy's drawing on TC-130. Reasoning for this level of modifications added to TC-14 for documentation



ATTENDEES: Wendy, Flavio, Oliver, Dan G., Jay, Johan, Kelly

Process Pattern and Workflows:
Problems with both the structure of Process Pattern and relationship to Workflows
Separate the single steps from the sequence
Where a workflow sequence realizes a process control step (like ifThen, Split, etc.)
Remains use of workflow sequence by WorkflowProcess

We aren't doing a really good job with Temporal in terms of sequencing - we may want to look at XKOS and review how this can be used
Do we add these to the Prototype or say that additional temporal work should be after the prototype
Should we just be talking about synchronous or asynchronous
Once we include Allen's temporal relationships we are probably covering these - either as subclasses of ProcessControlStep or as a means of specifying the relationship within a single class structure
Doesn't need to be done before Prototype but need to explain the point they would hang off of (Process Control Step)
The idea of precondition/postcondition as controls - organized in a way other than IfThenElse - may be expressed temporally
Ability to define both loose and tight process patterns - loose especially needed for business workflows where certain sets of things must or should be available and certain conditions for exit or next step initiation
Do we need collections to define all of this or only for more formal relations

ACTION: Draft up short term (Prototype work) - decision following further review of specific changes and discussion
Write up what would need to be looked at after

Problem is with realization - some of them are missing; may want to add these to the enumeration
Add documentation of terms of relationships
CodeList ends up with two different types of member but should only have codes, so you have mixed member types.
Contains and RelationStructure should have the same member type
This is specific to CodeList so don't forbid it - review others to see if there is a problem - because of Statistical Classification etc. use as Representation - could this be better handled by an abstract
Statistical Classification does not require a code
Look at the further impacts - if Representation requirement then need to look if we are twisting ourselves in knots to make this work. If so we probably need to solve in Representation
Is there a Statistical Classification without codes? It is optional in GSIM, but we've never seen one

ACTION: TC-129 target and source of relationStructure should be Code


for any issues critical to resolve prior to IASSIST


ATTENDEES: Wendy, Dan S., Jeremy, Larry, Johan, Jay

TC-121 resolved
TC-111 resolved
TC-104 resolved
TC-103 resolved
TC-125 should be done later in review process
TC-89 resolved
TC-119 resolved
TC-120 resolved


ATTENDEES: Wendy, Dan S., Jeremy, Jay, Larry, Oliver, Flavio, Johan, Guillaume

DDILIFE-3519 Copenhagen mapping content
Documentation: http://cdn.colectica.com/CopenhagenMapping/
Straight copy of GSIM plus additional typing such as relationship to an agent rather than just a string
Publication, which is an OtherMaterial - should be changed to a complex type with an inline and choice of OtherMaterial, like the earlier change, for consistency
Name, Label, and Description - possible future approach for simplification
GSIM- https://statswiki.unece.org/display/gsim/3+_Object+graph
Parent (there is a hierarchy in Nodes but not in classification) - should be an item in a single classification, and if you have a classification used in several you should use correspondence and the map (Category rather than ClassificationItem should be shared)
ADD a concept to a classification item as the classification item is a type of category
What is the meaning of a ClassificationFamily being in the Group, StudyUnit, ResourcePackage
Maybe there should be a link from within the logical product.
GSIM plans to define more precisely the differences of CodeList and Statistical Classification (over next few weeks)
Leave the pull request and update with those changes. Then update the pull request.

Right now choice on use of Statistical Classification is to create a separate StatisticalClassificationCodeRepresentation or to make it a choice in CodeRepresentation
Wendy will make examples of each so people can see what the difference is.

Are we sure we have all the essential properties inherited from NodeSet and Node. Example: GSIM talks about a Part-Whole, Parent-Child
Uses only those included in the Statistical Classification - by definition this may vary by level - Totality is a requirement of the statistical classification
Do statistical classifications mix Whole-Part and Parent-Child at various levels within a Statistical Classification? Not within a single classification, just when there is a mix of classifications
Jay will look at NAICS to verify

Naming TC-118, TC-99, TC-84 (moved to Post-Prototype as related to TC-83), TC-63 - resolved all


ATTENDEES: Wendy, Jeremy, Dan S., Larry, Oliver, Achim

new additional labels on issues. These support an information page reflecting the progress of the TC internal review of the Prototype material:
--class_function (does it do what it's supposed to do)
--modification (specific requests for modification)


Marketing/PAG Meetings:
--Prototype goes out after end of 3.3 review period

DDILIFE-3594 closed
DDILIFE-3593 closed

Issue Work:
Example review TC-91 - review existing issues; note comments in TC-91; Larry will comment on specific purpose of some of the examples
Technical TC-107 - resolved
Role of XMI in Prototype - Achim will draft up that documentation and we can then review it and ask additional questions
Modification TC-124, TC-122 - resolved

Next week:

DDILIFE-3519 Copenhagen mapping content

Modification TC-111, TC-23
Naming TC-118, TC-99, TC-84, TC-63
XML secondary validation
Consistency TC-119, TC-120, TC-89
Application of Design Rules


ATTENDEES: Wendy, Jeremy, Dan S., Larry, Oliver, Achim

DDILIFE-3595 language and CodeValueType - No change
DDILIFE-3593 Quality Statement / Quality Standard - resolved
DDILIFE-3594 [haven't entered yet]
DDILIFE-3519 Classification management (Copenhagen mapping) - New metadata items are present in XML serialization but need to add in their usages - end of month
Task - rendering examples in documentation

TC-8 Please review this issue and the related documents. I am attempting to organize these into sets of related issues we can discuss and resolve during the Prototype process. Also any additional means of internal review (particularly in terms of bindings, xmi, documentation, examples, etc.). Note that some issues need some background work that perhaps smaller groups could do in preparation for a broader discussion. Others are topics where we'll want to bring in others from outside of TC. I'd like to get some sense of what we can accomplish and how the work will be distributed over the coming weeks.
Need to front-end the XMI issues - review and comment for discussion and decision next Thursday.

Note that MT will not be meeting for the foreseeable future so we have that timeslot Wednesday (9:00 CDT / 16:00 CET) for subgroups or additional discussion time.

--Example review for additional issues/problems - Wendy will prepare the content for this review of updated Examples
--Issues regarding XMI errors - send out email on what issues need to be looked at and commented on


ATTENDEES: Wendy, Larry, Oliver

Update on XKOS work
Update on Prototype documentation
DDILIFE-3593 - added comment and need a response from others not in attendance
DDILIFE-3590 - approved
DDILIFE-3591 - approved
DDILIFE-3594 - resolved

Discussion of approach to Prototype review work:
Aggregation and Composition - issues may be too complex for the prototype
Reasons for doing some of these things in terms of XMI, XML, RDF, etc. may be critical and we need to explore the specific reasons. Make sure to get detailed information on reasons.

Things like name changes that provide better semantics for users should be reviewed in the prototype
Name changes to avoid conflicts with certain systems need to be changed

There should be a list of notes for people to read for review process
Make a list of specific name changes that were not resolved as part of notes for reviewers


ATTENDEES: Wendy, Dan S., Jeremy, Larry, Jay, Guillaume

DDI4: (15 minutes)
TC-68 resolved - fixed
TC-76 resolved - no change
TC-77 resolved and entered
TC-81 Post-Prototype

DDILIFE (45 minutes)
3593 - go back and fix
3590 - check extension base of two items that were ext base other material
3519 - Copenhagen Mapping - did not get to
3591 - Review in Bitbucket see commit 4ae0eda in Wendy's branch
3594 - A documentation issue. Documentation of the elements MarkedIncrement and ValueIncrement contains their target documentation instead of the use documentation. Because both use the same base type this is confusing. Review the recommended documentation change on the issue and comment.

Didn't get past 3590; some work will get done remotely so we can get as much into 3.3 for NADDI as possible. TC-77 and TC-68 were entered in the model post-meeting.


ATTENDING: Wendy, Jeremy, Dan S., Johan, Oliver, Jay, Larry, Achim, Guillaume

DDILIFE-3519 Copenhagen mapping - questions noted in comments;
DDILIFE-3590 OtherMaterial - management options;
Postponed - DDILIFE-3593 QualityStatement review and recommendation - review not complete
TC-58 (includes 38, 34, 51) - recommendation; resolved
TC-70, TC-71, TC-72, TC-74 all with recommendations - all resolved
TC-65 Add the 4 classes requested
TC-69 determined to be post-prototype

TC-60 - to address the immediate action needed and when the larger problem of production needs to be addressed
Cardinality errors with Composition have been resolved under the current framework TC-60

Aggregation cardinality should be a priority issue - reviewed and fixed - TC-77
Example: see AggregationText in the 2nd-to-last comment from the bottom
Cardinality as expressed in Aggregations is not in Lion either

In future XMI should be exported correctly - TC-78


ATTENDEES: Wendy, Dan S., Larry, Jay, Johan, Oliver, Kelly

OtherMaterial changes (DDILIFE-3590) - see issue
Copenhagen mapping (DDILIFE-3519)
DimensionType documentation (DDILIFE-3591) - OK
StandardType BUG (DDILIFE-3593) - see issue
10-15 minutes on Priority Prototype issues:
TC-60, TC-59, TC-57 - resolved enter
set of TC-38, 45, 51, 52, 58 (dealing with reorganization of packages) - draft collective recommendation and put on next week's agenda

Timeline: End of March for final changes in 3.3; review out in April

DDILIFE-3519 Copenhagen mapping - questions noted in comments
DDILIFE-3590 OtherMaterial - management options
DDILIFE-3593 QualityStatement review and recommendation
TC-38, 45, 51, 52, 58 - recommendation


ATTENDEES: Wendy, Dan S, Jay, Jeremy, Oliver, Larry

DDILIFE-3589 - closed
DDILIFE-3590 - resolved - Wendy will write up all changes required and post on issue for review prior to entry

Copenhagen content
--Where does it link into the model
--How can we use StatisticalClassification directly, either as a choice in CodeListRepresentation or similar to GeographicStructure/Location usage, also Categorical
--Is the target the same in terms of code? ItemCode is a property, Name, Label
--From a developer perspective it's not a big deal
--For a large classification it keeps it in a unifying set
--In using it as a representation can we make the same clarification of what is used by the representation

Do the properties make sense in terms of what has been selected, cardinality, relationships?

Where do they go in?

Do we have questions about putting this into 3.3 - general agreement it should go in
--Derivation from Versionable or Maintainable - ClassificationFamily, ClassificationSeries, StatisticalClassification go in as Maintainable
--FILE (DDILIFE-3592) 3.4 possibility to collapse Identifiable/Versionable/Maintainable - file as future issue
--Suggestion is to put it in LogicalProduct and include in LogicalProductPackagingItem

--FILE (DDILIFE-3591) DimensionType documentation regarding inclusion of variables with representations other than CodeRepresentation (Geographic, Statistical, etc.)

NEXT WEEK Dan S will make a proposed list of where this would link in. Others should review and comment in JIRA issue


ATTENDEES: Wendy, Jon, Larry, Oliver, Jeremy, Dan S., Kelly, Jay

Reviewed TC issues related to Prototype to identify those that were priority (could be quickly resolved and fixed)
Resolved TC-33, TC-32, TC-31, TC-30, TC-28, TC-24
Those requiring changes to model will be entered by Wendy and added to the change list for Prototype internal review
Those requiring documentation will have the documentation located and relayed to the RDF group.


ATTENDEES: Wendy, Dan S., Jeremy, Oliver, Larry, Jay, Guillaume, Johan, Achim, Eric, Kelly

1-RDF/OWL issues: Eric and Achim will be joining the call

Thanks to Eric and Achim for joining us regarding TC comments on RDF Strategy

Prioritization in strategy
Doing some diagnostics on the model to check that types were coherent
Emitting OWL that uses properties as defined in the DDI model directly (no external namespaces) - isomorphic, easy to round-trip
Steps 3-5: figure out which foreign namespaces we want to use; write or document something that converts from XML to RDF; in practice the model always has some sort of implementation so "going through the model" may be just a conceptual approach. XMI describes the model but not an instance of metadata.
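To make the "converts from XML to RDF" step concrete, here is a minimal, purely structural sketch using only the Python standard library: each element becomes a node, child elements become predicates, and text becomes a literal. This is an illustration of the general idea only, not the DDI production algorithm; the namespace and sample element names are invented.

```python
import xml.etree.ElementTree as ET

BASE = "http://example.org/ddi#"  # hypothetical namespace, not a real DDI URI

def xml_to_triples(xml_string):
    """Naively map an XML tree to (subject, predicate, object) triples.

    Purely structural: a model-driven conversion would instead consult the
    DDI model to choose classes and predicates.
    """
    root = ET.fromstring(xml_string)
    triples = []
    counter = [0]  # mutable counter for generating node identifiers

    def new_node(tag):
        counter[0] += 1
        return f"{BASE}{tag}_{counter[0]}"

    def walk(elem, subject):
        text = (elem.text or "").strip()
        if text:
            triples.append((subject, f"{BASE}value", text))
        for child in elem:
            obj = new_node(child.tag)
            triples.append((subject, f"{BASE}{child.tag}", obj))
            walk(child, obj)

    walk(root, new_node(root.tag))
    return triples

triples = xml_to_triples(
    "<Variable><Name>age</Name><Label>Age in years</Label></Variable>"
)
```

Because the mapping is driven only by document structure, it round-trips the tree shape but carries no model semantics, which is the limitation the strategy discussion above is about.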

Focus should be on strategy and documentation of algorithms/approaches. Creation of tools could be part of review but is not critical for the prototype.

Add the strategy noted above to the strategy document and note how this relates to the Dagstuhl documents and discussions.

Protégé is one type of validation. Other forms of validation should be noted if known, or a statement that we need to determine the types of validation needed.

Specific item 1: Package recognition: Useful in OWL for navigation. The use of packages and functional views could be confusing. Look at the requirement "supporting navigation". Is the package useful for this, or would something else be more effective?

Specific item 2: keep contact to get on the same page, use of JIRA etc. so we can follow and comment

tool for exploring https://rawgit.com/ericprud/XMItoRDF/master/doc/index.html

Specific item 3: Easier to have a simple model but this could come into conflict with how patterns are realized in the model. Can the design patterns be modeled in another way that can be recognized by the UML. Future problem.

The regular expression is defined twice. There is a property regularExpression.
Five polymorphic properties will be filed in TC
Several dialects of OWL: OWL/XML, Turtle, Manchester...etc.
Which OWL sublanguage is being targeted: DL probably, might fit with RL

Is there someplace that we keep a list of all the rules we should be checking? We need to put these together and post them in a prominent place. It needs to be clear what we're doing and which checks are automated. -- WENDY

2-Triage on current TC issues related to Prototype Model
Go through TC issues with the Prototype label for next week. Indicate if the issue should be addressed, documented only, or tabled for post-prototype review

3-Update on 3.3 documentation progress

Copenhagen modeling work - where it links into the model


ATTENDING: Wendy, Jon, Oliver, Jay, Larry, Dan S., Jeremy

1 - RDF/OWL Strategy for TC Review:
If package is being represented in RDF this is a problem in terms of references and movement of classes between packages
Would like to see early on what can be described for discovery purposes because members are already experimenting with what they can do using the RDF
This means TC needs to be involved at Step 3
4.a. What does this mean? Patterns are not currently in Bindings

This should be addressed in this strategy - it doesn't look like previous work was brought into this document
Serialization in RDF must contain the same semantic content as other bindings - this is not stated explicitly here (they must be conceptually the same)
Round-tripping is a general requirement - it requires that they be semantically identical

Don't see anything in here regarding the approach they are using for including other vocabularies. Mapping of generated RDF to OWL ontologies is not on this list. There were discussions of this in Dagstuhl. When do we use our own predicates and when do we replace them with those from other ontologies? Part of the issue was the level of effort to do this. Part is how far to reflect RDF limitations in the model (similar to issues regarding XML).

Being able to read it into Protégé should be part of the strategy, to verify its usability.

ACTION: Wendy will write this up and send to Kelly after review by TC by email

2-Update on 3.3 wrap-up work : Copenhagen mapping and documentation - on schedule

3-Quick review of Prototype issues filed in TC - add or refine as needed
-- In terms of documentation it may allow them some additional time to finish/review/refine
-- TC's role is to determine if it's fit for release
-- If documentation is bad all the people reviewing will let us know

4 - TC Prototype Review: Agreement on what types of issues/bugs with the DDI 4 Model should result in changes and how we should make them (as found, in sets, all at once). We need to keep the RDF group up to date on changes, and also anyone doing examples. Is there a way to front-end this work to get any changes done ASAP to support other work?

Changes that get made: Bugs, violations of the modeling rules
Problems that may just be documented: complexity issues, multi-class relationships, improvements

Timing of changes - we should consult with Eric on this to see if he has a preference
Any changes will be well documented in terms of informing RDF and example creators - Create a change log (date, change, Build number) - currently the build runs every night but only commits changes

ACTION: Wendy will write up, send to Kelly, and set up a location for a TC Review of Prototype Change Log

5 - DNS filing
CV DNS filing - Strong recommendation from TC to use the DDI DNS namespace and file for an agency identification of DDI.CV. This approach becomes more important as the 2 formats for publication include a DDI 3 CodeList and a DDI 4 Custom Vocabulary
ACTION: Wendy will write up and send to CV group, Jared, Achim and post on issue
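As a hypothetical illustration only: with an agency identification of DDI.CV, a published vocabulary item could be identified along the lines of the DDI URN pattern (urn:ddi:agency:id:version). The identifier and version below are invented for illustration, not a filed registration:

```text
urn:ddi:DDI.CV:AggregationMethod:1.0
```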

Encourage everyone to register for NADDI and send some emails on this to any groups you know of. [Wendy will send to APDU and State Data Center lists] use Jared's email


ATTENDEES: Wendy, Dan S., Jeremy, Johan, Larry

Pull request - merged

Workplan - reviewed who can do what when for next few months, adjusted for dependencies, still target of first week in April for 3.3 public review;

ACTION: post on workplan page and create issue to track assignments and progress (done)

RDF/DDI4 Model review work - JIRA tracker is being used so file issues or comment if you have a stake in this

Documentation/Marketing - just setting up and Marketing is getting involved with recruitment - more after that meeting


ATTENDEES: Wendy, Jon, Larry, Dan S., Jeremy, Johan, Guillaume, Kelly

DDI 4:
We're supposed to get most of the content by NADDI
RDF may be delayed
--documentation (RDF specific)
--documentation of process
Earlier consensus of AG:
--Model itself
--serializations (XMI, XML, RDF)

TC has some role for input on the serializations, which can be reviewed prior to transfer of content to TC
Get model serializations out for TC to look at prior to this

--Is the XMI just out there or does it have a purpose
--How is it to be used in the review and can this XMI support that review - what systems are supported
(ex: caveats regarding mangling of cardinality and directionality of relationships in EA)
--Are these acceptable or unacceptable caveats?

XML - usages of the two sorts of XML: the Library and the Views
What are the relationships?
Are they doing what we expect them to do - act as a profile or subset?
Do the serializations do what they say on the "tin"?

What is the minimum level we are willing to put out for review
Valid according to the spec and opens in Protégé - needed for review
This should be an iterative process - need to review
They need to use JIRA so we can see their progress
First call was last week, Second call today
First task is to formulate invitation language to send out within the week
Kelly will bring this forward to Eric

High-level documentation is the next essential: documentation about the relationship between the serialization and the model. What is the production system doing, so that people can navigate between the RDF and the Model?

Something that needs to step up a bit in the review - something that TC should be looking at and commenting on: the problems/issues
Should take a look at what round-tripping means in the reports from Dagstuhl 2017
Is there a clue to where the platform-specific models vary from the PIM in each case?
The programming that makes the binding is one point
The model shouldn't be flattened for the RDF - a comment from last week's RDF meeting.
Round trip means Binding to Model to Binding

DDI Lifecycle 3.3

COGS is now taking high-level documentation and examples
It will be ready for dumping more content into it next week

(next week)
3524 - get it in
3585 - is in and needs review
(in progress - hopefully next week or 2)
Copenhagen mappings
COGS is not yet making the high-level documentation for 3.3 so that needs to be worked on

(will start once the near-final review version is ready - not a lot of time - a task that needs to be done)
Do DDItoCOGS for 3.3 so we can test that out
Can use the strongly typed sheet from 3.2 plus the change log to identify what changes need to be made
Should Jon start uploading stuff into GitHub or wait until Dan S. has done some magic?
Create branch for 3.2 first

(1-2 months)
High level documentation
All will be done in restructured text

2018 -
This year having the week-long TC meeting was highly successful - bring up doing it again in 2018

With controlled vocabularies, have DDI publish them as DDI CodeLists - both current and proposed.


ATTENDEES: Wendy, Dan S., Jeremy, Larry, Jon, Oliver, Kelly, Guillaume
Johan - unavailable

Update on DNS filing / DDI URN resolution
Achim will be filing issues - we need to track it and nudge as needed
Should the CV one be revised to fit into the DDI structure (for example agency name DDI-CV)?

see issue

NEW revision - Guillaume will review and comment - if OK then this is what we go with

3.3 documentation
Wendy needs to do a brain dump - ASAP
System is up and running after working with Jeremy
Moving examples to COGS
Will put up on Bitbucket to get it available for contributions
Make sure people have access - they aren't beating down the doors at the moment

Did not get to:
3.3 work plan for review release
2018 priorities


ATTENDING: Wendy, Oliver, Jon, Guillaume, Jeremy, Dan S.

DDILIFE-3524 need examples see issue
DDILIFE-3583 closed
DDILIFE-3584 closed

Documentation: Jon needs an hour with Jeremy next week to sort out final issues. Wendy needs to do a content dump for Jon.

Short Sprint prior to NADDI. This is at the pass-off point between documentation and TC review. This is being looked at as a chance to do the following:

Wrap up any documentation issues
Organize and plan work of TC review
Lay out DDI4 project work for the remainder of the year (this would include Drupal-to-COGS conversion of the work platform)
[space has been arranged]

Oliver - remotely
Jon - remotely
Jeremy - maybe (Sunday is Easter); focus is 3.3; almost certainly no

Announcement of sprints and ask people to participate

Should this be a TC meeting? In which case we have 3.3 work that could be done even if it is out for review: implementation issues, software development issues.

We used to have an annual face-to-face meeting and these have always been very productive; we need to reinstitute support for this. If we want a TC meeting, do we do this as part of a review of 4 or do we want more along the 3.3 line?


ATTENDEES: Wendy, Dan S., Jeremy, Oliver, Guillaume, Larry

The original idea was that the conditional text was interleaved with the static text. The current conditional text either computes text or identifies text using the source parameter. But people want an IF condition: if something is true, then insert this text. It's supposed to be generating the text but there is nothing there that does this. We want both computed text and an IF statement that results in a literal text.

The conditional text type inserts a block of text
How can we have a conditional for literal text choices? Using the control construct within the text is overkill, so we could probably make something more streamlined for this situation.

IfStatement condition and text if true - should probably include an Else to provide a default insert at minimum

Resolve to add an IfThenElse to choice in ConditionalText.

WT will draft IfElse for dynamic text.

How to indicate the position. Write documentation and examples of usage.
Still want to allow a source parameter
Want to be able to insert a combination of text and a source parameter as a result of an IF - limit to a combination of literal text and source parameter
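A rough sketch of what an IfThenElse choice inside ConditionalText might look like in an instance. Element names and structure here are speculative, drafted only to make the discussion concrete; they are not the final schema:

```xml
<ConditionalText>
  <!-- Speculative: IfThenElse as a new choice within ConditionalText -->
  <IfThenElse>
    <!-- condition evaluated against a source parameter (e.g. AGE) -->
    <IfCondition>AGE &gt;= 18</IfCondition>
    <ThenText>you</ThenText>
    <!-- Else provides the default insert at minimum -->
    <ElseText>your child</ElseText>
  </IfThenElse>
</ConditionalText>
```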

Reviewed DDI_L_3 entries and pulled into Master


ATTENDEES: Wendy, Oliver, Johan, Jeremy, Larry

DDILIFE 3581 - OK to remove

DDILIFE 3582 all approved with the following note on #4: make sure it stays in the same order as previously when it was in AbstractVersionable. If not, just add to each (non-Abstract) to retain order

#7-9 move to reusable

DDILIFE 3524 - Check with Guillaume and others about using computation and logical expression, separating "true" and "false" responses into separate conditional texts. How to specify which text? Definitely good to be able to express a specific text insert in a consistent way.

Agenda Topic Index Key

2016-2017 Minutes Page

Pre-2016 Minutes Page

Agenda for TC face-to-face







CV coordination

DDI 3.3

DDI 4 Prototype Review

DDI 4 Prototype

2018 Issues for Scientific Board - TC Priorities20180201