SCHEDULE

Overview

The Schedule feed is an extract of train schedules from the Integrated Train Planning System (ITPS).

Schedule data cannot be obtained via STOMP. Instead, it is downloaded as a GZIP file from the Amazon S3 data buckets; each GZIP file consists of a collection of JSON strings.

The data consists of a primary set of data (rather large, can be 1.5GB in size) and a set of daily corrections that should be applied to the base data.

The data only contains Passenger Train Information.

   Freight services are not included in the schedule; all messages containing FOC codes are filtered out.

Obtaining the Data

Data is downloaded from Amazon S3. Each feed has a Bucket name and a File Name.

Each bucket has one or more files available within it. Normally the FULL_DAILY buckets will contain a single file (toc-full), whereas the UPDATE_DAILY buckets will contain 7 files, one for each day.

Data is obtained from the Amazon S3 URL

   https://datafeeds.networkrail.co.uk/ntrod/CifFileAuthenticate?type=[bucket]&day=[file]

So, for example:

   https://datafeeds.networkrail.co.uk/ntrod/CifFileAuthenticate?type=CIF_ALL_FULL_DAILY&day=toc-full

will give you the full schedule for all regions for today.

You will need to be already logged into Data Feeds in a web browser to obtain the data. Alternatively, if using cURL, supply HTTP Basic Auth credentials (your email address and password, not your security key) and follow HTTP redirects to log in.
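
A minimal sketch of this download, assuming Python with the requests library and credentials in the (hypothetical) NROD_USER / NROD_PASS environment variables:

    # Download one schedule file using HTTP Basic Auth; the authenticate URL
    # redirects to a signed Amazon S3 URL, which requests follows for us.
    import os
    import requests

    AUTH_URL = "https://datafeeds.networkrail.co.uk/ntrod/CifFileAuthenticate"

    def download_schedule(bucket, day, dest):
        """Fetch one gzipped schedule file and save it to dest."""
        resp = requests.get(
            AUTH_URL,
            params={"type": bucket, "day": day},
            auth=(os.environ["NROD_USER"], os.environ["NROD_PASS"]),  # email/password
            allow_redirects=True,
            stream=True,
        )
        resp.raise_for_status()
        with open(dest, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)

    # e.g. the full schedule for all regions for today:
    # download_schedule("CIF_ALL_FULL_DAILY", "toc-full", "toc-full.json.gz")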

Data

Like the real-time data feeds, the Schedule data is split by train provider, and then into the full schedule for that day and the daily updates.

So, if you are building a local schedule database from scratch, or are wiping your copy to build a fresh version (a sketch follows the list below):

  • First, download and process the Full Daily file.
  • Then, each day, grab the Daily Update for that day and process it.
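
A minimal sketch of that workflow, reusing the hypothetical download_schedule() helper from the earlier example (the CIF_ALL_UPDATE_DAILY bucket name, the update file naming and the process_file() callback are assumptions):

    # Bootstrap from the full extract, then apply one daily update file.
    def bootstrap(process_file):
        download_schedule("CIF_ALL_FULL_DAILY", "toc-full", "toc-full.json.gz")
        process_file("toc-full.json.gz")

    def apply_daily_update(process_file, day_file):
        download_schedule("CIF_ALL_UPDATE_DAILY", day_file, day_file + ".json.gz")
        process_file(day_file + ".json.gz")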

Files are normally updated at around midnight UTC.

A Daily Full file will only contain CREATE transactions, whereas an Update can contain both CREATE and DELETE transactions.

Each file contains:

  • a data/information line (the header),
  • a set of schedule/train association transactions,
  • a set of schedule transactions,
  • an EOF message.

For the Update files, DELETE transactions are normally listed before CREATE transactions.

Files are newline-delimited JSON packets.
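
A minimal sketch of reading one of these files, assuming Python's standard library and an example file name of toc-full.json.gz:

    # Each line of the gunzipped file holds exactly one JSON object.
    import gzip
    import json

    with gzip.open("toc-full.json.gz", "rt", encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if "JsonTimetableV1" in record:
                print("header:", record["JsonTimetableV1"]["Metadata"])
            elif "JsonAssociationV1" in record:
                pass  # association transaction
            elif "JsonScheduleV1" in record:
                pass  # schedule transaction
            elif "EOF" in record:
                print("reached the end of the file")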

Examples

Header

   "JsonTimetableV1":
       "classification":"public",
       "timestamp":1343952450,
       "owner":"Network Rail",
       "Sender":
           "organisation":"Rockshore",
           "application":"NTROD",
           "component":"SCHEDULE",
       "Metadata":
           "type":"full",
           "sequence":0

Example from CIF_ALL_FULL_DAILY

The timestamp gives the last update time of the file as a UNIX timestamp; in this example, Friday 3rd August 2012 01:07:30 +0100. All data should be sent from the Rockshore organisation.
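
For example, converting that timestamp with Python's standard library:

    # The header timestamp is seconds since the UNIX epoch (UTC).
    from datetime import datetime, timezone

    print(datetime.fromtimestamp(1343952450, tz=timezone.utc))
    # 2012-08-03 00:07:30+00:00, i.e. Friday 3rd August 2012 01:07:30 +0100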

Association

Create
   {"JsonAssociationV1":{
       "transaction_type":"Create",
       "main_train_uid":"C05307",
       "assoc_train_uid":"C05351",
       "assoc_start_date":"2011-12-11T00:00:00Z",
       "assoc_end_date":"2012-09-09T00:00:00Z",
       "assoc_days":"0000001",
       "category":"NP",
       "date_indicator":"S",
       "location":"HTRWTM4",
       "base_location_suffix":null,
       "assoc_location_suffix":null,
       "diagram_type":"T",
       "CIF_stp_indicator":"P"
   }}
Delete
   {"JsonAssociationV1":{
       "transaction_type":"Delete",
       "main_train_uid":"W36743",
       "assoc_train_uid":"W37173",
       "assoc_start_date":"2012-08-03T00:00:00Z",
       "location":"STPANCI",
       "base_location_suffix":null,
       "diagram_type":"T",
       "cif_stp_indicator":null
   }}
  • transaction_type indicates whether this is a new entry to create or an old entry to delete.
  • location is a TIPLOC reference.
  • assoc_days indicates whether the association is valid on each day of the week (MTWTFSS).
  • cif_stp_indicator indicates whether the entry is P(ermanent) or O(verlay).
      "the Permanent data is retained in addition to the Overlay, but the Overlay is assumed to supersede the Permanent position" (page 28, CIF End User Specification)
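
A minimal sketch of interpreting the seven-character days string (Monday first), which applies equally to schedule_days_runs below:

    # '1' means the entry applies on that day; "0000001" is the Create example above.
    DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

    def runs_on(days_field):
        """Return the weekday names on which the entry is valid."""
        return [day for day, flag in zip(DAYS, days_field) if flag == "1"]

    print(runs_on("0000001"))  # ['Sun']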

Schedule

Create
   {"JsonScheduleV1":{
       "CIF_bank_holiday_running":null,
       "CIF_stp_indicator":"P",
       "CIF_train_uid":"C24056",
       "applicable_timetable":"Y",
       "atoc_code":"GW",
       "new_schedule_segment":{
           "traction_class":"",
           "uic_code":""},
       "schedule_days_runs":"0000010",
       "schedule_end_date":"2012-12-08",
       "schedule_segment":{
           "signalling_id":"1A35",
           "CIF_train_category":"XX",
           "CIF_headcode":"1234",
           "CIF_course_indicator":1,
           "CIF_train_service_code":"25397003",
           "CIF_business_sector":"??",
           "CIF_power_type":"HST",
           "CIF_timing_load":null,
           "CIF_speed":"125",
           "CIF_operating_characteristics":null,
           "CIF_train_class":"B",
           "CIF_sleepers":null,
           "CIF_reservations":"S",
           "CIF_connection_indicator":null,
           "CIF_catering_code":"C",
           "CIF_service_branding":"",
           "schedule_location": <snip>},
       "schedule_start_date":"2011-12-17",
       "train_status":"P",
       "transaction_type":"Create"
   }}
  • atoc_code can be looked up on TOC_Codes.
  • Schedule Segment - signalling_id can be used to follow the train on the TD feed.
  • Schedule Segment - CIF_power_type is a reference to what is hauling the train.
  • Schedule Segment - CIF_speed is the top speed of the train.
  • Schedule Segment - CIF_sleepers indicates whether the service is a sleeper service.
Delete
   {"JsonScheduleV1":{
       "CIF_train_uid":"C06309",
       "schedule_start_date":"2011-12-12",
       "CIF_stp_indicator":"P",
       "transaction_type":"Delete"
   }}

When performing a deletion, the keys provided in this packet can match multiple schedules (normally because different sets of schedule locations run at different times on different days).
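
A minimal sketch of applying a Delete transaction to a local store of previously-created JsonScheduleV1 records held as a plain Python list (the store layout is an assumption; note that one Delete may remove several entries):

    # Keep only schedules whose identifying keys do not all match the Delete.
    def apply_delete(store, delete_record):
        keys = ("CIF_train_uid", "schedule_start_date", "CIF_stp_indicator")
        return [
            schedule for schedule in store
            if not all(schedule.get(k) == delete_record.get(k) for k in keys)
        ]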

Schedule Location

Times are given in hhmm format, for example 2005 for 5 minutes past 8 PM. Some working times carry a trailing H, which denotes a half minute, i.e. 30 seconds past the stated minute.
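
A minimal sketch of parsing these working times in Python:

    # hhmm with an optional trailing H meaning an extra half minute (30 seconds).
    from datetime import time

    def parse_working_time(value):
        if value is None:
            return None
        half = value.endswith("H")
        hhmm = value[:-1] if half else value
        return time(int(hhmm[:2]), int(hhmm[2:4]), 30 if half else 0)

    print(parse_working_time("2005"))   # 20:05:00
    print(parse_working_time("2052H"))  # 20:52:30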

Schedule locations consist of zero or more location records.

The path key can describe the route the train is expected to take into a station; for example, Leeds Central has paths A-F when approaching from the south.

Platform Data is not always provided.

Schedule Locations come in three types

  • LO - Train Origin
  • LI - Stopping/Passing/Timing Point
  • LT - Train Terminus

Stopping points contain both the Arrival and Departure time for the Train

As the data is an array, the Index of the Array can be used to help determine Stop Order.
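
A minimal sketch of walking the array in order, using the index as the stop order and location_type to tell the record types apart:

    # Print each calling/passing point of one schedule_location array.
    def describe_stops(schedule_locations):
        kinds = {"LO": "origin", "LI": "intermediate", "LT": "terminus"}
        for index, loc in enumerate(schedule_locations):
            print(index, kinds[loc["location_type"]], loc["tiploc_code"],
                  loc.get("arrival"), loc.get("departure"), loc.get("pass"))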

Train Origin
   "location_type":"LO",
   "record_identity":"LO",
   "tiploc_code":"ABRDEEN",
   "tiploc_instance":null,
   "departure":"1350",
   "public_departure":"1350",
   "platform":"7",
   "line":null,
   "engineering_allowance":null,
   "pathing_allowance":null,
   "performance_allowance":null

As this is a departure-only record, only the departure keys are present.

Stopping Point
   "location_type":"LI",
   "record_identity":"LI",
   "tiploc_code":"FNPK",
   "tiploc_instance":null,
   "arrival":"0657",
   "departure":"0658",
   "pass":null,
   "public_arrival":"0657",
   "public_departure":"0658",
   "platform":"3",
   "line":null,
   "path":null,
   "engineering_allowance":null,
   "pathing_allowance":null,
   "performance_allowance":null

This train arrives and, about one minute later, is expected to leave.

Passing Point

Some stopping points are just Passing Points, used for routing trains over points for specific paths or lines.

   "location_type":"LI",
   "record_identity":"LI",
   "tiploc_code":"KNGXBEL",
   "tiploc_instance":null,
   "arrival":null,
   "departure":null,
   "pass":"2052H",
   "public_arrival":null,
   "public_departure":null,
   "platform":null,
   "line":"FL2",
   "path":null,
   "engineering_allowance":null,
   "pathing_allowance":null,
   "performance_allowance":null
Train Terminus
   "location_type":"LT",
   "record_identity":"LT",
   "tiploc_code":"KNGX",
   "tiploc_instance":null,
   "arrival":"2054H",
   "public_arrival":"2100",
   "platform":"4",
   "path":null

As this is an arrival entry, only arrival keys are present.

EOF

   "EOF":true

Just a handy note to say you have reached the end of the file, in case you obtained a broken download.

Further Information

The first ~4% of the Full Daily file contains schedule associations; these link multiple train UIDs to a primary train UID, which can be looked up in the schedules.

Due to the size of the Full Daily file (1.5GB when gunzipped), it can take some time (around 2 hours) to import the data from the file.

A given schedule entry contains information about the schedule, including its start and end dates, and can then contain one or more schedule stops, which describe the calling points for a train on its schedule. These calling points will have an official (working) arrival/departure time and a public arrival/departure time. When displaying data to the end user, it's probably best to use the public versions.
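
A minimal sketch of choosing which times to display for one location record, preferring the public times and falling back to the working times:

    # Returns an (arrival, departure) pair of strings (or None) for display.
    def display_times(loc):
        arrival = loc.get("public_arrival") or loc.get("arrival")
        departure = loc.get("public_departure") or loc.get("departure")
        return arrival, departure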