Is there a way to query journeys from a certain time point to a certain time point? #283
You can pass the start/from date+time as `departure`:

await HAFAS.journeys('8098096', '8000261', {
	results: 20,
	// make an ISO 8601 string
	departure: from_time + 'T00:00+01:00',
})

This effectively queries journeys by their earliest departure time. Alternatively, you can pass in the latest acceptable arrival time as `arrival`.
Currently, with hafas-client, you can only use one of the two at a time: `departure` and `arrival` are mutually exclusive.
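For illustration, a minimal sketch of the `arrival` variant (not quoted from the thread; `to_time` is a placeholder date string, analogous to `from_time` above):

await HAFAS.journeys('8098096', '8000261', {
	results: 20,
	// latest acceptable arrival, again as an ISO 8601 string
	arrival: to_time + 'T23:00+01:00',
})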
What does "all" mean in your context? It might sound obvious, but there is an almost infinite number of possible journeys between two stops. Even if you only keep those that departure after When called via This leads me to the question: What do you actually want to do? |
Let's say we pass the opt `transfers: 0`.
That should mean HAFAS searches for direct trains only, correct? Could we then combine that with the constraint of a timeframe of, say, 8 hours?
If there is no such option, is there another way to achieve this? Thank you very much for your fast replies.
What do you actually want to do? Using the method you described, you can iterate over those direct trains between the stops that the routing algorithm deemed "good enough". You don't necessarily get all trains running between those stops.
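As a rough sketch of that method (my own example, not from the thread; `transfers: 0` for direct journeys and the destructured return shape are assumptions based on the hafas-client docs):

const {journeys} = await HAFAS.journeys('8098096', '8000261', {
	// 0 transfers, i.e. direct journeys only
	transfers: 0,
	// earliest departure of the time window
	departure: new Date('2023-02-14T07:00+01:00'),
	results: 20,
})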
I want to get the ICEs between, let's say, Düsseldorf and Hamburg for the next 8 hours, and then filter for the one with the cheapest price. How do I tell the API to just give me the ICEs within an 8-hour departure time slot?
As I elaborated, that won't necessarily (as in: there are no guarantees) give you all ICEs between the two stations, and thus not necessarily give you the cheapest. For pricing, a different API (flavour) is used anyway. Nevertheless, here's a snippet with the aforementioned limitations:

import {createClient} from 'hafas-client/index.js'
import {profile as dbProfile} from 'hafas-client/p/db/index.js'
// don't send `rtMode: REALTIME`, breaks iteration
// see https://github.com/public-transport/hafas-client/issues/282#issuecomment-1373612431
const dbProfileWithoutRtModeRealtime = {
	...dbProfile,
	transformReqBody: (ctx, body) => {
		const reqBody = dbProfile.transformReqBody(ctx, body)
		delete reqBody.svcReqL[0].cfg.rtMode
		return reqBody
	},
}
const hafas = createClient(dbProfileWithoutRtModeRealtime, 'journeys iteration demo')
const from = '8000085' // Düsseldorf Hbf
const to = '8096009' // HAMBURG
const start = Date.parse('2023-02-14T07:00+01:00')
const latestJourneyDep = (journeys) => {
	const deps = journeys.map((journey) => {
		const dep = journey.legs[0].departure || journey.legs[0].plannedDeparture
		return Date.parse(dep)
	})
	// sort numerically, descending, so that the latest departure comes first
	deps.sort((a, b) => b - a)
	return deps[0]
}
const collectedJourneys = []
let journeysOpts = {departure: start}
while (true) {
	const {
		journeys,
		laterRef,
	} = await hafas.journeys(from, to, journeysOpts)
	if (journeys.length === 0) break
	collectedJourneys.push(...journeys)
	if ((latestJourneyDep(journeys) - start) > 8 * 3600 * 1000) break
	journeysOpts = {laterThan: laterRef}
}
console.log(collectedJourneys)

Due to #282 (comment), currently, the default DB profile does not allow iterating ("scrolling") through journeys, which is why the snippet above removes `rtMode` from the request body.
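As a rough follow-up sketch for the cheapest-ICE use case (assumptions on my part: DB-profile journeys may carry a `price` field and their legs a `line.productName`; both can be missing, hence the guards):

// drop journeys whose first leg departs outside the 8-hour window (the last batch may overshoot it)
const withinWindow = collectedJourneys.filter((journey) => {
	const dep = Date.parse(journey.legs[0].departure || journey.legs[0].plannedDeparture)
	return (dep - start) <= 8 * 3600 * 1000
})
// keep journeys whose legs are all ICEs
const iceOnly = withinWindow.filter((journey) =>
	journey.legs.every((leg) => leg.walking || (leg.line && leg.line.productName === 'ICE'))
)
// of those, keep the ones with a known price and pick the cheapest
const priced = iceOnly.filter((journey) => journey.price && Number.isFinite(journey.price.amount))
priced.sort((a, b) => a.price.amount - b.price.amount)
console.log(priced[0]) // cheapest ICE journey within the window, if any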
Does that answer your question?
Yes, thank you very much for the effort!
Just as a side note, I think for the use case of direct connections it is much easier to query the departure board for the origin station and the arrival board for the destination station, and to join the trips on your own (easy since they will have the same trip ID).
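A minimal sketch of that approach (assuming hafas-client v6, where `departures()`/`arrivals()` resolve to `{departures}`/`{arrivals}`, and reusing `hafas`, `from`, `to` and `start` from the snippet above; the `duration` values are arbitrary):

// departures at the origin within the 8-hour window
const {departures} = await hafas.departures(from, {
	when: new Date(start),
	duration: 8 * 60, // minutes
})
// arrivals at the destination, with some headroom for the trips to get there
const {arrivals} = await hafas.arrivals(to, {
	when: new Date(start),
	duration: 12 * 60, // minutes
})
// join the two boards on the trip ID
const arrivalsByTripId = new Map(arrivals.map((arr) => [arr.tripId, arr]))
const directTrips = departures
	.filter((dep) => arrivalsByTripId.has(dep.tripId))
	.map((dep) => ({departure: dep, arrival: arrivalsByTripId.get(dep.tripId)}))
console.log(directTrips)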
Here is my request:

This is how I try to call it:

I just couldn't find an option to pass in the two date-times.
I am also aware that I would probably have to convert the dates to Date objects.
Is it possible to get journey data from all connections between two stations?
If you have any other questions regarding this, please ask me.
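As a side remark on the date conversion mentioned above (my reading of the hafas-client docs, so treat it as an assumption): `departure`/`arrival` accept either a JS `Date` or an ISO 8601 string, so both of these should work:

await hafas.journeys('8000085', '8096009', {departure: new Date('2023-02-14T07:00+01:00')})
await hafas.journeys('8000085', '8096009', {departure: '2023-02-14T07:00+01:00'})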