TRK_StartTimestamp

TRK_StartTimestamp takes a track column and returns a datetime column that represents the first instant of each input track.

Tracks are linestrings that represent the change in an entity's location over time. Each vertex in the linestring has a timestamp (stored as the M-value) and the vertices are ordered sequentially.

For more information on using tracks in GeoAnalytics Engine, see the core concept topic on tracks.

Function    Syntax
Python      start_timestamp(track)
SQL         TRK_StartTimestamp(track)
Scala       startTimestamp(track)

For more details, go to the GeoAnalytics Engine API reference for start_timestamp.

Examples

Python
from geoanalytics.tracks import functions as TRK
from geoanalytics.sql import functions as ST

# Each WKT linestring stores the epoch timestamp (in seconds) of each vertex as its M-value.
data = [
    ("LINESTRING M (-117.27 34.05 1633455010, -117.22 33.91 1633456062, -116.96 33.64 1633457132)",),
    ("LINESTRING M (-116.89 33.96 1633575895, -116.71 34.01 1633576982, -116.66 34.08 1633577061)",),
    ("LINESTRING M (-116.24 33.88 1633575234, -116.33 34.02 1633576336)",)
]

# Create a track column from the WKT strings, then return the first instant of each track.
df = spark.createDataFrame(data, ["wkt"]) \
          .withColumn("track", ST.line_from_text("wkt", srid=4326))

df.select(TRK.start_timestamp("track").alias("start_timestamp")).show()
Result
+-------------------+
|    start_timestamp|
+-------------------+
|2021-10-05 10:30:10|
|2021-10-06 20:04:55|
|2021-10-06 19:53:54|
+-------------------+

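The SQL syntax shown above can be run against the same DataFrame by registering it as a temporary view. The following is a minimal sketch that assumes the df created in the Python example, a session where GeoAnalytics Engine is authorized and its SQL functions are registered, and a hypothetical view name of "tracks":

# Register the track DataFrame so it can be queried with SQL.
df.createOrReplaceTempView("tracks")

# TRK_StartTimestamp is the SQL form of start_timestamp.
spark.sql("""
    SELECT TRK_StartTimestamp(track) AS start_timestamp
    FROM tracks
""").show()

This returns the same three start timestamps shown in the result above.
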
Version table

Release    Notes
1.4.0      Python and SQL functions introduced
1.5.0      Scala function introduced
