
    Benchmark data for: Machine Learning for geospatial vector data classification

    No full text
    Benchmark data for the paper "Deep Learning for Classification Tasks on Geospatial Vector Polygons". The core of the data is in the six NumPy zip (.npz) files. Each archive contains the original WKT geometries as zlib-compressed blobs, variable- and fixed-length geometry vectors, Fourier descriptors, and a class dictionary. The zlib-compressed WKT strings can be decompressed with:

        import numpy as np
        import zlib

        loaded = np.load('archaeology_train_v8.npz')
        wkts_zipped = loaded['wkts_zlib_compressed']
        for wkt_zipped in wkts_zipped:
            wkt = zlib.decompress(wkt_zipped).decode()
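
    For orientation, here is a minimal sketch for seeing what else each archive holds. Only the 'wkts_zlib_compressed' key is named above; the names of the geometry-vector, Fourier-descriptor, and class-dictionary arrays are not documented here, so the sketch discovers them from the archive rather than assuming them.

        import numpy as np

        # allow_pickle=True is a precaution in case the class dictionary is
        # stored as a Python object; plain numeric arrays do not need it.
        loaded = np.load('archaeology_train_v8.npz', allow_pickle=True)

        # List every array in the archive with its shape and dtype.
        for name in loaded.files:
            arr = loaded[name]
            print(name, arr.shape, arr.dtype)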
