Title: Handling massive data size issue in buildings footprints extraction from high-resolution satellite images
Authors: Abujayyab, S.K.M.; Karas, I.R.
Type: Conference Object
Conference: 1st Eurasian BIM Forum (EBF 2019), Istanbul, 31 May 2019
Published in: Communications in Computer and Information Science (CCIS), vol. 1188, pp. 195-210
Date issued: 2020
Date accessioned/available: 2024-09-29
ISBN: 978-3-030-42851-8
ISSN: 1865-0929
DOI: 10.1007/978-3-030-42852-5_16 (https://doi.org/10.1007/978-3-030-42852-5_16)
Handle: https://hdl.handle.net/20.500.14619/9683
Scopus ID: 2-s2.0-85082475148 (Q4)
Language: en
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Buildings footprints extraction; Buildings Information Modelling; High resolution satellite images; Massive data size; Neural networks

Abstract: Building information modelling (BIM) relies on a large amount of geospatial information, such as building footprints, and collecting and updating this information is a considerable challenge. Recently, building footprints have been extracted automatically from high-resolution satellite images using machine learning algorithms. Constructing the training and testing datasets these algorithms require is computationally intensive, and when the analysis is performed over large geographic areas researchers struggle with out-of-memory problems, so improved, memory-efficient computation methods are urgently needed. This paper targets the massive data size issue in extracting building footprints from high-resolution satellite images. It establishes a method for processing spatial raster data based on chunk computing: chunk-based decomposition splits the raster array into several small cubes, each small enough to fit into the available memory and thereby prevent memory overflow. The algorithm was developed in the Python programming language; the spatial data and the developed tool were prepared and processed in ArcGIS, and Matlab was used for the machine learning, with neural networks implemented to extract the building footprints. To demonstrate the performance of the approach, a high-resolution orthoimage of Tucson, Arizona, United States, was used as a case study. The original image was acquired by the UltraCam Eagle sensor and contained 11,888 columns and 11,866 rows at a cell size of 0.5 foot, giving 564,252,032 pixels in 4 bands; the case-study subset contained 1,409 columns and 1,346 rows, giving 7,586,056 pixels in 4 bands. The full image cannot be handled at once on a conventional central processing unit (CPU), so the image was divided into 36 chunks of 1,000 rows by 1,000 columns. The full analysis took 35 minutes on an Intel Core i7 processor, and the neural network reached an accuracy of 98.3% on the testing dataset. The results demonstrate that chunk computing can overcome memory overflow on personal computers during building footprints extraction, especially when processing large files of high-resolution images, and that the developed method is suitable for an affordable, lightweight desktop environment. Building footprints were extracted effectively, the memory overflow problem was bypassed, and the extracted footprints are of high enough quality to be integrated with BIM applications. © Springer Nature Switzerland AG 2020.
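
The chunk-based decomposition described in the abstract (tiling the raster array into 1,000-row by 1,000-column windows small enough to fit in memory) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the iter_chunks and process_chunk names, the in-memory NumPy array standing in for the orthoimage, and the array dimensions are assumptions made for the example; a real workflow would read each window from disk, e.g. through ArcGIS or a raster I/O library.

```python
import numpy as np

def iter_chunks(n_rows, n_cols, chunk_rows=1000, chunk_cols=1000):
    """Yield (row_slice, col_slice) windows that tile an n_rows x n_cols raster."""
    for r0 in range(0, n_rows, chunk_rows):
        for c0 in range(0, n_cols, chunk_cols):
            yield (slice(r0, min(r0 + chunk_rows, n_rows)),
                   slice(c0, min(c0 + chunk_cols, n_cols)))

def process_chunk(chunk):
    # Placeholder for the per-chunk work (feature extraction and
    # neural-network classification in the paper); here it returns the chunk unchanged.
    return chunk

# Illustrative raster matching the case-study subset: 4 bands, 1,346 rows, 1,409 columns.
raster = np.zeros((4, 1346, 1409), dtype=np.uint8)

results = []
for row_win, col_win in iter_chunks(raster.shape[1], raster.shape[2]):
    chunk = raster[:, row_win, col_win]  # a small cube that fits in available memory
    results.append(process_chunk(chunk))
```

Because each window is processed independently, peak memory use is bounded by the chunk size rather than by the full image, which is the property the paper relies on to run the extraction on a lightweight desktop machine.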