Monday 26 August 2024

geogst, a new Python module for Structural Geology


geogst is a new Python module for structural geology, available on PyPI and easily installable via the classic command pip install geogst (or python -m pip install geogst).

This module was primarily developed to facilitate the creation of stereonets for geological data, to determine the intersections between geological planes and topographic surfaces (expressed as Digital Elevation Models, DEMs), and to generate the skeletons of geological profiles. Among its main features, the module can calculate topographic profiles, add geological attitudes to them, and determine the intersections of geological traces with these profiles.
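To give a flavor of the kind of computation involved in projecting attitudes onto a profile — this is a generic sketch using the standard structural-geology formula, not geogst's actual API, whose function names I have not verified — the apparent dip of a plane along a section line can be derived from the true dip and the angle between the section and the dip direction:

```python
import math

def apparent_dip(true_dip_deg: float, delta_deg: float) -> float:
    """Apparent dip of a plane along a section whose azimuth differs
    from the plane's dip direction by delta_deg degrees.
    Uses: tan(apparent dip) = tan(true dip) * cos(delta)."""
    return math.degrees(
        math.atan(math.tan(math.radians(true_dip_deg)) * math.cos(math.radians(delta_deg)))
    )

# A section parallel to the dip direction shows the true dip:
print(round(apparent_dip(30.0, 0.0), 1))   # 30.0
# A section at 60° to the dip direction shows a gentler apparent dip:
print(round(apparent_dip(30.0, 60.0), 1))  # 16.1
```

The same relation underlies the drawing of geological attitudes on profile skeletons, whatever the specific implementation.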



Additionally, geogst includes example geospatial datasets that can be used to explore its functionalities, such as creating profiles directly within Jupyter Notebooks.

A few example Jupyter Notebooks demonstrate the creation of geological profiles.

geogst will form the foundation of the qgSurf plugin for QGIS. Currently, qgSurf includes geogst directly as a submodule, so no separate installation is required. In future (initially experimental) versions of qgSurf, the geogst module will be installed automatically if it is not already present.

Since the QGIS plugin repository does not allow compiled code in Python plugins, a key advantage of keeping geogst as a separate module is the potential to incorporate compiled code in Fortran, C++, or Rust, significantly improving the speed and efficiency of these tools.

If you are a geologist or a developer working with geological data, you are invited to explore geogst and contribute to its development. Your experiences and feedback are crucial to improving the module and growing the community that uses qgSurf.

Saturday 17 August 2024

Mojo

A new AI-oriented language is on the scene: Mojo, which aims to become a superset of Python. It follows Python semantics but, thanks to modern compilation techniques, can reportedly achieve speedups of 100,000x or more over standard Python.

Mojo was created by Chris Lattner, the creator of LLVM, Clang, Swift, Swift for TensorFlow, and MLIR.

Currently, Mojo is only supported on Linux and macOS.

The source code is hosted on GitHub, from which you can clone the repository locally. Within the downloaded repository, the 'examples/notebooks' directory contains Jupyter Notebooks that you can run using Mojo (provided that Mojo and the Jupyter plugin are installed).

A very brief introduction to the Mojo language is provided in HelloMojo.ipynb. Another notebook, Matmul.ipynb, implements a matrix multiplication example in both Python and Mojo. The example showcases how optimization in Mojo can lead to a remarkable speedup, reportedly around 455,127x compared to Python. It's important to note that achieving such optimized Mojo code requires considerable effort and a solid understanding of Mojo.
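For reference, the pure-Python baseline in such benchmarks is essentially a triple nested loop over lists of lists — a minimal sketch of my own, not the notebook's exact code:

```python
def matmul(a, b):
    """Naive pure-Python matrix multiplication: C = A @ B.
    a is n×k, b is k×m; returns the n×m product as lists of lists."""
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += a[i][p] * b[p][j]
            c[i][j] = s
    return c

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# [[19.0, 22.0], [43.0, 50.0]]
```

Because the interpreter dispatches every arithmetic operation dynamically, this inner loop is exactly where a compiled language like Mojo (with typing, vectorization, and parallelization applied) gains its enormous advantage.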

That said, I tried running the code in that notebook on my old HP laptop, which lacks a GPU, and has the following specifications:

inxi -Fxxxzr

System:
Kernel: 5.15.0-118-generic x86_64 bits: 64 compiler: gcc v: 11.4.0
Desktop: Xfce 4.18.1 tk: Gtk 3.24.33 info: xfce4-panel wm: xfwm 4.18.0
vt: 7 dm: LightDM 1.30.0 Distro: Linux Mint 21.3 Virginia
base: Ubuntu 22.04 jammy
Machine:
Type: Laptop System: HP product: HP Laptop 15-bs0xx v: Type1ProductConfigId
serial: <superuser required> Chassis: type: 10 serial: <superuser required>
Mobo: HP model: 832B v: 23.37 serial: <superuser required> UEFI: Insyde
v: F.21 date: 07/04/2017
CPU:
Info: dual core model: Intel Core i5-7200U bits: 64 type: MT MCP
smt: enabled arch: Amber/Kaby Lake note: check rev: 9 cache: L1: 128 KiB
L2: 512 KiB L3: 3 MiB
Speed (MHz): avg: 1134 high: 1245 min/max: 400/3100 cores: 1: 1040
2: 1097 3: 1245 4: 1157 bogomips: 21599

Even though the speedup I achieved was an order of magnitude lower than the original 455,127x reported in the notebook, it was still impressive: 10,032x! It’s likely that using a newer machine, possibly equipped with a GPU, I could achieve speedups on the order of 100,000x.

As reported, tools like Cython or Numba typically achieve speedups in the range of 10-100x.
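For comparison, here is a minimal sketch of the Numba approach: the same Python function is JIT-compiled to machine code by wrapping it with njit, with a plain-Python fallback if Numba is not installed (the function and numbers here are my own illustration, not a benchmark from the notebook):

```python
def py_sum_sq(n):
    """Sum of squares 0² + 1² + ... + (n-1)², as a plain Python loop."""
    s = 0.0
    for i in range(n):
        s += i * i
    return s

try:
    from numba import njit      # JIT-compiles the loop to machine code on first call
    fast_sum_sq = njit(py_sum_sq)
except ImportError:             # fall back gracefully if Numba is absent
    fast_sum_sq = py_sum_sq

# Both versions compute the same value; only the speed differs.
assert fast_sum_sq(1000) == py_sum_sq(1000)
print(py_sum_sq(10))  # 285.0
```

Unlike Mojo, which is a distinct language, Numba accelerates ordinary Python functions in place — which is also why its speedups usually stay in the 10-100x range rather than the extremes reported for hand-optimized Mojo.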

Moreover, the Mojo code ran without any issue on my old laptop running Linux Mint.

So Mojo is a very interesting language, even in its infancy.

Other AI-oriented languages to explore, as described in James Thomason's insightful VentureBeat post "Mojo Rising: The resurgence of AI-first programming languages", include Bend and JAX.


Note: the text was checked with ChatGPT.