Sunday, March 1, 2026

Kernel Panic QrCode

In many years I have seen the occasional kernel panic, but never in this format.

 


The curious thing is that on the next reboot there was no problem at all.

SpectralUnmixing

To process the EMIT images and obtain the L2B product, the spectral unmixing software SpectralUnmixing.jl was used.

The software is written in Julia, which can be installed with

curl -fsSL https://install.julialang.org | sh

Then clone the repository and start Julia:

git clone https://github.com/emit-sds/SpectralUnmixing.jl

julia

Press the closing square bracket key (]) and the prompt changes from julia> to pkg>; there run

add SpectralUnmixing

Leave pkg mode with CTRL+C (or backspace), then, back at the julia> prompt:

using SpectralUnmixing

CLI.install()

Exit with CTRL+D.

 


 

The data folder contains test data for the software.

The CSV file holds the spectral signatures of the endmembers (class name and reflectance), plus an image in ENVI format.

To run the unmixing:

cd data

julia --project=/home/luca/SpectralUnmixing.jl /home/luca/.julia/bin/unmix.jl emit20250324t221005_jpl_unmix_ex basic_endmember_library.csv Class output_test --mode sma-best --n_mc 30 --normalization brightness

 

n_mc sets the number of Monte Carlo bootstrap iterations.

The endmembers are Soil, PV (photosynthetic vegetation) and NPV (non-photosynthetic vegetation).
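Under the hood this is linear spectral unmixing: each pixel is modeled as a non-negative fractional combination of the endmember spectra, solved as a bounded least-squares problem (the run log reports "optimizer" => "bvls"). A minimal sketch of the idea with made-up spectra, not the actual library code:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy endmember spectra (rows: SOIL, PV, NPV) -- invented numbers for illustration
E = np.array([
    [0.30, 0.35, 0.40, 0.45],   # SOIL: steadily rising reflectance
    [0.05, 0.45, 0.10, 0.30],   # PV: crude green-peak / red-edge shape
    [0.20, 0.25, 0.35, 0.38],   # NPV: dry-vegetation-like slope
])
true_f = np.array([0.5, 0.3, 0.2])   # ground-truth fractions
pixel = true_f @ E                   # linear mixture observed by the sensor

# Bounded least squares: solve E^T f = pixel with 0 <= f <= 1
res = lsq_linear(E.T, pixel, bounds=(0, 1))
fractions = res.x / res.x.sum()      # normalize so fractions sum to 1
print(dict(zip(["SOIL", "PV", "NPV"], fractions.round(3))))
```

The real tool repeats this per pixel, over Monte Carlo draws of the endmember library, which is where n_mc comes in.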


 

This is the output:

 ┌ Info: Unmixing was processed on: HW27747
└ @ Main /home/luca/.julia/bin/unmix.jl:177
┌ Info: Reflectance file processed: emit20250324t221005_jpl_unmix_ex
└ @ Main /home/luca/.julia/bin/unmix.jl:178
┌ Info: Arguments: Dict{String, Any}("optimizer" => "bvls", "write_complete_fractions" => true, "mode" => "sma-best", "reflectance_uncertainty_file" => "", "combination_type" => "class-even", "log_file" => nothing, "normalization" => "none", "spectral_starting_column" => 2, "endmember_file" => "basic_endmember_library.csv", "truncate_end_columns" => 0, "refl_scale" => 1.0, "reflectance_file" => "emit20250324t221005_jpl_unmix_ex", "endmember_class_header" => "Class", "n_mc" => 30, "endmember_classes" => [""], "refl_nodata" => -9999.0, "output_file_base" => "output_test", "num_endmembers" => [3], "start_line" => 1, "end_line" => -1, "wavelength_ignore_regions" => [0.0, 440.0, 1310.0, 1490.0, 1770.0, 2050.0, 2440.0, 2880.0], "max_combinations" => -1)
└ @ Main /home/luca/.julia/bin/unmix.jl:179
┌ Info: Ignoring wavelength regions: Any[[0.0, 440.0], [1310.0, 1490.0], [1770.0, 2050.0], [2440.0, 2880.0]]
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/EndmemberLibrary.jl:191
┌ Info: Running from lines: 1 - 10
└ @ Main /home/luca/.julia/bin/unmix.jl:211
AbstractString[InlineStrings.String7("SOIL"), InlineStrings.String7("PV"), InlineStrings.String7("NPV")]
AbstractString[InlineStrings.String7("SOIL"), InlineStrings.String7("PV"), InlineStrings.String7("NPV"), "Brightness"]
┌ Info: Output Image Size (x,y,b): 10, 10, [4, 4, 6].
│         Creating output fractional cover dataset.
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/Datasets.jl:77
┌ Info: Output Image Size (x,y,b): 10, 10, [4, 4, 6].
│         Creating output fractional cover dataset.
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/Datasets.jl:77
┌ Info: Output Image Size (x,y,b): 10, 10, [4, 4, 6].
│         Creating output fractional cover dataset.
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/Datasets.jl:77
┌ Info: Unmix output files: ["output_test_fractional_cover", "output_test_fractional_cover_uncertainty", "output_test_complete_fractions"]
└ @ Main /home/luca/.julia/bin/unmix.jl:267
┌ Info: total number of workers available: 1
└ @ Main /home/luca/.julia/bin/unmix.jl:268
┌ Info: Line 2 run in 0.0142 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 3 run in 0.0138 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 4 run in 0.0138 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 5 run in 0.0138 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 6 run in 0.0155 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 7 run in 0.0144 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 8 run in 0.0207 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 9 run in 0.0181 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
┌ Info: Line 10 run in 0.0137 seconds
└ @ SpectralUnmixing /home/luca/SpectralUnmixing.jl/src/SpectralUnmixing.jl:542
 17.617387 seconds (18.76 M allocations: 1.104 GiB, 32.59% gc time, 66.76% compilation time)


The outputs are ENVI files with .hdr headers (complete_fractions, fractional_cover, fractional_cover_uncertainty).

complete_fractions: the fraction of each class contained in the pixel.

fractional_cover: if complete_fractions says, for example, 60% soil, fractional_cover gives the percentage breakdown of the mineralogical fractions (Hematite, Goethite, ...). If the complete_fractions value for the Soil class is below 20%, this further refinement is pointless.

fractional_cover_uncertainty: a data-quality indicator; lower values mean a better unmixing.
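The relationship between fractional_cover and fractional_cover_uncertainty can be pictured as the mean and standard deviation over the n_mc Monte Carlo runs. A toy illustration with synthetic numbers, not real EMIT output:

```python
import numpy as np

rng = np.random.default_rng(42)
n_mc = 30
# Hypothetical: 30 Monte Carlo unmixing results for one pixel, classes SOIL/PV/NPV
draws = rng.normal(loc=[0.5, 0.3, 0.2], scale=0.02, size=(n_mc, 3))

fractional_cover = draws.mean(axis=0)   # the reported fraction per class
uncertainty = draws.std(axis=0)         # small values = stable, trustworthy unmixing
print(fractional_cover.round(3), uncertainty.round(3))
```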

Emit L3 ASA

The EMIT L3 ASA product (downloadable from here) is the mineralogical unmixing of EMIT hyperspectral data. Coverage is global between latitudes 55N and 54.5S, at a spatial resolution of half a decimal degree (about 55 km), with a focus on arid soils (the data are masked for vegetation).
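As a sanity check on the stated resolution: half a degree of latitude works out to roughly 55 km, assuming a spherical Earth of mean radius 6371 km:

```python
import math

earth_radius_km = 6371.0
km_per_degree = math.pi * earth_radius_km / 180.0  # ~111.2 km per degree of latitude
print(round(0.5 * km_per_degree, 1))  # ~55.6 km
```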

Ten mineralogical categories are computed.

The product was designed for dust, not for soils. For soil mineralogy the L2B product is more reliable, but you have to perform the spectral unmixing yourself.

  


 

Saturday, February 28, 2026

Converting EMIT L2A to ENVI

Warning: the structure of the netCDF files changed around 2025.

The data can be downloaded from EarthData:

https://search.earthdata.nasa.gov/search/granules?p=C2408750690-LPCLOUD

https://earth.jpl.nasa.gov/emit/data/data-portal/coverage-and-forecasts/ 

A quick note: EMIT bands for true color

Red: ~650 nm → Band 29
Green: ~550 nm → Band 19
Blue: ~460 nm → Band 11
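These band numbers depend on the wavelength table of the specific file, so it is safer to look them up than to hard-code them. A small helper; the wavelengths array would come from the netCDF as in the conversion script further down, while here a dummy grid is used just to exercise it:

```python
import numpy as np

def nearest_band(wavelengths, target_nm):
    """Return the 1-based band number whose center is closest to target_nm."""
    return int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm))) + 1

# Dummy 10 nm grid from 400 to 2500 nm, only to demonstrate the lookup
wl = np.linspace(400, 2500, 211)
rgb = [nearest_band(wl, t) for t in (650, 550, 460)]
print(rgb)  # [26, 16, 7] on this dummy grid
```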


The data come in netCDF format but are not georeferenced. The file opens fine in ESA SNAP, but the best option is to convert it to ENVI format so as to keep all the spectral information.

 

At the bottom of the page is the script that performs the conversion. Note that depending on whether the orbit is descending or ascending, the matrix must be rotated by 90 degrees.

The file is georeferenced in the sense that every pixel has its coordinates, but the image is not rotated as it should be.
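A quick check of what the rotation step does to a (bands, rows, cols) cube; the assumption here is that flipping the sign of k in np.rot90 covers the opposite orbit direction:

```python
import numpy as np

cube = np.arange(24).reshape(2, 3, 4)        # (bands, rows, cols)
asc = np.rot90(cube, k=1, axes=(1, 2))       # rotate each band 90 degrees CCW
desc = np.rot90(cube, k=-1, axes=(1, 2))     # opposite rotation for the other orbit
print(asc.shape, desc.shape)  # (2, 4, 3) (2, 4, 3)
```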


 

https://github.com/nasa/EMIT-Data-Resources 

import xarray as xr
import numpy as np
import rasterio
from rasterio.transform import from_origin

ds = xr.open_dataset('grosseto2.nc', engine='netcdf4')
ds_bands = xr.open_dataset('grosseto2.nc', engine='netcdf4', group='sensor_band_parameters')
wavelengths = ds_bands['wavelengths'].values
fwhm = ds_bands['fwhm'].values

gt = ds.attrs['geotransform']
reflectance = ds['reflectance'].values
data = np.transpose(reflectance, (2, 0, 1))
data = np.rot90(data, k=1, axes=(1, 2))

nrows, ncols = data.shape[1], data.shape[2]
nbands = data.shape[0]

transform = from_origin(gt[0], gt[3], gt[1], abs(gt[5]))

with rasterio.open(
    'grosseto2.envi', 'w',
    driver='ENVI',
    height=nrows, width=ncols,
    count=nbands,
    dtype=data.dtype,
    crs='EPSG:4326',
    transform=transform,
) as dst:
    dst.write(data)

# Rewrite the entire .hdr cleanly
with open('grosseto2.hdr', 'w') as hdr:
    hdr.write('ENVI\n')
    hdr.write('description = { EMIT L2A Reflectance - Grosseto }\n')
    hdr.write(f'samples = {ncols}\n')
    hdr.write(f'lines = {nrows}\n')
    hdr.write(f'bands = {nbands}\n')
    hdr.write('header offset = 0\n')
    hdr.write('file type = ENVI Standard\n')
    hdr.write('data type = 4\n')
    hdr.write('interleave = bsq\n')
    hdr.write('byte order = 0\n')
    hdr.write(f'map info = {{Geographic Lat/Lon, 1, 1, {gt[0]}, {gt[3]}, {gt[1]}, {abs(gt[5])}, WGS-84}}\n')
    hdr.write('wavelength units = Nanometers\n')
    hdr.write('wavelength = {\n')
    hdr.write(',\n'.join(f' {w:.6f}' for w in wavelengths))
    hdr.write('}\n')
    hdr.write('fwhm = {\n')
    hdr.write(',\n'.join(f' {f:.6f}' for f in fwhm))
    hdr.write('}\n')
    hdr.write('band names = {\n')
    hdr.write(',\n'.join(f' Band_{i+1}_{w:.2f}nm' for i, w in enumerate(wavelengths)))
    hdr.write('}\n')

print(f"Done! {nbands} bands, {wavelengths[0]:.2f}-{wavelengths[-1]:.2f} nm")



 

 

Mixture-of-Experts Variational Autoencoder

While reading the documentation of the HyperCoast library I found a reference to MoE-VAE. It is a complex neural network based on


Mixture-of-Experts Variational Autoencoder for Clustering and Generating from Similarity-Based Representations on Single Cell Data 
Andreas Kopf, Vincent Fortuin, Vignesh Ram Somnath, Manfred Claassen

https://arxiv.org/abs/1910.07763

which has the advantage of working on large, high-dimensional data such as hyperspectral imagery.

I tried it, with AI assistance, on the Indian Pines data.

Left: ground truth. Right: model result.

 

 

import torch
import torch.nn as nn
import torch.nn.functional as F
import scipy.io as sio
import numpy as np
import matplotlib.pyplot as plt
from torch.utils.data import DataLoader, TensorDataset
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from scipy.signal import medfilt2d

# --- 1. ROBUST DATA LOADING ---
def load_indian_pines():
    data = sio.loadmat('Indian_pines_corrected.mat')['indian_pines_corrected']
    gt = sio.loadmat('Indian_pines_gt.mat')['indian_pines_gt']
    h, w, b = data.shape
    # Conditional cleanup for the 219 index error
    if b > 200:
        ignored_bands = [103, 104, 105, 106, 107, 108, 149, 150, 151, 152,
                         153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163]
        data = np.delete(data, [i for i in ignored_bands if i < b], axis=2)
    x = data.reshape(-1, data.shape[2]).astype(float)
    scaler = StandardScaler()
    x_scaled = scaler.fit_transform(x)
    return torch.tensor(x_scaled, dtype=torch.float32), gt, h, w, data.shape[2]

# --- 2. IMPROVED MODEL ---
class Expert(nn.Module):
    def __init__(self, input_dim, latent_dim):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(input_dim, 128), nn.BatchNorm1d(128), nn.ReLU(),
                                 nn.Linear(128, 64), nn.ReLU())
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, input_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, lv = self.mu(h), self.logvar(h)
        std = torch.exp(0.5 * lv)
        z = mu + torch.randn_like(std) * std
        return self.dec(z), mu, lv

class MoESimVAE(nn.Module):
    def __init__(self, input_dim, latent_dim, num_experts=4):
        super().__init__()
        self.experts = nn.ModuleList([Expert(input_dim, latent_dim) for _ in range(num_experts)])
        self.gate = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU(),
                                  nn.Linear(64, num_experts), nn.Softmax(dim=-1))

    def forward(self, x):
        w = self.gate(x)
        mus, lvs = [], []
        final_recon = 0
        for i, exp in enumerate(self.experts):
            r, m, l = exp(x)
            final_recon += w[:, i].unsqueeze(1) * r
            mus.append(m); lvs.append(l)
        return final_recon, mus, lvs, w

# --- 3. THE "SIM" LOSS ---
def compute_loss(recon, x, mus, lvs, w, sigma=0.5):
    # Reconstruction + KLD
    mse = F.mse_loss(recon, x, reduction='sum')
    kld = 0
    comb_mu = torch.zeros_like(mus[0])
    for i in range(len(mus)):
        kld += w[:, i].mean() * -0.5 * torch.sum(1 + lvs[i] - mus[i].pow(2) - lvs[i].exp())
        comb_mu += w[:, i].unsqueeze(1) * mus[i]

    # RBF Similarity (using a subset for speed/stability)
    subset_idx = torch.randperm(x.size(0))[:64]
    xs, zs = x[subset_idx], comb_mu[subset_idx]
    dist_x = torch.cdist(xs, xs).pow(2)
    dist_z = torch.cdist(zs, zs).pow(2)
    k_x = torch.exp(-dist_x / (2 * sigma**2))
    k_z = torch.exp(-dist_z / (2 * sigma**2))
    sim_loss = F.mse_loss(k_z, k_x) * 50.0  # High weight for Sim
    # Entropy to prevent expert collapse
    entropy = -torch.sum(w.mean(0) * torch.log(w.mean(0) + 1e-8))
    return mse + kld + sim_loss - (0.5 * entropy)

# --- 4. TRAIN AND MAP ---
def main():
    x_tensor, gt, h, w, b = load_indian_pines()
    loader = DataLoader(TensorDataset(x_tensor), batch_size=64, shuffle=True)
    model = MoESimVAE(b, 25)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    print("Training... (Aiming for better separation)")
    for epoch in range(30):
        for batch in loader:
            opt.zero_grad()
            r, m, l, wg = model(batch[0])
            loss = compute_loss(r, batch[0], m, l, wg)
            loss.backward()
            opt.step()

    # Extract & Predict
    model.eval()
    with torch.no_grad():
        _, mus, _, wg = model(x_tensor)
        z = sum(wg[:, i].unsqueeze(1) * mus[i] for i in range(len(mus))).numpy()
    y = gt.ravel()
    labeled = np.where(y > 0)[0]
    clf = SVC(kernel='rbf', C=10)  # RBF kernel for the classifier too
    clf.fit(z[labeled[::10]], y[labeled[::10]])  # Train on 10%
    preds = clf.predict(z).reshape(h, w)
    preds[gt == 0] = 0
    # SPATIAL CLEANING (The secret sauce)
    preds_clean = medfilt2d(preds.astype(float), kernel_size=3)
    preds_clean[gt == 0] = 0

    plt.figure(figsize=(12, 6))
    plt.subplot(1, 2, 1); plt.imshow(gt, cmap='nipy_spectral'); plt.title("Ground Truth")
    plt.subplot(1, 2, 2); plt.imshow(preds_clean, cmap='nipy_spectral'); plt.title("MoE-Sim-VAE (Cleaned)")
    plt.show()

if __name__ == "__main__":
    main()

Aviris vs EnMAP

A comparison between point spectra from the AVIRIS-NG flight over the Grosseto area and EnMAP.

To make the comparison minimally representative, an anthropogenic target that does not change over time and a field that is always bare soil were chosen.

 

Natural target (bare soil)

Anthropogenic target

 

Anthropogenic target

 

 

 

Friday, February 27, 2026

HyperCoast Python Library

HyperCoast is a Python library for reading hyperspectral data formats and visualizing spectra.

It is best used inside Jupyter Lab.

 

import hypercoast

filepath = "ang20210604t105418_rfl_v2z1_img"
ds = hypercoast.read_aviris(filepath)
print(ds)
m = hypercoast.Map()
print(m)
m
m.add_aviris(ds, wavelengths=[1000, 700, 400], vmin=0, vmax=0.2)
m.add("spectral")
display(m)


 

Aviris-NG Grosseto

 

 

 

 
