PAMTRA tutorial

PAMTRA overview

Model framework

The acronym PAMTRA stands for Passive and Active Microwave TRAnsfer. It is a model framework written in FORTRAN90 and Python for the simulation of passive and active RT (including radar Doppler spectra) in a plane-parallel, one-dimensional, horizontally homogeneous atmosphere at microwave frequencies up to 1 THz. With PAMTRA, the user can simulate up- and down-welling radiation at any height and observation angle in the atmosphere. The model framework includes several routines for reading and pre-processing data from various input sources such as cloud-resolving models, profile measurements, or idealized profiles, e.g., designed for sensitivity studies. Thanks to its flexible design, further input sources can easily be added.

[Flow diagram of the PAMTRA FORTRAN core]

The flow diagram shows an overview of the various steps performed in the FORTRAN core of the present model setup. For the simulation, the model needs various inputs (shown in reddish colors):

  • atmospheric state
  • assumption on gaseous absorption
  • assumption on hydrometeor single scattering properties
  • boundary conditions, i.e., cosmic background and surface emissivity

From these assumptions, interaction parameters are generated within various modules (white boxes). These parameters serve as input to the solving routines for the passive and active parts (shown in gray). The passive part produces polarized radiances or TBs (brightness temperatures) for up- and down-welling geometries at arbitrary zenith angles in vertical and horizontal polarization; the active part produces radar Doppler spectra and derived moments such as reflectivity, mean Doppler velocity, skewness, and kurtosis, as well as left and right slopes.

pyPAMTRA adds a Python framework around the FORTRAN core which allows calling PAMTRA directly from Python without using the FORTRAN I/O routines. Consequently, pyPAMTRA is a more user-friendly way to access the PAMTRA model. It includes a collection of supporting routines, e.g., for importing model data or producing graphical output of the simulation results. With pyPAMTRA, PAMTRA can be executed in parallel on local multi-core machines. Furthermore, using Python for I/O and flow control makes it much more convenient to interface PAMTRA with instrument or atmospheric models as well as with post-processing routines.

pyPamtra

The installation of PAMTRA is described in the PAMTRA documentation. Once the program has been compiled and the pyPamtra library is installed within your PYTHONPATH (e.g., $HOME/lib/python/), it can be used. Furthermore, it is helpful to tell PAMTRA where to find its auxiliary data by setting the environment variable PAMTRA_DATADIR.

In [21]:
import sys
sys.path.append("/home/mech/lib/python/") # to find pyPamtra in case you did not install it yourself
import pyPamtra # main pamtra module
import os # import operating system module
# tell pamtra where it finds its auxiliary data if not set before via the environment variable
os.environ['PAMTRA_DATADIR'] = '/net/sever/mech/pamtra/data/'

After importing pyPamtra, PAMTRA is available in your Python environment. The first step is to create a pyPamtra object.

In [22]:
pam = pyPamtra.pyPamtra() # basic empty pyPamtra object with default settings
pam.p
Out[22]:
{'max_nlyrs': 0, 'ngridx': 0, 'ngridy': 0}
In [24]:
pam.nmlSet
Out[24]:
{'active': True,
 'add_obs_height_to_layer': False,
 'conserve_mass_rescale_dsd': True,
 'creator': 'Pamtrauser',
 'data_path': '$PAMTRA_DATADIR',
 'emissivity': 0.6,
 'file_desc': '',
 'gas_mod': 'R98',
 'ground_type': 'L',
 'hydro_adaptive_grid': True,
 'hydro_fullspec': False,
 'hydro_includehydroinrhoair': True,
 'hydro_limit_density_area': True,
 'hydro_softsphere_min_density': 10.0,
 'hydro_threshold': 1e-10,
 'lgas_extinction': True,
 'lhyd_absorption': True,
 'lhyd_emission': True,
 'lhyd_scattering': True,
 'liblapack': True,
 'liq_mod': 'Ell',
 'outpol': 'VH',
 'passive': True,
 'radar_airmotion': False,
 'radar_airmotion_linear_steps': 30,
 'radar_airmotion_model': 'step',
 'radar_airmotion_step_vmin': 0.5,
 'radar_airmotion_vmax': 4.0,
 'radar_airmotion_vmin': -4.0,
 'radar_aliasing_nyquist_interv': 1,
 'radar_attenuation': 'disabled',
 'radar_convolution_fft': True,
 'radar_fwhr_beamwidth_deg': 0.31,
 'radar_integration_time': 1.4,
 'radar_k2': 0.93,
 'radar_max_v': 7.885,
 'radar_min_spectral_snr': 1.2,
 'radar_min_v': -7.885,
 'radar_mode': 'simple',
 'radar_nfft': 256,
 'radar_no_ave': 150,
 'radar_noise_distance_factor': 2.0,
 'radar_npeaks': 1,
 'radar_pnoise0': -32.23,
 'radar_polarisation': 'NN',
 'radar_receiver_miscalibration': 0.0,
 'radar_receiver_uncertainty_std': 0.0,
 'radar_save_noise_corrected_spectra': False,
 'radar_smooth_spectrum': True,
 'radar_use_hildebrand': False,
 'radar_use_wider_peak': False,
 'randomseed': 0,
 'salinity': 33.0,
 'save_psd': False,
 'save_ssp': False,
 'tmatrix_db': 'none',
 'tmatrix_db_path': 'database/',
 'write_nc': True}

Before we can proceed, pamtra needs definitions of hydrometeors. Currently, at least one is required to dimension various arrays. Hydrometeors can either be defined directly:

In [25]:
pam.df.addHydrometeor(("ice", -99., -1, 917., 130., 3.0, 0.684, 2., 3, 1, "mono_cosmo_ice", -99., -99., -99., -99., -99., -99., "mie-sphere", "heymsfield10_particles",0.0))

or by importing the definitions from a file (examples are provided in descriptorfiles/ subdirectory).

pam.df.readFile(descriptorFilename)

Either way, at least one hydrometeor should now be defined.

In [26]:
pam.df.nhydro
Out[26]:
1

Now it is time to provide pamtra some data. The mandatory atmospheric variables are

  • hgt_lev (height levels) or hgt (height of layer means)
  • temp_lev or temp
  • press_lev or press
  • relhum_lev or relhum

The following variables are optional and guessed if not provided: timestamp, lat, lon, lfrac, wind10u, wind10v, hgt_lev, hydro_q, hydro_n, hydro_reff, obs_height
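The difference between the *_lev variables and their plain counterparts is dimensionality: *_lev variables are defined on the nlyr+1 layer boundaries, the plain variables on the nlyr layer means. A minimal sketch of the relationship (plain NumPy with hypothetical values, independent of PAMTRA; how PAMTRA averages internally may differ, this only illustrates the shapes):

```python
import numpy as np

# hypothetical profile on height levels (layer boundaries), nlyr + 1 values
hgt_lev = np.array([0., 1000., 2000., 3000.])   # m
temp_lev = np.array([280., 273., 266., 259.])   # K

# corresponding layer-mean values (nlyr values), here simple midpoints
hgt = 0.5 * (hgt_lev[1:] + hgt_lev[:-1])
temp = 0.5 * (temp_lev[1:] + temp_lev[:-1])

print(hgt)   # [ 500. 1500. 2500.]
print(temp)  # [276.5 269.5 262.5]
```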

These variables need to be stored in a Python dictionary, which is then passed to the createProfile method to create a profile usable by pamtra.

In [31]:
pamData = dict()
pamData["temp"] = [275.,274.,273.]
pamData["relhum"] = [90.,90.,90.]
pamData["hgt"] = [500.,1500.,2500.]
pamData["press"] = [90000.,80000.,70000.]
In [32]:
pam.createProfile(**pamData) # create a pamtra profile.
# Produces a lot of warnings, just telling that many variables have been set to default
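If the flood of default-value warnings is not wanted, it can be silenced with Python's standard warnings module (a generic Python mechanism, not a PAMTRA feature; the warnings.warn call below is just a stand-in for what createProfile emits):

```python
import warnings

with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # suppress all warnings inside this block
    # pam.createProfile(**pamData) would go here; emit a dummy warning instead
    warnings.warn("timestamp set to now", Warning)

print("no warning shown")
```

Outside the with block, the normal warning behavior is restored.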

Another possibility to perform all these steps on standardized input is to use one of the importer methods provided by the pyPamtra package.

pam = pyPamtra.importer.readCosmoDe1MomDataset(...)

pam = pyPamtra.importer.readCosmoDe2MomDataset(...)

pam = pyPamtra.importer.readCosmoReAn2km(...)

pam = pyPamtra.importer.readCosmoReAn6km(...)

pam = pyPamtra.importer.readMesoNH(...)

pam = pyPamtra.importer.readWrfDataset(...)

These importer methods need various parameters, e.g., model output files and the appropriate descriptor file. Which parameters are necessary and accepted can be found out with:

In [ ]:
pyPamtra.importer.readWrfDataset??

The profile used by pamtra is stored in the profile dictionary pam.p. What exactly is in there can be seen and inspected by

In [ ]:
pam.p.keys() # gives a list of all entries of the dictionary
pam.p['temp'] # prints the temperature

The variables can still be changed before we later on perform the simulations.
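Since pam.p behaves like an ordinary dictionary of NumPy arrays of shape (ngridx, ngridy, nlyr), profile variables can be modified in place before the run. A sketch with a plain dict standing in for pam.p (hypothetical values):

```python
import numpy as np

# stand-in for pam.p: dict of (ngridx, ngridy, nlyr) arrays
p = {"temp": np.array([[[275., 274., 273.]]])}

p["temp"][0, 0, :] += 1.0  # e.g., warm the whole column by 1 K
print(p["temp"][0, 0, :])  # [276. 275. 274.]
```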

Before we can run pamtra, it would be good to set some variables so that the model does what we want. What has been set so far can be inspected in the dictionary nmlSet:

In [ ]:
pam.nmlSet

and the settings can be changed via the members of that dictionary, e.g.:

In [33]:
pam.nmlSet['active'] = False # switch off active calculation

To perform a simulation, one of the methods pam.runPamtra() or pam.runParallelPamtra() needs to be called. For the simple runPamtra method, only a frequency (or an array of frequencies) needs to be provided.

In [34]:
pam.runPamtra(89.0)

If everything went fine and we got no errors or warnings, we should now have the results stored in the dictionary pam.r. Like for the profile dictionary pam.p, it can be inspected by pam.r.keys().

In [19]:
pam.r['tb'].shape # gives the shape of the calculated brightness temperature (tb) (nx,ny,noutlevels,nang,nfreq,npol)
Out[19]:
(1, 1, 2, 32, 1, 2)
In [ ]:
pam.dimensions['tb'] # shows the dimensions
In [37]:
pam.r['tb'][0,0,0,0,0].mean(axis=-1)
# brightness temperature for satellite view in mixed polarization mode (mean of v and h)
Out[37]:
254.90436496417502
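The indexing scheme of pam.r['tb'] can be illustrated with a dummy array of the same shape (nx, ny, noutlevels, nang, nfreq, npol); averaging over the last axis mixes the V and H polarizations (the values here are hypothetical):

```python
import numpy as np

tb = np.zeros((1, 1, 2, 32, 1, 2))   # (nx, ny, noutlevels, nang, nfreq, npol)
tb[0, 0, 0, 0, 0] = [254.0, 255.8]   # hypothetical V and H brightness temperatures

# outlevel 0, angle index 0 (180 deg, i.e. satellite view), first frequency
mixed = tb[0, 0, 0, 0, 0].mean(axis=-1)  # mean of V and H
print(mixed)  # 254.9
```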

As a final step, the results can be plotted, further examined, or stored in a netCDF file using a method provided by pamtra:

In [20]:
pam.writeResultsToNetCDF('example.nc')

Some real world examples

Frequency spectrum

In [38]:
from __future__ import division # defines natural divisions like 1./2. = 1/2 = 0.5 and not 1/2 = 0
import sys
sys.path.append("/home/mech/lib/python/") #to find pyPamtra
import pyPamtra  # import pyPamtra
import matplotlib.pyplot as plt  # import plotting modules
import numpy as np # import numpy for arrays, numerical array operations, ....
import pandas as pn # import pandas data analysis library
import os # import operating system module
os.environ['PAMTRA_DATADIR'] = '/met/sever/mech/pamtra/data/'
%matplotlib inline

Create a pyPamtra object

In [2]:
pam = pyPamtra.pyPamtra()

Define the hydrometeor classes. Replace "mie-sphere" with "disabled" to turn off a hydrometeor. Everything is in SI units!

In [3]:
descriptorFile = np.array([
      #['hydro_name' 'as_ratio' 'liq_ice' 'rho_ms' 'a_ms' 'b_ms' 'alpha_as' 'beta_as' 'moment_in' 'nbin' 'dist_name' 'p_1' 'p_2' 'p_3' 'p_4' 'd_1' 'd_2' 'scat_name' 'vel_size_mod' 'canting']
       ('cwc_q', -99.0, 1, -99.0, -99.0, -99.0, -99.0, -99.0, 3, 1, 'mono', -99.0, -99.0, -99.0, -99.0, 2e-05, -99.0, 'mie-sphere', 'khvorostyanov01_drops', -99.0),
       ('iwc_q', -99.0, -1, -99.0, 130.0, 3.0, 0.684, 2.0, 3, 1, 'mono_cosmo_ice', -99.0, -99.0, -99.0, -99.0, -99.0, -99.0, 'mie-sphere', 'heymsfield10_particles', -99.0),
       ('rwc_q', -99.0, 1, -99.0, -99.0, -99.0, -99.0, -99.0, 3, 50, 'exp', -99.0, -99.0, 8000000.0, -99.0, 0.00012, 0.006, 'mie-sphere', 'khvorostyanov01_drops', -99.0),
       ('swc_q', -99.0, -1, -99.0, 0.038, 2.0, 0.3971, 1.88, 3, 50, 'exp_cosmo_snow', -99.0, -99.0, -99.0, -99.0, 5.1e-11, 0.02, 'mie-sphere', 'heymsfield10_particles', -99.0),
       ('gwc_q', -99.0, -1, -99.0, 169.6, 3.1, -99.0, -99.0, 3, 50, 'exp', -99.0, -99.0, 4000000.0, -99.0, 1e-10, 0.01, 'mie-sphere', 'khvorostyanov01_spheres', -99.0)], 
      dtype=[('hydro_name', 'S15'), ('as_ratio', '<f8'), ('liq_ice', '<i8'), ('rho_ms', '<f8'), ('a_ms', '<f8'), ('b_ms', '<f8'), ('alpha_as', '<f8'), ('beta_as', '<f8'), ('moment_in', '<i8'), ('nbin', '<i8'), ('dist_name', 'S15'), ('p_1', '<f8'), ('p_2', '<f8'), ('p_3', '<f8'), ('p_4', '<f8'), ('d_1', '<f8'), ('d_2', '<f8'), ('scat_name', 'S15'), ('vel_size_mod', 'S30'), ('canting', '<f8')]
      )

Table as pandas DataFrame

In [68]:
pn.DataFrame(descriptorFile)
Out[68]:
hydro_name as_ratio liq_ice rho_ms a_ms b_ms alpha_as beta_as moment_in nbin dist_name p_1 p_2 p_3 p_4 d_1 d_2 scat_name vel_size_mod canting
0 cwc_q -99 1 -99 -99.000 -99.0 -99.0000 -99.00 3 1 mono -99 -99 -99 -99 2.000000e-05 -99.000 mie-sphere khvorostyanov01_drops -99
1 iwc_q -99 -1 -99 130.000 3.0 0.6840 2.00 3 1 mono_cosmo_ice -99 -99 -99 -99 -9.900000e+01 -99.000 mie-sphere heymsfield10_particles -99
2 rwc_q -99 1 -99 -99.000 -99.0 -99.0000 -99.00 3 50 exp -99 -99 8000000 -99 1.200000e-04 0.006 mie-sphere khvorostyanov01_drops -99
3 swc_q -99 -1 -99 0.038 2.0 0.3971 1.88 3 50 exp_cosmo_snow -99 -99 -99 -99 5.100000e-11 0.020 mie-sphere heymsfield10_particles -99
4 gwc_q -99 -1 -99 169.600 3.1 -99.0000 -99.00 3 50 exp -99 -99 4000000 -99 1.000000e-10 0.010 mie-sphere khvorostyanov01_spheres -99
In [4]:
for hyd in descriptorFile: pam.df.addHydrometeor(hyd)
#pam.df.readFile('/home/mech/workspace/pamtra/descriptorfiles/descriptor_file.txt')
In [70]:
pam.df.nhydro
Out[70]:
5
In [5]:
pam.readPamtraProfile('/home/mech/workspace/pamtra/profile/example_input.lay')
In [6]:
fig = plt.figure(figsize=[12,4])
fig.add_axes([0.1,0.1,0.25,0.8])
plt.plot(pam.p['press'][0,0,:]/100.,pam.p['hgt'][0,0,:])
plt.ylabel('height [m]')
plt.xlabel('pressure [hPa]')
fig.add_axes([0.425,0.1,0.25,0.8])
plt.plot(pam.p['temp'][0,0,:],pam.p['hgt'][0,0,:])
plt.xlabel('temperature [K]')
fig.add_axes([0.75,0.1,0.25,0.8])
plt.plot(pam.p['relhum'][0,0,:],pam.p['hgt'][0,0,:])
plt.xlabel('rel. humidity [%]')
plt.show()
In [7]:
freqs = np.arange(10.,200.,1.)
freqs
In [8]:
pam.runParallelPamtra(freqs, pp_deltaX=1, pp_deltaY=1, pp_deltaF=10, pp_local_workers="auto")
In [56]:
pam.r["tb"].shape # ngridx,ngridy,noutlevel,nangles,nfreqs,npol
Out[56]:
(2, 2, 2, 32, 190, 2)
In [10]:
plt.figure()
plt.plot(freqs,pam.r["tb"][0,0,0,0,:,0],label='upwelling T$_\mathrm{B}$')
plt.plot(freqs,pam.r["tb"][0,0,1,31,:,0],label='downwelling T$_\mathrm{B}$')
plt.xlabel('frequency [GHz]')
plt.ylabel('brightness temperature [K]')
plt.legend(loc=4)
plt.show()
In [11]:
pam.writeResultsToNetCDF('example_spectrum.nc')

Simulate a radiometer measurement from a radiosonde observation

First load a bunch of libraries

In [1]:
# -*- coding: utf-8 -*-
from __future__ import division
import numpy as np
import matplotlib.pyplot as plt
import pyPamtra
import pandas as pn
import netCDF4
import socket

#%matplotlib inline

read the radiosonde

In [2]:
rsData = netCDF4.Dataset("/home/mech/workspace/pamtra/doc/tutorials/data/rsdata.nc")
In [3]:
rsData.variables.keys()
Out[3]:
[u'time',
 u'latitude',
 u'longitude',
 u'time_of_measurement',
 u'air_pressure',
 u'air_temperture',
 u'dew_point_temperture',
 u'relative_humidity',
 u'saturation_pressure',
 u'water_vapor_pressure',
 u'absolute_humidity',
 u'specific_humidity',
 u'mixing_ratio',
 u'saturation_mixing_ratio',
 u'virtual_temperature',
 u'density',
 u'height',
 u'geopotential_height',
 u'flagarray1']

Create pyPamtra object

In [4]:
pam = pyPamtra.pyPamtra()

As of now, Pamtra needs at least one hydrometeor type; we discuss the details later.

In [5]:
pam.df.addHydrometeor(("ice", -99., -1, 917., 130., 3.0, 0.684, 2., 3, 1, "mono_cosmo_ice", -99., -99., -99., -99., -99., -99., "mie-sphere", "heymsfield10_particles",0.0))
#pam.df.data
#pn.DataFrame(pam.df.data)

Collect the data required by Pamtra

In [6]:
pamData = dict()
pamData["temp"] = rsData.variables["air_temperture"][:] + 273.15
pamData["press"] = rsData.variables["air_pressure"][:]
pamData["relhum"] = rsData.variables["relative_humidity"][:] * 100
pamData["hgt"] = rsData.variables["height"][:] 

Optional data

In [7]:
pamData["lat"] =rsData.variables["latitude"][:]
pamData["lon"] =rsData.variables["longitude"][:]
pamData["lfrac"] = np.array([1])

pass the data to the pamtra object. Note how additional data is added

In [8]:
pam.createProfile(**pamData)
/home/mmaahn/lib/python/pyPamtra/core.py:711: Warning: timestamp set to now
  warnings.warn("timestamp set to now", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:728: Warning: wind10u set to 0
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:728: Warning: wind10v set to 0
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:728: Warning: groundtemp set to nan
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:738: Warning: obs_height set to [833000.0, 0.0]
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:748: Warning: hydro_q set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:748: Warning: hydro_reff set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:748: Warning: hydro_n set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:759: Warning: airturb set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:766: Warning: wind_w set to nan
  warnings.warn(qValue + " set to nan", Warning)

equivalent to: pam.createProfile(temp=pamData["temp"], relhum=pamData["relhum"], ...)

Settings

In [9]:
pam.nmlSet
Out[9]:
{'active': True,
 'add_obs_height_to_layer': False,
 'creator': 'Pamtrauser',
 'data_path': 'data/',
 'emissivity': 0.6,
 'file_desc': '',
 'gas_mod': 'R98',
 'ground_type': 'L',
 'hydro_adaptive_grid': True,
 'hydro_fullspec': False,
 'hydro_includehydroinrhoair': True,
 'hydro_limit_density_area': True,
 'hydro_softsphere_min_density': 10.0,
 'hydro_threshold': 1e-10,
 'lgas_extinction': True,
 'lhyd_absorption': True,
 'lhyd_emission': True,
 'lhyd_scattering': True,
 'liblapack': True,
 'liq_mod': 'Ell',
 'outpol': 'VH',
 'passive': True,
 'radar_airmotion': False,
 'radar_airmotion_linear_steps': 30,
 'radar_airmotion_model': 'step',
 'radar_airmotion_step_vmin': 0.5,
 'radar_airmotion_vmax': 4.0,
 'radar_airmotion_vmin': -4.0,
 'radar_aliasing_nyquist_interv': 1,
 'radar_attenuation': 'disabled',
 'radar_convolution_fft': True,
 'radar_k2': 0.93,
 'radar_max_v': 7.885,
 'radar_min_spectral_snr': 1.2,
 'radar_min_v': -7.885,
 'radar_mode': 'simple',
 'radar_nfft': 256,
 'radar_no_ave': 150,
 'radar_noise_distance_factor': 2.0,
 'radar_npeaks': 1,
 'radar_pnoise0': -32.23,
 'radar_polarisation': 'NN',
 'radar_receiver_miscalibration': 0.0,
 'radar_receiver_uncertainty_std': 0.0,
 'radar_save_noise_corrected_spectra': False,
 'radar_smooth_spectrum': True,
 'radar_use_hildebrand': False,
 'radar_use_wider_peak': False,
 'randomseed': 0,
 'salinity': 33.0,
 'save_psd': False,
 'save_ssp': False,
 'tmatrix_db': 'none',
 'tmatrix_db_path': 'database/',
 'write_nc': True}
In [ ]:
pam.nmlSet["active"] = False

That's it! Run Pamtra.

In [11]:
pam.runPamtra([22.24,51.26])

Explore the results

In [39]:
pam.r["tb"]
Out[39]:
array([[[[[[ 254.90436496,  254.90436496]],

          [[ 254.9112975 ,  254.9112975 ]],

          [[ 254.92793886,  254.92793886]],

          [[ 254.95508172,  254.95508172]],

          [[ 254.99417652,  254.99417652]],

          [[ 255.04748453,  255.04748453]],

          [[ 255.11842565,  255.11842565]],

          [[ 255.21218613,  255.21218613]],

          [[ 255.33681208,  255.33681208]],

          [[ 255.50528679,  255.50528679]],

          [[ 255.73978341,  255.73978341]],

          [[ 256.08128465,  256.08128465]],

          [[ 256.6145309 ,  256.6145309 ]],

          [[ 257.54690519,  257.54690519]],

          [[ 259.55146665,  259.55146665]],

          [[ 266.50307613,  266.50307613]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]],

          [[   2.73      ,    2.73      ]]],


         [[[ 254.44845445,  254.44845445]],

          [[ 254.45211905,  254.45211905]],

          [[ 254.46091855,  254.46091855]],

          [[ 254.47527925,  254.47527925]],

          [[ 254.49598163,  254.49598163]],

          [[ 254.52424514,  254.52424514]],

          [[ 254.56191982,  254.56191982]],

          [[ 254.61182286,  254.61182286]],

          [[ 254.67834851,  254.67834851]],

          [[ 254.76863792,  254.76863792]],

          [[ 254.89500338,  254.89500338]],

          [[ 255.08050498,  255.08050498]],

          [[ 255.37376682,  255.37376682]],

          [[ 255.8976649 ,  255.8976649 ]],

          [[ 257.07746306,  257.07746306]],

          [[ 262.01116045,  262.01116045]],

          [[ 105.08360343,  105.08360343]],

          [[  42.73412   ,   42.73412   ]],

          [[  27.81042699,   27.81042699]],

          [[  21.17481807,   21.17481807]],

          [[  17.45472089,   17.45472089]],

          [[  15.09769277,   15.09769277]],

          [[  13.48934084,   13.48934084]],

          [[  12.33823989,   12.33823989]],

          [[  11.48875581,   11.48875581]],

          [[  10.85059484,   10.85059484]],

          [[  10.36817638,   10.36817638]],

          [[  10.00585459,   10.00585459]],

          [[   9.74021193,    9.74021193]],

          [[   9.55580744,    9.55580744]],

          [[   9.442756  ,    9.442756  ]],

          [[   9.39566165,    9.39566165]]]]]])
In [13]:
print pam.dimensions["tb"]
print pam.p["obs_height"]
print pam.r["angles_deg"]
['gridx', 'gridy', 'outlevels', 'angles', 'frequency', 'passive_npol']
[[[ 833000.       0.]]]
[ 180.          173.02957873  167.23764055  161.49299212  155.76226444
  150.03750098  144.31583116  138.59596486  132.8772348   127.15925967
  121.44180428  115.72471411  110.00788175  104.29122833   98.57469255
   92.85822362   87.14177638   81.42530745   75.70877167   69.99211825
   64.27528589   58.55819572   52.84074033   47.1227652    41.40403514
   35.68416884   29.96249902   24.23773556   18.50700788   12.76235945
    6.97042127    0.        ]
In [14]:
gridx = 0
gridy = 0
outlevel = 1
angle= np.where(pam.r["angles_deg"]==0)[0] # gives the index where angles_deg == 0

ground-based zenith view

In [15]:
pam.r["tb"][gridx,gridy,outlevel,angle].mean(axis=-1)
Out[15]:
array([[  20.59316372,  104.3881451 ]])

satellite nadir view

In [16]:
outlevel = 0
angle= np.where(pam.r["angles_deg"]==180)[0]
pam.r["tb"][gridx,gridy,outlevel,angle].mean(axis=-1)
Out[16]:
array([[ 251.16838524,  255.34652853]])

run pamtra parallel

In [17]:
f_hatpro_Kband = [22.24, 23.04, 23.84, 25.44, 26.24, 27.84, 31.40]
f_hatpro_Vband = [51.26, 52.28, 53.86, 54.94, 56.66, 57.30, 58.00]
pam.runParallelPamtra(f_hatpro_Kband+f_hatpro_Vband,pp_local_workers='auto', pp_deltaF=1, pp_deltaX=0, pp_deltaY=0)
In [18]:
gridx = 0
gridy = 0
outlevel = 1
angle= np.where(pam.r["angles_deg"]==0)[0]
pam.r["tb"][gridx,gridy,outlevel,angle].mean(axis=-1)
Out[18]:
array([[  20.59316372,   20.33581613,   18.48255342,   15.13575237,
          14.19894512,   13.35583705,   13.92878796,  104.3881451 ,
         143.95343503,  237.97479957,  266.87892381,  272.27527794,
         272.7657382 ,  273.06649751]])
In [19]:
plt.plot(pam.r["tb"][gridx,gridy,outlevel,angle].mean(axis=-1).flatten())
plt.xticks(range(len(pam.set["freqs"])),pam.set["freqs"],rotation=90)
plt.show()
In [20]:
pam.writeResultsToNetCDF("pamtra_rs.nc")

Simulate a CloudSat measurement from a COSMO field

In [1]:
from __future__ import division
import numpy as np
import matplotlib.pyplot as plt
import matplotlib
import pandas as pn
import pyPamtra

#%matplotlib inline

define the hydrometeor classes. Replace "mie-sphere" with "disabled" to turn off a specific hydrometeor. Everything is in SI units!

In [2]:
descriptorFile = np.array([
      #['hydro_name' 'as_ratio' 'liq_ice' 'rho_ms' 'a_ms' 'b_ms' 'alpha_as' 'beta_as' 'moment_in' 'nbin' 'dist_name' 'p_1' 'p_2' 'p_3' 'p_4' 'd_1' 'd_2' 'scat_name' 'vel_size_mod' 'canting']
       ('cwc_q', -99.0, 1, -99.0, -99.0, -99.0, -99.0, -99.0, 3, 1, 'mono', -99.0, -99.0, -99.0, -99.0, 2e-05, -99.0, 'mie-sphere', 'khvorostyanov01_drops', -99.0),
       ('iwc_q', -99.0, -1, -99.0, 130.0, 3.0, 0.684, 2.0, 3, 1, 'mono_cosmo_ice', -99.0, -99.0, -99.0, -99.0, -99.0, -99.0, 'mie-sphere', 'heymsfield10_particles', -99.0),
       ('rwc_q', -99.0, 1, -99.0, -99.0, -99.0, -99.0, -99.0, 3, 50, 'exp', -99.0, -99.0, 8000000.0, -99.0, 0.00012, 0.006, 'mie-sphere', 'khvorostyanov01_drops', -99.0),
       ('swc_q', -99.0, -1, -99.0, 0.038, 2.0, 0.3971, 1.88, 3, 50, 'exp_cosmo_snow', -99.0, -99.0, -99.0, -99.0, 5.1e-11, 0.02, 'mie-sphere', 'heymsfield10_particles', -99.0),
       ('gwc_q', -99.0, -1, -99.0, 169.6, 3.1, -99.0, -99.0, 3, 50, 'exp', -99.0, -99.0, 4000000.0, -99.0, 1e-10, 0.01, 'mie-sphere', 'khvorostyanov01_spheres', -99.0)], 
      dtype=[('hydro_name', 'S15'), ('as_ratio', '<f8'), ('liq_ice', '<i8'), ('rho_ms', '<f8'), ('a_ms', '<f8'), ('b_ms', '<f8'), ('alpha_as', '<f8'), ('beta_as', '<f8'), ('moment_in', '<i8'), ('nbin', '<i8'), ('dist_name', 'S15'), ('p_1', '<f8'), ('p_2', '<f8'), ('p_3', '<f8'), ('p_4', '<f8'), ('d_1', '<f8'), ('d_2', '<f8'), ('scat_name', 'S15'), ('vel_size_mod', 'S30'), ('canting', '<f8')]
      )

The table looks better when converted to a DataFrame

In [4]:
pn.DataFrame(descriptorFile)
Out[4]:
hydro_name as_ratio liq_ice rho_ms a_ms b_ms alpha_as beta_as moment_in nbin dist_name p_1 p_2 p_3 p_4 d_1 d_2 scat_name vel_size_mod canting
0 cwc_q -99 1 -99 -99.000 -99.0 -99.0000 -99.00 3 1 mono -99 -99 -99 -99 2.000000e-05 -99.000 mie-sphere khvorostyanov01_drops -99
1 iwc_q -99 -1 -99 130.000 3.0 0.6840 2.00 3 1 mono_cosmo_ice -99 -99 -99 -99 -9.900000e+01 -99.000 mie-sphere heymsfield10_particles -99
2 rwc_q -99 1 -99 -99.000 -99.0 -99.0000 -99.00 3 50 exp -99 -99 8000000 -99 1.200000e-04 0.006 mie-sphere khvorostyanov01_drops -99
3 swc_q -99 -1 -99 0.038 2.0 0.3971 1.88 3 50 exp_cosmo_snow -99 -99 -99 -99 5.100000e-11 0.020 mie-sphere heymsfield10_particles -99
4 gwc_q -99 -1 -99 169.600 3.1 -99.0000 -99.00 3 50 exp -99 -99 4000000 -99 1.000000e-10 0.010 mie-sphere khvorostyanov01_spheres -99

Use the Cosmo import routine to read COSMO netCDF

In [5]:
constantFields = "/home/mech/workspace/pamtra/doc/tutorials/data/cosmo_constant_fields.nc"
fname='/home/mech/workspace/pamtra/doc/tutorials/data/LMK_gop9_test_fields_SynSatMic_201307241200-0900.nc.gz'
pam = pyPamtra.importer.readCosmoDe1MomDataset(fname,"gop_fields_SynSatMic",descriptorFile,colIndex=0,verbosity=1,constantFields=constantFields)
opening 1 of 1 cosmo_constant_fields.nc
LMK_gop9_test_fields_SynSatMic_201307241200-0900.nc.gz
/home/mmaahn/lib/python/pyPamtra/core.py:738: Warning: obs_height set to [833000.0, 0.0]
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:748: Warning: hydro_reff set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:748: Warning: hydro_n set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:759: Warning: airturb set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mmaahn/lib/python/pyPamtra/core.py:766: Warning: wind_w set to nan
  warnings.warn(qValue + " set to nan", Warning)

Settings

In [6]:
#turn off passive calculations
pam.nmlSet["passive"] = False
#used frequencies
frequencies = [94]

#show some messages
pam.set["verbose"] = 0
pam.set["pyVerbose"] = 1

Cut out the CloudSat track and remove the remaining profiles from the pamtra object

In [7]:
filterPam = np.zeros(pam._shape2D,dtype=bool)
filterPam[3:400,240] = True
print pam._shape3D
pam.filterProfiles(filterPam)
print pam._shape3D
(421, 461, 50)
(397, 1, 50)
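The filtering is plain boolean masking: a 2-D mask over the (ngridx, ngridy) grid selects which profiles survive. The shape change above can be reproduced with NumPy alone (no pyPamtra needed):

```python
import numpy as np

shape2D = (421, 461)          # (ngridx, ngridy) of the COSMO field
filterPam = np.zeros(shape2D, dtype=bool)
filterPam[3:400, 240] = True  # one grid column along the track

print(filterPam.sum())        # 397 profiles remain after filtering
```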

Run Pamtra

W-band is attenuated

In [ ]:
pam.nmlSet['radar_attenuation'] = 'top-down'
In [9]:
pam.runParallelPamtra(frequencies, pp_deltaX=1, pp_deltaY=1, pp_deltaF=0, pp_local_workers="auto")
submitting job  0 0 1 0 1 0 1
submitted job:  1
submitting job  1 0 1 1 2 0 1
submitted job:  2
...
submitting job  103 0 1 103 104 0 1
submitted job:  104
submitting job  104 0 1 104 105 0 1
submitted job:  105
submitting job  105 0 1 105 106 0 1
submitted job:  106
submitting job  106 0 1 106 107 0 1
submitted job:  107
submitting job  107 0 1 107 108 0 1
submitted job:  108
submitting job  108 0 1 108 109 0 1
submitted job:  109
submitting job  109 0 1 109 110 0 1
submitted job:  110
submitting job  110 0 1 110 111 0 1
submitted job:  111
submitting job  111 0 1 111 112 0 1
submitted job:  112
submitting job  112 0 1 112 113 0 1
submitted job:  113
submitting job  113 0 1 113 114 0 1
submitted job:  114
submitting job  114 0 1 114 115 0 1
submitted job:  115
submitting job  115 0 1 115 116 0 1
submitted job:  116
submitting job  116 0 1 116 117 0 1
submitted job:  117
submitting job  117 0 1 117 118 0 1
submitted job:  118
submitting job  118 0 1 118 119 0 1
submitted job:  119
submitting job  119 0 1 119 120 0 1
submitted job:  120
submitting job  120 0 1 120 121 0 1
submitted job:  121
submitting job  121 0 1 121 122 0 1
submitted job:  122
submitting job  122 0 1 122 123 0 1
submitted job:  123
submitting job  123 0 1 123 124 0 1
submitted job:  124
submitting job  124 0 1 124 125 0 1
submitted job:  125
submitting job  125 0 1 125 126 0 1
submitted job:  126
submitting job  126 0 1 126 127 0 1
submitted job:  127
submitting job  127 0 1 127 128 0 1
submitted job:  128
submitting job  128 0 1 128 129 0 1
submitted job:  129
submitting job  129 0 1 129 130 0 1
submitted job:  130
submitting job  130 0 1 130 131 0 1
submitted job:  131
submitting job  131 0 1 131 132 0 1
submitted job:  132
submitting job  132 0 1 132 133 0 1
submitted job:  133
submitting job  133 0 1 133 134 0 1
submitted job:  134
submitting job  134 0 1 134 135 0 1
submitted job:  135
submitting job  135 0 1 135 136 0 1
submitted job:  136
submitting job  136 0 1 136 137 0 1
submitted job:  137
submitting job  137 0 1 137 138 0 1
submitted job:  138
submitting job  138 0 1 138 139 0 1
submitted job:  139
submitting job  139 0 1 139 140 0 1
submitted job:  140
submitting job  140 0 1 140 141 0 1
submitted job:  141
submitting job  141 0 1 141 142 0 1
submitted job:  142
submitting job  142 0 1 142 143 0 1
submitted job:  143
submitting job  143 0 1 143 144 0 1
submitted job:  144
submitting job  144 0 1 144 145 0 1
submitted job:  145
submitting job  145 0 1 145 146 0 1
submitted job:  146
submitting job  146 0 1 146 147 0 1
submitted job:  147
submitting job  147 0 1 147 148 0 1
submitted job:  148
submitting job  148 0 1 148 149 0 1
submitted job:  149
submitting job  149 0 1 149 150 0 1
submitted job:  150
submitting job  150 0 1 150 151 0 1
submitted job:  151
submitting job  151 0 1 151 152 0 1
submitted job:  152
submitting job  152 0 1 152 153 0 1
submitted job:  153
submitting job  153 0 1 153 154 0 1
submitted job:  154
submitting job  154 0 1 154 155 0 1
submitted job:  155
submitting job  155 0 1 155 156 0 1
submitted job:  156
submitting job  156 0 1 156 157 0 1
submitted job:  157
submitting job  157 0 1 157 158 0 1
submitted job:  158
submitting job  158 0 1 158 159 0 1
submitted job:  159
submitting job  159 0 1 159 160 0 1
submitted job:  160
submitting job  160 0 1 160 161 0 1
submitted job:  161
submitting job  161 0 1 161 162 0 1
submitted job:  162
submitting job  162 0 1 162 163 0 1
submitted job:  163
submitting job  163 0 1 163 164 0 1
submitted job:  164
submitting job  164 0 1 164 165 0 1
submitted job:  165
submitting job  165 0 1 165 166 0 1
submitted job:  166
submitting job  166 0 1 166 167 0 1
submitted job:  167
submitting job  167 0 1 167 168 0 1
submitted job:  168
submitting job  168 0 1 168 169 0 1
submitted job:  169
submitting job  169 0 1 169 170 0 1
submitted job:  170
submitting job  170 0 1 170 171 0 1
submitted job:  171
submitting job  171 0 1 171 172 0 1
submitted job:  172
submitting job  172 0 1 172 173 0 1
submitted job:  173
submitting job  173 0 1 173 174 0 1
submitted job:  174
submitting job  174 0 1 174 175 0 1
submitted job:  175
submitting job  175 0 1 175 176 0 1
submitted job:  176
submitting job  176 0 1 176 177 0 1
submitted job:  177
submitting job  177 0 1 177 178 0 1
submitted job:  178
submitting job  178 0 1 178 179 0 1
submitted job:  179
submitting job  179 0 1 179 180 0 1
submitted job:  180
submitting job  180 0 1 180 181 0 1
submitted job:  181
submitting job  181 0 1 181 182 0 1
submitted job:  182
submitting job  182 0 1 182 183 0 1
submitted job:  183
submitting job  183 0 1 183 184 0 1
submitted job:  184
submitting job  184 0 1 184 185 0 1
submitted job:  185
submitting job  185 0 1 185 186 0 1
submitted job:  186
submitting job  186 0 1 186 187 0 1
submitted job:  187
submitting job  187 0 1 187 188 0 1
submitted job:  188
submitting job  188 0 1 188 189 0 1
submitted job:  189
submitting job  189 0 1 189 190 0 1
submitted job:  190
submitting job  190 0 1 190 191 0 1
submitted job:  191
submitting job  191 0 1 191 192 0 1
submitted job:  192
submitting job  192 0 1 192 193 0 1
submitted job:  193
submitting job  193 0 1 193 194 0 1
submitted job:  194
submitting job  194 0 1 194 195 0 1
submitted job:  195
submitting job  195 0 1 195 196 0 1
submitted job:  196
submitting job  196 0 1 196 197 0 1
submitted job:  197
submitting job  197 0 1 197 198 0 1
submitted job:  198
submitting job  198 0 1 198 199 0 1
submitted job:  199
submitting job  199 0 1 199 200 0 1
submitted job:  200
submitting job  200 0 1 200 201 0 1
submitted job:  201
submitting job  201 0 1 201 202 0 1
submitted job:  202
submitting job  202 0 1 202 203 0 1
submitted job:  203
submitting job  203 0 1 203 204 0 1
submitted job:  204
submitting job  204 0 1 204 205 0 1
submitted job:  205
submitting job  205 0 1 205 206 0 1
submitted job:  206
submitting job  206 0 1 206 207 0 1
submitted job:  207
submitting job  207 0 1 207 208 0 1
submitted job:  208
submitting job  208 0 1 208 209 0 1
submitted job:  209
submitting job  209 0 1 209 210 0 1
submitted job:  210
submitting job  210 0 1 210 211 0 1
submitted job:  211
submitting job  211 0 1 211 212 0 1
submitted job:  212
submitting job  212 0 1 212 213 0 1
submitted job:  213
submitting job  213 0 1 213 214 0 1
submitted job:  214
submitting job  214 0 1 214 215 0 1
submitted job:  215
submitting job  215 0 1 215 216 0 1
submitted job:  216
submitting job  216 0 1 216 217 0 1
submitted job:  217
submitting job  217 0 1 217 218 0 1
submitted job:  218
submitting job  218 0 1 218 219 0 1
submitted job:  219
submitting job  219 0 1 219 220 0 1
submitted job:  220
submitting job  220 0 1 220 221 0 1
submitted job:  221
submitting job  221 0 1 221 222 0 1
submitted job:  222
submitting job  222 0 1 222 223 0 1
submitted job:  223
submitting job  223 0 1 223 224 0 1
submitted job:  224
submitting job  224 0 1 224 225 0 1
submitted job:  225
submitting job  225 0 1 225 226 0 1
submitted job:  226
submitting job  226 0 1 226 227 0 1
submitted job:  227
submitting job  227 0 1 227 228 0 1
submitted job:  228
submitting job  228 0 1 228 229 0 1
submitted job:  229
submitting job  229 0 1 229 230 0 1
submitted job:  230
submitting job  230 0 1 230 231 0 1
submitted job:  231
submitting job  231 0 1 231 232 0 1
submitted job:  232
submitting job  232 0 1 232 233 0 1
submitted job:  233
submitting job  233 0 1 233 234 0 1
submitted job:  234
submitting job  234 0 1 234 235 0 1
submitted job:  235
submitting job  235 0 1 235 236 0 1
submitted job:  236
submitting job  236 0 1 236 237 0 1
submitted job:  237
submitting job  237 0 1 237 238 0 1
submitted job:  238
submitting job  238 0 1 238 239 0 1
submitted job:  239
submitting job  239 0 1 239 240 0 1
submitted job:  240
submitting job  240 0 1 240 241 0 1
submitted job:  241
submitting job  241 0 1 241 242 0 1
submitted job:  242
submitting job  242 0 1 242 243 0 1
submitted job:  243
submitting job  243 0 1 243 244 0 1
submitted job:  244
submitting job  244 0 1 244 245 0 1
submitted job:  245
submitting job  245 0 1 245 246 0 1
submitted job:  246
submitting job  246 0 1 246 247 0 1
submitted job:  247
submitting job  247 0 1 247 248 0 1
submitted job:  248
submitting job  248 0 1 248 249 0 1
submitted job:  249
submitting job  249 0 1 249 250 0 1
submitted job:  250
submitting job  250 0 1 250 251 0 1
submitted job:  251
submitting job  251 0 1 251 252 0 1
submitted job:  252
submitting job  252 0 1 252 253 0 1
submitted job:  253
submitting job  253 0 1 253 254 0 1
submitted job:  254
submitting job  254 0 1 254 255 0 1
submitted job:  255
submitting job  255 0 1 255 256 0 1
submitted job:  256
submitting job  256 0 1 256 257 0 1
submitted job:  257
submitting job  257 0 1 257 258 0 1
submitted job:  258
submitting job  258 0 1 258 259 0 1
submitted job:  259
submitting job  259 0 1 259 260 0 1
submitted job:  260
submitting job  260 0 1 260 261 0 1
submitted job:  261
submitting job  261 0 1 261 262 0 1
submitted job:  262
submitting job  262 0 1 262 263 0 1
submitted job:  263
submitting job  263 0 1 263 264 0 1
submitted job:  264
submitting job  264 0 1 264 265 0 1
submitted job:  265
submitting job  265 0 1 265 266 0 1
submitted job:  266
submitting job  266 0 1 266 267 0 1
submitted job:  267
submitting job  267 0 1 267 268 0 1
submitted job:  268
submitting job  268 0 1 268 269 0 1
submitted job:  269
submitting job  269 0 1 269 270 0 1
submitted job:  270
submitting job  270 0 1 270 271 0 1
submitted job:  271
submitting job  271 0 1 271 272 0 1
submitted job:  272
submitting job  272 0 1 272 273 0 1
submitted job:  273
submitting job  273 0 1 273 274 0 1
submitted job:  274
submitting job  274 0 1 274 275 0 1
submitted job:  275
submitting job  275 0 1 275 276 0 1
submitted job:  276
submitting job  276 0 1 276 277 0 1
submitted job:  277
submitting job  277 0 1 277 278 0 1
submitted job:  278
submitting job  278 0 1 278 279 0 1
submitted job:  279
submitting job  279 0 1 279 280 0 1
submitted job:  280
submitting job  280 0 1 280 281 0 1
submitted job:  281
submitting job  281 0 1 281 282 0 1
submitted job:  282
submitting job  282 0 1 282 283 0 1
submitted job:  283
submitting job  283 0 1 283 284 0 1
submitted job:  284
submitting job  284 0 1 284 285 0 1
submitted job:  285
submitting job  285 0 1 285 286 0 1
submitted job:  286
submitting job  286 0 1 286 287 0 1
submitted job:  287
submitting job  287 0 1 287 288 0 1
submitted job:  288
submitting job  288 0 1 288 289 0 1
submitted job:  289
submitting job  289 0 1 289 290 0 1
submitted job:  290
submitting job  290 0 1 290 291 0 1
submitted job:  291
submitting job  291 0 1 291 292 0 1
submitted job:  292
submitting job  292 0 1 292 293 0 1
submitted job:  293
submitting job  293 0 1 293 294 0 1
submitted job:  294
submitting job  294 0 1 294 295 0 1
submitted job:  295
submitting job  295 0 1 295 296 0 1
submitted job:  296
submitting job  296 0 1 296 297 0 1
submitted job:  297
submitting job  297 0 1 297 298 0 1
submitted job:  298
submitting job  298 0 1 298 299 0 1
submitted job:  299
submitting job  299 0 1 299 300 0 1
submitted job:  300
submitting job  300 0 1 300 301 0 1
submitted job:  301
submitting job  301 0 1 301 302 0 1
submitted job:  302
submitting job  302 0 1 302 303 0 1
submitted job:  303
submitting job  303 0 1 303 304 0 1
submitted job:  304
submitting job  304 0 1 304 305 0 1
submitted job:  305
submitting job  305 0 1 305 306 0 1
submitted job:  306
submitting job  306 0 1 306 307 0 1
submitted job:  307
submitting job  307 0 1 307 308 0 1
submitted job:  308
submitting job  308 0 1 308 309 0 1
submitted job:  309
submitting job  309 0 1 309 310 0 1
submitted job:  310
submitting job  310 0 1 310 311 0 1
submitted job:  311
submitting job  311 0 1 311 312 0 1
submitted job:  312
submitting job  312 0 1 312 313 0 1
submitted job:  313
submitting job  313 0 1 313 314 0 1
submitted job:  314
submitting job  314 0 1 314 315 0 1
submitted job:  315
submitting job  315 0 1 315 316 0 1
submitted job:  316
submitting job  316 0 1 316 317 0 1
submitted job:  317
submitting job  317 0 1 317 318 0 1
submitted job:  318
submitting job  318 0 1 318 319 0 1
submitted job:  319
submitting job  319 0 1 319 320 0 1
submitted job:  320
submitting job  320 0 1 320 321 0 1
submitted job:  321
submitting job  321 0 1 321 322 0 1
submitted job:  322
submitting job  322 0 1 322 323 0 1
submitted job:  323
submitting job  323 0 1 323 324 0 1
submitted job:  324
submitting job  324 0 1 324 325 0 1
submitted job:  325
submitting job  325 0 1 325 326 0 1
submitted job:  326
submitting job  326 0 1 326 327 0 1
submitted job:  327
submitting job  327 0 1 327 328 0 1
submitted job:  328
submitting job  328 0 1 328 329 0 1
submitted job:  329
submitting job  329 0 1 329 330 0 1
submitted job:  330
submitting job  330 0 1 330 331 0 1
submitted job:  331
submitting job  331 0 1 331 332 0 1
submitted job:  332
submitting job  332 0 1 332 333 0 1
submitted job:  333
submitting job  333 0 1 333 334 0 1
submitted job:  334
submitting job  334 0 1 334 335 0 1
submitted job:  335
submitting job  335 0 1 335 336 0 1
submitted job:  336
submitting job  336 0 1 336 337 0 1
submitted job:  337
submitting job  337 0 1 337 338 0 1
submitted job:  338
submitting job  338 0 1 338 339 0 1
submitted job:  339
submitting job  339 0 1 339 340 0 1
submitted job:  340
submitting job  340 0 1 340 341 0 1
submitted job:  341
submitting job  341 0 1 341 342 0 1
submitted job:  342
submitting job  342 0 1 342 343 0 1
submitted job:  343
submitting job  343 0 1 343 344 0 1
submitted job:  344
submitting job  344 0 1 344 345 0 1
submitted job:  345
submitting job  345 0 1 345 346 0 1
submitted job:  346
submitting job  346 0 1 346 347 0 1
submitted job:  347
submitting job  347 0 1 347 348 0 1
submitted job:  348
submitting job  348 0 1 348 349 0 1
submitted job:  349
submitting job  349 0 1 349 350 0 1
submitted job:  350
submitting job  350 0 1 350 351 0 1
submitted job:  351
submitting job  351 0 1 351 352 0 1
submitted job:  352
submitting job  352 0 1 352 353 0 1
submitted job:  353
submitting job  353 0 1 353 354 0 1
submitted job:  354
submitting job  354 0 1 354 355 0 1
submitted job:  355
submitting job  355 0 1 355 356 0 1
submitted job:  356
submitting job  356 0 1 356 357 0 1
submitted job:  357
submitting job  357 0 1 357 358 0 1
submitted job:  358
submitting job  358 0 1 358 359 0 1
submitted job:  359
submitting job  359 0 1 359 360 0 1
submitted job:  360
submitting job  360 0 1 360 361 0 1
submitted job:  361
submitting job  361 0 1 361 362 0 1
submitted job:  362
submitting job  362 0 1 362 363 0 1
submitted job:  363
submitting job  363 0 1 363 364 0 1
submitted job:  364
submitting job  364 0 1 364 365 0 1
submitted job:  365
submitting job  365 0 1 365 366 0 1
submitted job:  366
submitting job  366 0 1 366 367 0 1
submitted job:  367
submitting job  367 0 1 367 368 0 1
submitted job:  368
submitting job  368 0 1 368 369 0 1
submitted job:  369
submitting job  369 0 1 369 370 0 1
submitted job:  370
submitting job  370 0 1 370 371 0 1
submitted job:  371
submitting job  371 0 1 371 372 0 1
submitted job:  372
submitting job  372 0 1 372 373 0 1
submitted job:  373
submitting job  373 0 1 373 374 0 1
submitted job:  374
submitting job  374 0 1 374 375 0 1
submitted job:  375
submitting job  375 0 1 375 376 0 1
submitted job:  376
submitting job  376 0 1 376 377 0 1
submitted job:  377
submitting job  377 0 1 377 378 0 1
submitted job:  378
submitting job  378 0 1 378 379 0 1
submitted job:  379
submitting job  379 0 1 379 380 0 1
submitted job:  380
submitting job  380 0 1 380 381 0 1
submitted job:  381
submitting job  381 0 1 381 382 0 1
submitted job:  382
submitting job  382 0 1 382 383 0 1
submitted job:  383
submitting job  383 0 1 383 384 0 1
submitted job:  384
submitting job  384 0 1 384 385 0 1
submitted job:  385
submitting job  385 0 1 385 386 0 1
submitted job:  386
submitting job  386 0 1 386 387 0 1
submitted job:  387
submitting job  387 0 1 387 388 0 1
submitted job:  388
submitting job  388 0 1 388 389 0 1
submitted job:  389
submitting job  389 0 1 389 390 0 1
submitted job:  390
submitting job  390 0 1 390 391 0 1
submitted job:  391
submitting job  391 0 1 391 392 0 1
submitted job:  392
submitting job  392 0 1 392 393 0 1
submitted job:  393
submitting job  393 0 1 393 394 0 1
submitted job:  394
submitting job  394 0 1 394 395 0 1
submitted job:  395
submitting job  395 0 1 395 396 0 1
submitted job:  396
submitting job  396 0 1 396 397 0 1
submitted job:  397
waiting for all jobs to finish
got job 1
got job 2
got job 3
got job 4
got job 5
got job 6
got job 7
got job 8
got job 9
got job 10
got job 11
got job 12
got job 13
got job 14
got job 15
got job 16
got job 17
got job 18
got job 19
got job 20
got job 21
got job 22
got job 23
got job 24
got job 25
got job 26
got job 27
got job 28
got job 29
got job 30
got job 31
got job 32
got job 33
got job 34
got job 35
got job 36
got job 37
got job 38
got job 39
got job 40
got job 41
got job 42
got job 43
got job 44
got job 45
got job 46
got job 47
got job 48
got job 49
got job 50
got job 51
got job 52
got job 53
got job 54
got job 55
got job 56
got job 57
got job 58
got job 59
got job 60
got job 61
got job 62
got job 63
got job 64
got job 65
got job 66
got job 67
got job 68
got job 69
got job 70
got job 71
got job 72
got job 73
got job 74
got job 75
got job 76
got job 77
got job 78
got job 79
got job 80
got job 81
got job 82
got job 83
got job 84
got job 85
got job 86
got job 87
got job 88
got job 89
got job 90
got job 91
got job 92
got job 93
got job 94
got job 95
got job 96
got job 97
got job 98
got job 99
got job 100
got job 101
got job 102
got job 103
got job 104
got job 105
got job 106
got job 107
got job 108
got job 109
got job 110
got job 111
got job 112
got job 113
got job 114
got job 115
got job 116
got job 117
got job 118
got job 119
got job 120
got job 121
got job 122
got job 123
got job 124
got job 125
got job 126
got job 127
got job 128
got job 129
got job 130
got job 131
got job 132
got job 133
got job 134
got job 135
got job 136
got job 137
got job 138
got job 139
got job 140
got job 141
got job 142
got job 143
got job 144
got job 145
got job 146
got job 147
got job 148
got job 149
got job 150
got job 151
got job 152
got job 153
got job 154
got job 155
got job 156
got job 157
got job 158
got job 159
got job 160
got job 161
got job 162
got job 163
got job 164
got job 165
got job 166
got job 167
got job 168
got job 169
got job 170
got job 171
got job 172
got job 173
got job 174
got job 175
got job 176
got job 177
got job 178
got job 179
got job 180
got job 181
got job 182
got job 183
got job 184
got job 185
got job 186
got job 187
got job 188
got job 189
got job 190
got job 191
got job 192
got job 193
got job 194
got job 195
got job 196
got job 197
got job 198
got job 199
got job 200
got job 201
got job 202
got job 203
got job 204
got job 205
got job 206
got job 207
got job 208
got job 209
got job 210
got job 211
got job 212
got job 213
got job 214
got job 215
got job 216
got job 217
got job 218
got job 219
got job 220
got job 221
got job 222
got job 223
got job 224
got job 225
got job 226
got job 227
got job 228
got job 229
got job 230
got job 231
got job 232
got job 233
got job 234
got job 235
got job 236
got job 237
got job 238
got job 239
got job 240
got job 241
got job 242
got job 243
got job 244
got job 245
got job 246
got job 247
got job 248
got job 249
got job 250
got job 251
got job 252
got job 253
got job 254
got job 255
got job 256
got job 257
got job 258
got job 259
got job 260
got job 261
got job 262
got job 263
got job 264
got job 265
got job 266
got job 267
got job 268
got job 269
got job 270
got job 271
got job 272
got job 273
got job 274
got job 275
got job 276
got job 277
got job 278
got job 279
got job 280
got job 281
got job 282
got job 283
got job 284
got job 285
got job 286
got job 287
got job 288
got job 289
got job 290
got job 291
got job 292
got job 293
got job 294
got job 295
got job 296
got job 297
got job 298
got job 299
got job 300
got job 301
got job 302
got job 303
got job 304
got job 305
got job 306
got job 307
got job 308
got job 309
got job 310
got job 311
got job 312
got job 313
got job 314
got job 315
got job 316
got job 317
got job 318
got job 319
got job 320
got job 321
got job 322
got job 323
got job 324
got job 325
got job 326
got job 327
got job 328
got job 329
got job 330
got job 331
got job 332
got job 333
got job 334
got job 335
got job 336
got job 337
got job 338
got job 339
got job 340
got job 341
got job 342
got job 343
got job 344
got job 345
got job 346
got job 347
got job 348
got job 349
got job 350
got job 351
got job 352
got job 353
got job 354
got job 355
got job 356
got job 357
got job 358
got job 359
got job 360
got job 361
got job 362
got job 363
got job 364
got job 365
got job 366
got job 367
got job 368
got job 369
got job 370
got job 371
got job 372
got job 373
got job 374
got job 375
got job 376
got job 377
got job 378
got job 379
got job 380
got job 381
got job 382
got job 383
got job 384
got job 385
got job 386
got job 387
got job 388
got job 389
got job 390
got job 391
got job 392
got job 393
got job 394
got job 395
got job 396
got job 397
pyPamtra runtime: 7.02497291565
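The submit/collect log above comes from running PAMTRA in parallel, one chunk of profiles per job. As a purely illustrative sketch (not pyPamtra's actual implementation), the same dispatch pattern can be reproduced with Python's standard library, here with a dummy worker:

```python
from concurrent.futures import ThreadPoolExecutor

def run_chunk(job_id):
    # stand-in for simulating one chunk of profiles in a worker
    return job_id

job_ids = list(range(1, 398))                      # 397 jobs, as in the log above
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_chunk, job_ids))   # submit all jobs, collect in order

print("got %d jobs" % len(results))
```

pyPamtra uses processes rather than threads for the FORTRAN core, but the submit-then-collect flow is the same.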

Store the results

In [11]:
outName = "cosmo_field_20130724_94.nc"
pam.writeResultsToNetCDF(outName, ncForm='NETCDF3_CLASSIC')
cosmo_field_20130724_94.nc written
In [12]:
pam.r["Ze"]
Out[12]:
array([[[[[[-9999.]]],


         [[[-9999.]]],


         ..., 
         [[[-9999.]]]]],




       ..., 
       [[[[[-9999.]]],


         [[[-9999.]]],


         ..., 
         [[[-9999.]]]]]])
In [13]:
pam.dimensions["Ze"]
Out[13]:
['gridx', 'gridy', 'lyr', 'frequency', 'radar_npol', 'radar_npeaks']
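The `-9999.` entries in `Ze` are fill values for missing returns (e.g. clear sky). Before plotting or computing statistics it helps to mask them; a minimal numpy sketch with a toy stand-in array (assuming the fill value is exactly -9999):

```python
import numpy as np

Ze = np.array([[-9999., -9999.], [12.5, -9999.]])   # toy stand-in for pam.r["Ze"]
Ze_masked = np.ma.masked_values(Ze, -9999.)         # hide fill values

print(Ze_masked.count())   # number of valid radar gates
```

`plt.pcolormesh` skips masked cells, so the fill values no longer distort the color scale.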
In [14]:
frequency = 0
radar_npol = 0
gridy = 0
radar_npeak = 0
Ze = pam.r["Ze"][:,gridy,:,frequency,radar_npol,radar_npeak]
H = pam.p["hgt"][:,gridy]
lon = pam.p["lon"][:,gridy]
In [15]:
matplotlib.rcParams['figure.figsize'] = (10.0, 8.0)
plt.pcolormesh(lon,H.T,Ze.T,vmin=-50,vmax=30)
plt.ylim(0,np.max(H))
plt.xlim(np.min(lon),np.max(lon))
plt.xlabel("longitude [deg]")
plt.ylabel("height [m]")
plt.colorbar(label="Ze [dBz]")
plt.show()
Out[15]:
<matplotlib.colorbar.Colorbar at 0x7f165c3c2a90>

PAMTRA on Oberschleissheim radiosonde in comparison to HALO EMV flight observations

Read in the radiosonde data. The file has been modified to make it easier to read.

In [10]:
import numpy as np
rsData = np.genfromtxt('data/oberschleissheim_2016071912.txt',comments='#',skip_header=3,names=True)
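With `names=True`, `np.genfromtxt` turns the header row into field names, so columns can be accessed by name afterwards. A self-contained sketch with an in-memory file mimicking the assumed column layout of the modified sounding file (the real file additionally needs `comments='#'` and `skip_header=3`):

```python
import io
import numpy as np

# Toy sounding mimicking the assumed columns of the modified file.
txt = io.StringIO(
    "PRES HGHT TEMP RELH\n"
    "950.0 489.0 20.0 55.0\n"
    "900.0 988.0 16.4 60.0\n"
)
data = np.genfromtxt(txt, names=True)

print(data.dtype.names)   # field names taken from the header row
print(data["TEMP"])       # column access by name
```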

Import pyPamtra and create a pyPamtra object.

In [11]:
import pyPamtra
pam = pyPamtra.pyPamtra()

It is mandatory to add a hydrometeor, even though this might be a clear-sky profile.

In [12]:
pam.df.addHydrometeor(("ice", -99., -1, 917., 130., 3.0, 0.684, 2., 3, 1, "mono_cosmo_ice", -99., -99., -99., -99., -99., -99., "mie-sphere", "heymsfield10_particles",0.0))

Put the radiosonde data in a dictionary.

In [13]:
pamData = dict()
pamData["temp"] = rsData["TEMP"][:] + 273.15
pamData["press"] = rsData["PRES"][:] *100.
pamData["relhum"] = rsData["RELH"][:]
pamData["hgt"] = rsData["HGHT"][:] 
In [14]:
pamData["lat"] = np.array([48.25])
pamData["lon"] = np.array([11.55])
pamData["lfrac"] = np.array([1])
In [15]:
pam.createProfile(**pamData)
/home/mech/lib/python/pyPamtra/core.py:752: Warning: timestamp set to now
  warnings.warn("timestamp set to now", Warning)
/home/mech/lib/python/pyPamtra/core.py:769: Warning: wind10u set to 0
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mech/lib/python/pyPamtra/core.py:769: Warning: wind10v set to 0
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mech/lib/python/pyPamtra/core.py:769: Warning: groundtemp set to nan
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mech/lib/python/pyPamtra/core.py:779: Warning: obs_height set to [833000.0, 0.0]
  warnings.warn("%s set to %s"%(environment,preset,), Warning)
/home/mech/lib/python/pyPamtra/core.py:789: Warning: hydro_q set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mech/lib/python/pyPamtra/core.py:789: Warning: hydro_reff set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mech/lib/python/pyPamtra/core.py:789: Warning: hydro_n set to 0
  warnings.warn(qValue + " set to 0", Warning)
/home/mech/lib/python/pyPamtra/core.py:800: Warning: airturb set to nan
  warnings.warn(qValue + " set to nan", Warning)
/home/mech/lib/python/pyPamtra/core.py:800: Warning: wind_w set to nan
  warnings.warn(qValue + " set to nan", Warning)
/home/mech/lib/python/pyPamtra/core.py:800: Warning: wind_uv set to nan
  warnings.warn(qValue + " set to nan", Warning)
/home/mech/lib/python/pyPamtra/core.py:800: Warning: turb_edr set to nan
  warnings.warn(qValue + " set to nan", Warning)
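The warnings above only report which environment fields were missing and which defaults were filled in. Assuming `createProfile` accepts the keys named in the warnings (as their names suggest), the defaults can be overridden by adding the fields to `pamData` up front; a numpy-only sketch with hypothetical surface values:

```python
import numpy as np

pamData = {}
# Hypothetical surface values; key names follow the warnings above.
pamData["wind10u"] = np.array([2.0])        # 10 m zonal wind [m/s]
pamData["wind10v"] = np.array([-1.0])       # 10 m meridional wind [m/s]
pamData["groundtemp"] = np.array([293.15])  # surface temperature [K]

print(sorted(pamData.keys()))
```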
In [16]:
pam.p.keys()
Out[16]:
['wind_uv',
 'unixtime',
 'nlyrs',
 'press',
 'groundtemp',
 'turb_edr',
 'obs_height',
 'wind_w',
 'hgt',
 'lon',
 'ngridy',
 'hydro_reff',
 'lfrac',
 'radar_prop',
 'wind10u',
 'model_i',
 'model_j',
 'lat',
 'relhum',
 'ngridx',
 'hgt_lev',
 'hydro_q',
 'temp',
 'airturb',
 'max_nlyrs',
 'noutlevels',
 'wind10v',
 'hydro_n']
In [18]:
pam.runPamtra(22.24)
In [19]:
pam.r.keys()
Out[19]:
['psd_n',
 'radar_edges',
 'psd_d',
 'pamtraVersion',
 'radar_vel',
 'radar_hgt',
 'radar_slopes',
 'radar_snr',
 'radar_moments',
 'radar_spectra',
 'psd_deltad',
 'pamtraHash',
 'tb',
 'Att_hydro',
 'angles_deg',
 'scatter_matrix',
 'psd_mass',
 'Att_atmo',
 'extinct_matrix',
 'nmlSettings',
 'radar_quality',
 'Ze',
 'psd_area',
 'kextatmo',
 'emis_vector']
In [20]:
pam.r['tb']
Out[20]:
array([[[[[[ 283.10654585,  283.10654585]],

          [[ 283.12558923,  283.12558923]],

          [[ 283.17096573,  283.17096573]],

          [[ 283.24396253,  283.24396253]],

          [[ 283.34691002,  283.34691002]],

          [[ 283.4831638 ,  283.4831638 ]],

          [[ 283.65723239,  283.65723239]],

          [[ 283.87485738,  283.87485738]],

          [[ 284.1428364 ,  284.1428364 ]],

          [[ 284.46796884,  284.46796884]],

          [[ 284.85320099,  284.85320099]],

          [[ 285.28455155,  285.28455155]],

          [[ 285.68480727,  285.68480727]],

          [[ 285.72810837,  285.72810837]],

          [[ 283.92959878,  283.92959878]],

          [[ 271.88860653,  271.88860653]],

          [[   2.73      ,    2.73      ]],

          ..., 
          [[   2.73      ,    2.73      ]]],


         [[[ 283.14907854,  283.14907854]],

          [[ 283.17314092,  283.17314092]],

          [[ 283.23081083,  283.23081083]],

          [[ 283.32459767,  283.32459767]],

          [[ 283.45908312,  283.45908312]],

          [[ 283.6413261 ,  283.6413261 ]],

          [[ 283.88182844,  283.88182844]],

          [[ 284.19617087,  284.19617087]],

          [[ 284.60783842,  284.60783842]],

          [[ 285.15330609,  285.15330609]],

          [[ 285.89172426,  285.89172426]],

          [[ 286.92472337,  286.92472337]],

          [[ 288.4405395 ,  288.4405395 ]],

          [[ 290.82160679,  290.82160679]],

          [[ 294.90320623,  294.90320623]],

          [[ 301.3831694 ,  301.3831694 ]],

          [[ 285.02336855,  285.02336855]],

          [[ 200.98370805,  200.98370805]],

          [[ 148.04872451,  148.04872451]],

          [[ 117.16817282,  117.16817282]],

          [[  97.509186  ,   97.509186  ]],

          [[  84.1119266 ,   84.1119266 ]],

          [[  74.5351371 ,   74.5351371 ]],

          [[  67.46075999,   67.46075999]],

          [[  62.12166741,   62.12166741]],

          [[  58.0448109 ,   58.0448109 ]],

          [[  54.92561081,   54.92561081]],

          [[  52.56199942,   52.56199942]],

          [[  50.81777763,   50.81777763]],

          [[  49.60139772,   49.60139772]],

          [[  48.85343958,   48.85343958]],

          [[  48.54135894,   48.54135894]]]]]])