HDFWriter create metadata if not given, update setup files
jespinoza
r1339:2c655c5faa65 v3.0.0b4
@@ -1,143 +1,141
# CHANGELOG

## 3.0.0

* Python 3.x & 2.x compatible
* New architecture with multiprocessing support
* Add @MPDecorator for multiprocessing Operations (Plots, Writers and Publishers)
* Added new type of operation `external` for non-locking operations
* New plotting architecture with buffering/throttle capabilities to speed up plots
* Clean controller to optimize scripts (format & optype are no longer required)
* Replace `ParamReader` and `ParamWriter` with new flexible `HDFReader` and `HDFWriter`
* New GUI with dynamic load of Units and Operations (uses the Kivy framework)
* Clean code

## 2.3

* Added support for Madrigal formats (reading/writing).
* Added support for reading BLTR parameters (*.sswma).
* Added support for reading Julia format (*.dat).
* Added higher-order function `MPProject` for multiprocessing scripts.
* Added two new Processing Units, `PublishData` and `ReceiverData`, for receiving and sending dataOut through multiple transports (tcp, ipc, inproc).
* Added a new graphics Processing Unit, `PlotterReceiver`. It is decoupled from the normal processing sequence and supports data generated by multiprocessing scripts.
* Added support for sending realtime graphics to a web server.
* GUI command `schain` is now `schainGUI`.
* Added a CLI tool named `schain`.
* Script templates can now be generated with `schain generate`.
* It is now possible to search Processing Units and Operations with `schain search [module]` to get the right name and its allowed parameters.
* `schain xml` to run xml scripts.
* Added suggestions when parameters are misspelled.
* `Controller.start()` now runs in a different process than the process calling it.
* Added `schainpy.utils.log` for log standardization.
* Running a script in online mode no longer ignores date and hour. Issue #1109.
* Added support for receiving voltage data directly from JARS (tcp, ipc).
* Updated README for macOS GUI installation.
* Setup now installs numpy.

## 2.2.6

* Graphics generated by the GUI are now the same as those generated by scripts. Issue #1074.
* Added support for C extensions.
* Function `hildebrand_sehkon` optimized with a C wrapper.
* Numpy version updated.
* Migration to GIT.

## 2.2.5

* splitProfiles and combineProfiles modules were added to VoltageProc and the Signal Chain GUI.
* nProfiles of USRP data (hdf5) is the number of profiles there are in one second.
* jroPlotter works directly with data objects instead of dictionaries.
* Script "schain" was added to the Signal Chain installer.

## 2.2.4.1

* jroIO_usrp.py is updated to read Sandra's data.
* Decimation in Spectra and RTI plots is always enabled.
* Time window option added to GUI.

## 2.2.4

* jroproc_spectra_lags.py added to schainpy.
* Bug fixed in schainGUI: ProcUnit was created with the same id in some cases.
* Bug fixed in jroHeaderIO: Header size validation.

## 2.2.3.1

* Filtering block by time has been added.
* Bug fixed plotting RTI, CoherenceMap and others using xmin and xmax parameters. The first day worked properly but the next days did not.

## 2.2.3

* Bug fixed in GUI: Error getting (reading) Code value.
* Bug fixed in jrodata: when one branch modified a value in "dataOut" (example: dataOut.code) the value was modified for every branch (because it was a reference). It was fixed in data.copy().
* Bug fixed in GUI: Flip option always needs channelList field.
* Bug fixed in jroproc_voltage.profileSelector(): rangeList replaced by profileRangeList.

## 2.2.2

* VoltageProc: ProfileSelector, Reshape, Decoder with nTxs != 1 and getblock=True were tested.
* Rawdata and testRawdata.py added to the Signal Chain project.

## 2.2.1

* Bugs fixed in GUI.
* Views were improved in GUI.
* Support for MST-ISR experiments.
* Bug fixed getting noise using Hildebrand (minimum number of points > 20%).
* handleError added to jroplotter.py.

## 2.2.0

* GUI: use of external plotter.
* Compatible with matplotlib 1.5.0.

## 2.1.5

* serializer module added to Signal Chain.
* jroplotter.py added to Signal Chain.

## 2.1.4.2

* A new Plotter class was added.
* Project.start() no longer accepts a filename as a parameter.

## 2.1.4.1

* Send notifications when an error other than ValueError is detected.

## 2.1.4

* Sending error notifications to the signal chain administrator.
* Login to email server added.

## 2.1.3.3

* Colored button icons were added to GUI.

## 2.1.3.2

* GUI: user interaction enhanced.
* controller_api.py: Safe access to ControllerThread.

## 2.1.3.1

* GUI: every icon was resized.
* jroproc_voltage.py: Print a message when the "Read from code" option is selected and the code is not defined inside the data file.

## 2.1.3

* jroplot_heispectra.py: SpectraHeisScope was not showing the right channels.
* jroproc_voltage.py: Bug fixed selecting profiles (self.nProfiles took a wrong value); bug fixed selecting heights by block (it selected profiles instead of heights).
* jroproc_voltage.py: New feature added: decoding data by block using FFT.
* jroIO_heispectra.py: Bug fixed in FitsReader. Using local Fits instance instead of schainpy.model.data.jrodata.Fits.
* jroIO_heispectra.py: Channel index list does not exist.

## 2.1.2

* jroutils_ftp.py: Bug fixed, any error sending a file stopped the server thread. The server thread now opens and closes the remote server each time the file list is sent.
* jroplot_spectra.py: Noise path was not being created when noise data is saved.
* jroIO_base.py: startTime can be greater than endTime. Example: SpreadF [18:00 - 07:00]
@@ -1,102 +1,104
# Signal Chain

Signal Chain is a radar data processing library which includes modules to read
and write different file formats, as well as modules to process and visualize
the data.

## Dependencies

- GCC (gcc or gfortran)
- Python.h (python-dev or python-devel)
- Python-TK (python-tk)
- HDF5 libraries (libhdf5-dev)

## Installation

To get started, the easiest way to install it is through
[PyPI](https://pypi.org/project/schainpy/) with pip. We strongly recommend
using a virtual environment such as virtualenv or anaconda:

```bash
pip install schainpy
```

### From source

First, ensure that you have the above-listed dependencies installed, then clone
the repository and install it as a normal Python package:

```bash
git clone https://github.com/JRO-Peru/schainpy.git
cd schainpy
git checkout <branch-name>  # optional
sudo pip install ./
```

### Using Docker

Download the Dockerfile from the repository and create a docker image:

```bash
docker build -t schain .
```

You can run a container using an xml file or a schain script; you also need to
mount a volume for the data input and for the output files/plots:

```bash
docker run -it --rm --volume /path/to/host/data:/data schain xml /data/test.xml
docker run -it --rm --volume /path/to/host/data:/data --entrypoint /usr/local/bin/python schain /data/test.py
```

## CLI (command line interface)

Signal Chain provides the following commands:

- schainGUI: Open the GUI
- schain: Signal chain command line

## Example

Here you can find a script to read Spectra data (.pdata), remove DC and plot
self-spectra & RTI:

```python
#!/usr/bin/python

from schainpy.controller import Project

prj = Project()

read_unit = prj.addReadUnit(
    datatype='Spectra',
    path='/path/to/pdata/',
    startDate='2014/01/31',
    endDate='2014/03/31',
    startTime='00:00:00',
    endTime='23:59:59',
    online=0,
    walk=0
    )

proc_unit = prj.addProcUnit(
    datatype='Spectra',
    inputId=read_unit.getId()
    )

op = proc_unit.addOperation(name='selectChannels')
op.addParameter(name='channelList', value='0,1')

op = proc_unit.addOperation(name='selectHeights')
op.addParameter(name='minHei', value='80')
op.addParameter(name='maxHei', value='200')

op = proc_unit.addOperation(name='removeDC')

op = proc_unit.addOperation(name='SpectraPlot')
op.addParameter(name='wintitle', value='Spectra', format='str')

op = proc_unit.addOperation(name='RTIPlot')
op.addParameter(name='wintitle', value='RTI', format='str')

prj.start()
```
@@ -1,8 +1,8
"""Signal chain python package"""

try:
    from .controller import Project
except ImportError:
    pass

__version__ = '3.0.0b4'
@@ -1,620 +1,626
import os
import time
import datetime

import numpy
import h5py

import schainpy.admin
from schainpy.model.data.jrodata import *
from schainpy.model.proc.jroproc_base import ProcessingUnit, Operation, MPDecorator
from schainpy.model.io.jroIO_base import *
from schainpy.utils import log


class HDFReader(Reader, ProcessingUnit):
    """Processing unit to read HDF5 format files

    This unit reads HDF5 files created with the `HDFWriter` operation. These
    files contain by default two groups, Data and Metadata, and all variables
    are loaded as `dataOut` attributes.
    It is possible to read any HDF5 file by giving its structure in the
    `description` parameter; you can also add extra values to the metadata
    with the `extras` parameter.

    Parameters:
    -----------
    path : str
        Path where files are located.
    startDate : date
        Start date of the files
    endDate : date
        End date of the files
    startTime : time
        Start time of the files
    endTime : time
        End time of the files
    description : dict, optional
        Dictionary with the description of the HDF5 file
    extras : dict, optional
        Dictionary with extra metadata to be added to `dataOut`

    Examples
    --------

    desc = {
        'Data': {
            'data_output': ['u', 'v', 'w'],
            'utctime': 'timestamps',
        },
        'Metadata': {
            'heightList': 'heights'
        }
    }

    desc = {
        'Data': {
            'data_output': 'winds',
            'utctime': 'timestamps'
        },
        'Metadata': {
            'heightList': 'heights'
        }
    }

    extras = {
        'timeZone': 300
    }

    reader = project.addReadUnit(
        name='HDFReader',
        path='/path/to/files',
        startDate='2019/01/01',
        endDate='2019/01/31',
        startTime='00:00:00',
        endTime='23:59:59',
        # description=json.dumps(desc),
        # extras=json.dumps(extras),
        )

    """

    __attrs__ = ['path', 'startDate', 'endDate', 'startTime', 'endTime', 'description', 'extras']

    def __init__(self):
        ProcessingUnit.__init__(self)
        self.dataOut = Parameters()
        self.ext = ".hdf5"
        self.optchar = "D"
        self.meta = {}
        self.data = {}
        self.open_file = h5py.File
        self.open_mode = 'r'
        self.description = {}
        self.extras = {}
        self.filefmt = "*%Y%j***"
        self.folderfmt = "*%Y%j"

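The `filefmt` and `folderfmt` attributes above mix strftime fields (`%Y%j`) with glob wildcards; for a given date they expand to the patterns used when scanning `path`. The actual matching lives in `searchFilesOffLine`/`searchFilesOnLine` (jroIO_base), so the following is only a sketch of how such a pattern can be expanded, with a made-up file name:

```python
import datetime
import fnmatch

filefmt = "*%Y%j***"
folderfmt = "*%Y%j"

day = datetime.date(2019, 1, 31)

# strftime fills in the date fields and leaves the '*' wildcards for fnmatch
file_glob = day.strftime(filefmt)      # '*2019031***'
folder_glob = day.strftime(folderfmt)  # '*2019031'

# hypothetical folder/file names matching the expanded patterns
print(fnmatch.fnmatch('d2019031', folder_glob))        # True
print(fnmatch.fnmatch('D2019031001.hdf5', file_glob))  # True
```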
    def setup(self, **kwargs):

        self.set_kwargs(**kwargs)
        if not self.ext.startswith('.'):
            self.ext = '.{}'.format(self.ext)

        if self.online:
            log.log("Searching files in online mode...", self.name)

            for nTries in range(self.nTries):
                fullpath = self.searchFilesOnLine(self.path, self.startDate,
                    self.endDate, self.expLabel, self.ext, self.walk,
                    self.filefmt, self.folderfmt)
                try:
                    fullpath = next(fullpath)
                except:
                    fullpath = None

                if fullpath:
                    break

                log.warning(
                    'Waiting {} sec for a valid file in {}: try {} ...'.format(
                        self.delay, self.path, nTries + 1),
                    self.name)
                time.sleep(self.delay)

            if not(fullpath):
                raise schainpy.admin.SchainError(
                    'There isn\'t any valid file in {}'.format(self.path))

            pathname, filename = os.path.split(fullpath)
            self.year = int(filename[1:5])
            self.doy = int(filename[5:8])
            self.set = int(filename[8:11]) - 1
        else:
            log.log("Searching files in {}".format(self.path), self.name)
            self.filenameList = self.searchFilesOffLine(self.path, self.startDate,
                self.endDate, self.expLabel, self.ext, self.walk, self.filefmt, self.folderfmt)

        self.setNextFile()

        return

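In the online branch above, year, day-of-year and set number are sliced out of the file name at fixed positions (characters 1:5, 5:8 and 8:11, after the `D` option character). A minimal standalone sketch of that slicing, using a hypothetical file name:

```python
import datetime

def parse_schain_filename(filename):
    """Split a 'D<year><doy><set>' style name the way HDFReader.setup does."""
    year = int(filename[1:5])        # characters 1-4: 4-digit year
    doy = int(filename[5:8])         # characters 5-7: day of year
    setn = int(filename[8:11]) - 1   # characters 8-10: set number (reader starts one before)
    date = datetime.date(year, 1, 1) + datetime.timedelta(days=doy - 1)
    return year, doy, setn, date

# hypothetical file recorded on day 031 of 2019 (Jan 31), set 001
print(parse_schain_filename('D2019031001.hdf5'))
# (2019, 31, 0, datetime.date(2019, 1, 31))
```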
    def readFirstHeader(self):
        '''Read metadata and data'''

        self.__readMetadata()
        self.__readData()
        self.__setBlockList()

        if 'type' in self.meta:
            self.dataOut = eval(self.meta['type'])()

        for attr in self.meta:
            setattr(self.dataOut, attr, self.meta[attr])

        self.blockIndex = 0

        return

    def __setBlockList(self):
        '''
        Selects the data within the times defined

        self.fp
        self.startTime
        self.endTime
        self.blockList
        self.blocksPerFile

        '''

        startTime = self.startTime
        endTime = self.endTime

        thisUtcTime = self.data['utctime']
        self.interval = numpy.min(thisUtcTime[1:] - thisUtcTime[:-1])

        thisDatetime = datetime.datetime.utcfromtimestamp(thisUtcTime[0])

        thisDate = thisDatetime.date()
        thisTime = thisDatetime.time()

        startUtcTime = (datetime.datetime.combine(thisDate, startTime) - datetime.datetime(1970, 1, 1)).total_seconds()
        endUtcTime = (datetime.datetime.combine(thisDate, endTime) - datetime.datetime(1970, 1, 1)).total_seconds()

        ind = numpy.where(numpy.logical_and(thisUtcTime >= startUtcTime, thisUtcTime < endUtcTime))[0]

        self.blockList = ind
        self.blocksPerFile = len(ind)
        return

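`__setBlockList` keeps only the blocks whose UTC timestamps fall inside the configured daily time window. The core computation can be sketched with plain numpy; the timestamps and window below are made up for illustration:

```python
import datetime
import numpy

# hypothetical block timestamps: one block every 60 s from 2019-01-31 23:00 UTC
t0 = (datetime.datetime(2019, 1, 31, 23, 0) - datetime.datetime(1970, 1, 1)).total_seconds()
utctime = t0 + 60 * numpy.arange(10)

this_date = datetime.datetime.utcfromtimestamp(utctime[0]).date()

# window 23:03:00 - 23:07:00 on the file's date, converted to epoch seconds
start = (datetime.datetime.combine(this_date, datetime.time(23, 3)) -
         datetime.datetime(1970, 1, 1)).total_seconds()
end = (datetime.datetime.combine(this_date, datetime.time(23, 7)) -
       datetime.datetime(1970, 1, 1)).total_seconds()

interval = numpy.min(utctime[1:] - utctime[:-1])           # smallest block spacing
block_list = numpy.where((utctime >= start) & (utctime < end))[0]

print(interval)    # 60.0
print(block_list)  # blocks at 23:03, 23:04, 23:05, 23:06 -> [3 4 5 6]
```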
    def __readMetadata(self):
        '''
        Reads Metadata
        '''

        meta = {}

        if self.description:
            for key, value in self.description['Metadata'].items():
                meta[key] = self.fp[value].value
        else:
            grp = self.fp['Metadata']
            for name in grp:
                meta[name] = grp[name].value

        if self.extras:
            for key, value in self.extras.items():
                meta[key] = value
        self.meta = meta

        return

    def __readData(self):

        data = {}

        if self.description:
            for key, value in self.description['Data'].items():
                if isinstance(value, str):
                    if isinstance(self.fp[value], h5py.Dataset):
                        data[key] = self.fp[value].value
                    elif isinstance(self.fp[value], h5py.Group):
                        array = []
                        for ch in self.fp[value]:
                            array.append(self.fp[value][ch].value)
                        data[key] = numpy.array(array)
                elif isinstance(value, list):
                    array = []
                    for ch in value:
                        array.append(self.fp[ch].value)
                    data[key] = numpy.array(array)
        else:
            grp = self.fp['Data']
            for name in grp:
                if isinstance(grp[name], h5py.Dataset):
                    array = grp[name].value
                elif isinstance(grp[name], h5py.Group):
                    array = []
                    for ch in grp[name]:
                        array.append(grp[name][ch].value)
                    array = numpy.array(array)
                else:
                    log.warning('Unknown type: {}'.format(name))

                if name in self.description:
                    key = self.description[name]
                else:
                    key = name
                data[key] = array

        self.data = data
        return

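When a `Data` entry is an HDF5 group rather than a single dataset (e.g. one dataset per channel), `__readData` stacks the member datasets into one numpy array. The same stacking can be sketched with a plain dict standing in for the h5py group; the names and values here are made up:

```python
import numpy

# stand-in for an h5py group that holds one 1-D dataset per channel
group = {
    'channel00': numpy.array([1.0, 2.0, 3.0]),
    'channel01': numpy.array([4.0, 5.0, 6.0]),
}

# stack channel datasets into a single channels-by-blocks array,
# mirroring what __readData does for h5py.Group entries
array = numpy.array([group[ch] for ch in sorted(group)])

print(array.shape)  # (2, 3)
```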
    def getData(self):

        for attr in self.data:
            if self.data[attr].ndim == 1:
                setattr(self.dataOut, attr, self.data[attr][self.blockIndex])
            else:
                setattr(self.dataOut, attr, self.data[attr][:, self.blockIndex])

        self.dataOut.flagNoData = False
        self.blockIndex += 1

        log.log("Block No. {}/{} -> {}".format(
            self.blockIndex,
            self.blocksPerFile,
            self.dataOut.datatime.ctime()), self.name)

        return

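`getData` publishes block number `blockIndex` of every stored array: 1-D arrays are indexed directly, while 2-D arrays keep their leading (e.g. channel) axis and are sliced along the block axis. A small numpy sketch of that rule, with illustrative shapes and values:

```python
import numpy

data = {
    'utctime': numpy.array([100.0, 160.0, 220.0]),   # one value per block
    'data_output': numpy.arange(6.0).reshape(2, 3),  # channels x blocks
}

block_index = 1
current = {}
for attr, values in data.items():
    if values.ndim == 1:
        current[attr] = values[block_index]     # scalar for this block
    else:
        current[attr] = values[:, block_index]  # all channels, this block

print(current['utctime'])      # 160.0
print(current['data_output'])  # [1. 4.]
```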
    def run(self, **kwargs):

        if not(self.isConfig):
            self.setup(**kwargs)
            self.isConfig = True

        if self.blockIndex == self.blocksPerFile:
            self.setNextFile()

        self.getData()

        return

@MPDecorator
class HDFWriter(Operation):
    """Operation to write HDF5 files.

    By default the HDF5 file contains two groups, Data and Metadata, where
    you can save any `dataOut` attribute specified by the `dataList` and
    `metadataList` parameters; data attributes are normally time dependent
    whereas metadata attributes are not.
    It is possible to customize the structure of the HDF5 file with the
    optional `description` parameter, see the examples.

    Parameters
    ----------
    path : str
        Path where files will be saved.
    blocksPerFile : int
        Number of blocks per file.
    metadataList : list
        List of the dataOut attributes that will be saved as metadata.
    dataList : list
        List of the dataOut attributes that will be saved as data.
    setType : bool
        If True, the file names correspond to the timestamp of the data.
    description : dict, optional
        Dictionary with the desired description of the HDF5 file.

    Examples
    --------

    desc = {
        'data_output': {'winds': ['z', 'w', 'v']},
        'utctime': 'timestamps',
        'heightList': 'heights'
    }
    desc = {
        'data_output': ['z', 'w', 'v'],
        'utctime': 'timestamps',
        'heightList': 'heights'
    }
    desc = {
        'Data': {
            'data_output': 'winds',
            'utctime': 'timestamps'
        },
        'Metadata': {
            'heightList': 'heights'
        }
    }

    writer = proc_unit.addOperation(name='HDFWriter')
    writer.addParameter(name='path', value='/path/to/file')
    writer.addParameter(name='blocksPerFile', value='32')
    writer.addParameter(name='metadataList', value='heightList,timeZone')
    writer.addParameter(name='dataList', value='data_output,utctime')
    # writer.addParameter(name='description', value=json.dumps(desc))

    """

    ext = ".hdf5"
    optchar = "D"
    filename = None
    path = None
    setFile = None
    fp = None
    firsttime = True
    # Configurations
    blocksPerFile = None
    blockIndex = None
    dataOut = None
    # Data Arrays
    dataList = None
    metadataList = None
    currentDay = None
    lastTime = None

    def __init__(self):

        Operation.__init__(self)
        return

    def setup(self, path=None, blocksPerFile=10, metadataList=None, dataList=None, setType=None, description=None):
        self.path = path
        self.blocksPerFile = blocksPerFile
        self.metadataList = metadataList
-        self.dataList = dataList
+        self.dataList = [s.strip() for s in dataList]
        self.setType = setType
        self.description = description
-        for s in ['type', 'timeZone', 'useLocalTime']:
-            if s not in self.metadataList:
-                self.metadataList.append(s)
+        if self.metadataList is None:
+            self.metadataList = self.dataOut.metadata_list

        tableList = []
        dsList = []

        for i in range(len(self.dataList)):
            dsDict = {}
+            if hasattr(self.dataOut, self.dataList[i]):
                dataAux = getattr(self.dataOut, self.dataList[i])
                dsDict['variable'] = self.dataList[i]
+            else:
+                log.warning('Attribute {} not found in dataOut'.format(self.dataList[i]), self.name)
+                continue

            if dataAux is None:
                continue
            elif isinstance(dataAux, (int, float, numpy.integer, numpy.float)):
                dsDict['nDim'] = 0
            else:
                dsDict['nDim'] = len(dataAux.shape)
                dsDict['shape'] = dataAux.shape
                dsDict['dsNumber'] = dataAux.shape[0]
                dsDict['dtype'] = dataAux.dtype

            dsList.append(dsDict)

        self.dsList = dsList
        self.currentDay = self.dataOut.datatime.date()

    def timeFlag(self):
        currentTime = self.dataOut.utctime
        timeTuple = time.localtime(currentTime)
        dataDay = timeTuple.tm_yday

        if self.lastTime is None:
            self.lastTime = currentTime
            self.currentDay = dataDay
            return False

        timeDiff = currentTime - self.lastTime

        # If the day changed, or the gap between consecutive samples exceeds the limit
        if dataDay != self.currentDay:
            self.currentDay = dataDay
            return True
        elif timeDiff > 3*60*60:
            self.lastTime = currentTime
            return True
        else:
            self.lastTime = currentTime
            return False

-    def run(self, dataOut, path, blocksPerFile=10, metadataList=[],
+    def run(self, dataOut, path, blocksPerFile=10, metadataList=None,
            dataList=[], setType=None, description={}):

        self.dataOut = dataOut
        if not(self.isConfig):
            self.setup(path=path, blocksPerFile=blocksPerFile,
                       metadataList=metadataList, dataList=dataList,
                       setType=setType, description=description)

            self.isConfig = True
            self.setNextFile()

        self.putData()
        return

    def setNextFile(self):

        ext = self.ext
        path = self.path
        setFile = self.setFile

        timeTuple = time.localtime(self.dataOut.utctime)
        subfolder = 'd%4.4d%3.3d' % (timeTuple.tm_year, timeTuple.tm_yday)
        fullpath = os.path.join(path, subfolder)

        if os.path.exists(fullpath):
            filesList = os.listdir(fullpath)
            filesList = [k for k in filesList if k.startswith(self.optchar)]
            if len(filesList) > 0:
                filesList = sorted(filesList, key=str.lower)
                filen = filesList[-1]
                # The filename must have the following format:
                # 0 1234 567 89A BCDE (hex)
                # x YYYY DDD SSS .ext
                if isNumber(filen[8:11]):
                    setFile = int(filen[8:11])  # initialize the set counter to the last file's set number
                else:
                    setFile = -1
            else:
                setFile = -1  # initialize the set counter
        else:
            os.makedirs(fullpath)
            setFile = -1  # initialize the set counter

        if self.setType is None:
            setFile += 1
            file = '%s%4.4d%3.3d%03d%s' % (self.optchar,
                                           timeTuple.tm_year,
                                           timeTuple.tm_yday,
                                           setFile,
                                           ext)
        else:
            setFile = timeTuple.tm_hour*60 + timeTuple.tm_min
            file = '%s%4.4d%3.3d%04d%s' % (self.optchar,
                                           timeTuple.tm_year,
                                           timeTuple.tm_yday,
                                           setFile,
                                           ext)

        self.filename = os.path.join(path, subfolder, file)

        # Set up the HDF5 file
        self.fp = h5py.File(self.filename, 'w')
        # Write metadata
        self.writeMetadata(self.fp)
        # Write data
        self.writeData(self.fp)

    def getLabel(self, name, x=None):

        if x is None:
            if 'Data' in self.description:
                data = self.description['Data']
                if 'Metadata' in self.description:
                    data.update(self.description['Metadata'])
            else:
                data = self.description
            if name in data:
                if isinstance(data[name], str):
                    return data[name]
                elif isinstance(data[name], list):
                    return None
                elif isinstance(data[name], dict):
                    for key, value in data[name].items():
                        return key
            return name
        else:
            if 'Metadata' in self.description:
                meta = self.description['Metadata']
            else:
                meta = self.description
            if name in meta:
                if isinstance(meta[name], list):
                    return meta[name][x]
                elif isinstance(meta[name], dict):
                    for key, value in meta[name].items():
                        return value[x]
-            return 'channel{:02d}'.format(x)
+            if 'cspc' in name:
+                return 'pair{:02d}'.format(x)
+            else:
+                return 'channel{:02d}'.format(x)

    def writeMetadata(self, fp):

        if self.description:
            if 'Metadata' in self.description:
                grp = fp.create_group('Metadata')
            else:
                grp = fp
        else:
            grp = fp.create_group('Metadata')

        for i in range(len(self.metadataList)):
            if not hasattr(self.dataOut, self.metadataList[i]):
                log.warning('Metadata: `{}` not found'.format(self.metadataList[i]), self.name)
                continue
            value = getattr(self.dataOut, self.metadataList[i])
            if isinstance(value, bool):
                if value is True:
                    value = 1
                else:
                    value = 0
            grp.create_dataset(self.getLabel(self.metadataList[i]), data=value)
        return

    def writeData(self, fp):

        if self.description:
            if 'Data' in self.description:
                grp = fp.create_group('Data')
            else:
                grp = fp
        else:
            grp = fp.create_group('Data')

        dtsets = []
        data = []

        for dsInfo in self.dsList:
            if dsInfo['nDim'] == 0:
                ds = grp.create_dataset(
                    self.getLabel(dsInfo['variable']),
                    (self.blocksPerFile, ),
                    chunks=True,
                    dtype=numpy.float64)
                dtsets.append(ds)
                data.append((dsInfo['variable'], -1))
            else:
                label = self.getLabel(dsInfo['variable'])
                if label is not None:
                    sgrp = grp.create_group(label)
                else:
                    sgrp = grp
                for i in range(dsInfo['dsNumber']):
                    ds = sgrp.create_dataset(
                        self.getLabel(dsInfo['variable'], i),
                        (self.blocksPerFile, ) + dsInfo['shape'][1:],
                        chunks=True,
                        dtype=dsInfo['dtype'])
                    dtsets.append(ds)
                    data.append((dsInfo['variable'], i))
        fp.flush()

        log.log('Creating file: {}'.format(fp.filename), self.name)

        self.ds = dtsets
        self.data = data
        self.firsttime = True
        self.blockIndex = 0
        return

    def putData(self):

        if (self.blockIndex == self.blocksPerFile) or self.timeFlag():
            self.closeFile()
            self.setNextFile()

        for i, ds in enumerate(self.ds):
            attr, ch = self.data[i]
            if ch == -1:
                ds[self.blockIndex] = getattr(self.dataOut, attr)
            else:
                ds[self.blockIndex] = getattr(self.dataOut, attr)[ch]

        self.fp.flush()
        self.blockIndex += 1
        log.log('Block No. {}/{}'.format(self.blockIndex, self.blocksPerFile), self.name)

        return

    def closeFile(self):

        if self.blockIndex != self.blocksPerFile:
            for ds in self.ds:
                ds.resize(self.blockIndex, axis=0)

        self.fp.flush()
        self.fp.close()

    def close(self):

        self.closeFile()
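The `description`-to-label mapping that `getLabel` performs can be sketched in isolation. The snippet below is a simplified, hypothetical re-implementation for illustration only (it omits the `'cspc'` pair-label branch and the Data/Metadata group merging details), using the first `desc` example from the docstring:

```python
def get_label(description, name, x=None):
    """Map a dataOut attribute name to its HDF5 dataset/group label (sketch)."""
    if x is None:
        # Resolve the group/dataset label for the attribute itself
        data = dict(description.get('Data', description))
        data.update(description.get('Metadata', {}))
        entry = data.get(name)
        if isinstance(entry, str):    # plain rename, e.g. 'utctime' -> 'timestamps'
            return entry
        if isinstance(entry, list):   # per-channel labels only -> no group label
            return None
        if isinstance(entry, dict):   # {group_label: [channel labels]}
            return next(iter(entry))
        return name                   # fall back to the attribute name
    # Resolve the label for channel x of a multi-dimensional attribute
    meta = description.get('Metadata', description)
    entry = meta.get(name)
    if isinstance(entry, list):
        return entry[x]
    if isinstance(entry, dict):
        return next(iter(entry.values()))[x]
    return 'channel{:02d}'.format(x)  # default per-channel naming

desc = {
    'data_output': {'winds': ['z', 'w', 'v']},
    'utctime': 'timestamps',
    'heightList': 'heights'
}
print(get_label(desc, 'data_output'))     # group label: winds
print(get_label(desc, 'data_output', 1))  # channel label: w
print(get_label(desc, 'utctime'))         # timestamps
print(get_label(desc, 'other', 0))        # default: channel00
```

So with that `desc`, the writer would create a `winds` group holding `z`, `w`, `v` datasets, plus `timestamps` and `heights` datasets, instead of the raw attribute names.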
@@ -1,86 +1,86 @@
# Copyright (c) 2012-2020 Jicamarca Radio Observatory
# All rights reserved.
#
# Distributed under the terms of the BSD 3-clause license.
"""schainpy is an open source library to read, write and process radar data

Signal Chain is a radar data processing library which includes modules to read
and write different file formats, as well as modules to process and visualize
the data.
"""

import os
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as _build_ext
from schainpy import __version__

DOCLINES = __doc__.split("\n")

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Prevent numpy from thinking it is still in its setup process:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs.append(numpy.get_include())

setup(
    name = "schainpy",
    version = __version__,
    description = DOCLINES[0],
    long_description = "\n".join(DOCLINES[2:]),
-    url = "https://github.com/JRO-Peru/schain",
+    url = "https://github.com/JRO-Peru/schainpy",
    author = "Jicamarca Radio Observatory",
-    author_email = "jro-developers@igp.gob.pe",
+    author_email = "jro-developers@jro.igp.gob.pe",
    license = "BSD-3-Clause",
    classifiers = [
        "Development Status :: 4 - Beta",
        "Environment :: Console",
        "Intended Audience :: Science/Research",
        "License :: OSI Approved :: BSD License",
        "Operating System :: MacOS :: MacOS X",
        "Operating System :: POSIX :: Linux",
        "Programming Language :: Python :: 2",
        "Programming Language :: Python :: 2.7",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.5",
        "Programming Language :: Python :: 3.6",
        "Programming Language :: Python :: 3.7",
        "Topic :: Scientific/Engineering",
    ],
    packages = {
        'schainpy',
        'schainpy.model',
        'schainpy.model.data',
        'schainpy.model.graphics',
        'schainpy.model.io',
        'schainpy.model.proc',
        'schainpy.model.utils',
        'schainpy.utils',
        'schainpy.gui',
        'schainpy.cli',
    },
    package_data = {'': ['schain.conf.template'],
                    'schainpy.files': ['*.oga']
                    },
    include_package_data = False,
    scripts = ['schainpy/gui/schainGUI'],
    entry_points = {
        'console_scripts': [
            'schain = schainpy.cli.cli:main',
        ],
    },
    cmdclass = {'build_ext': build_ext},
    ext_modules = [
        Extension("schainpy.model.data._noise", ["schainc/_noise.c"]),
    ],
    setup_requires = ["numpy"],
    install_requires = [
        "scipy",
        "h5py",
        "matplotlib",
        "pyzmq",
        "fuzzywuzzy",
        "click",
    ],
)