Fix ScpecraWriter and CohInt attribute
Juan C. Espinoza -
r1310:ce24778ee4ae
@@ -1,121 +1,121 @@
1 ## CHANGELOG:
1 # CHANGELOG:
2 2
3 ### 3.0
3 ## 3.0
4 4 * Python 3.x & 2.X compatible
5 5 * New architecture with multiprocessing support
6 6 * Add @MPDecorator for multiprocessing Operations (Plots, Writers and Publishers)
7 7 * Added new type of operation `external` for non-locking operations
8 8 * New plotting architecture with buffering/throttle capabilities to speed up plots
9 9 * Clean controller to optimize scripts (format & optype are no longer required)
10 10 * New GUI with dynamic loading of Units and Operations (uses Kivy framework)
11 11
12 ### 2.3
12 ## 2.3
13 13 * Added support for Madrigal formats (reading/writing).
14 14 * Added support for reading BLTR parameters (*.sswma).
15 15 * Added support for reading Julia format (*.dat).
16 16 * Added high order function `MPProject` for multiprocessing scripts.
17 17 * Added two new Processing Units `PublishData` and `ReceiverData` for receiving and sending dataOut through multiple ways (tcp, ipc, inproc).
18 18 * Added a new graphics Processing Unit `PlotterReceiver`. It is decoupled from normal processing sequence with support for data generated by multiprocessing scripts.
19 19 * Added support for sending realtime graphic to web server.
20 20 * GUI command `schain` is now `schainGUI`.
21 21 * Added a CLI tool named `schain`.
22 22 * Scripts templates can be now generated with `schain generate`.
23 23 * Now it is possible to search Processing Units and Operations with `schain search [module]` to get the right name and its allowed parameters.
24 24 * `schain xml` to run xml scripts.
25 25 * Added suggestions when parameters are poorly written.
26 26 * `Controller.start()` now runs in a different process than the process calling it.
27 27 * Added `schainpy.utils.log` for log standardization.
28 28 * Running script on online mode no longer ignores date and hour. Issue #1109.
29 29 * Added support for receiving voltage data directly from JARS (tcp, ipc).
30 30 * Updated README for MAC OS GUI installation.
31 31 * Setup now installs numpy.
32 32
33 ### 2.2.6
33 ## 2.2.6
34 34 * Graphics generated by the GUI are now the same as generated by scripts. Issue #1074.
35 35 * Added support for C extensions.
36 36 * Function `hildebrand_sehkon` optimized with a C wrapper.
37 37 * Numpy version updated.
38 38 * Migration to GIT.
39 39
40 ### 2.2.5:
40 ## 2.2.5:
41 41 * splitProfiles and combineProfiles modules were added to VoltageProc and Signal Chain GUI.
42 42 * nProfiles of USRP data (hdf5) is the number of profiles there are in one second.
43 43 * jroPlotter works directly with data objects instead of dictionaries
44 44 * script "schain" was added to Signal Chain installer
45 45
46 ### 2.2.4.1:
46 ## 2.2.4.1:
47 47 * jroIO_usrp.py is updated to read Sandra's data
48 48 * decimation in Spectra and RTI plots is always enabled.
49 49 * time window option added to GUI
50 50
51 ### 2.2.4:
51 ## 2.2.4:
52 52 * jroproc_spectra_lags.py added to schainpy
53 53 * Bug fixed in schainGUI: ProcUnit was created with the same id in some cases.
54 54 * Bug fixed in jroHeaderIO: Header size validation.
55 55
56 ### 2.2.3.1:
56 ## 2.2.3.1:
57 57 * Filtering block by time has been added.
58 58 * Bug fixed plotting RTI, CoherenceMap and others using xmin and xmax parameters. The first day worked
59 59 properly but the next days did not.
60 60
61 ### 2.2.3:
61 ## 2.2.3:
62 62 * Bug fixed in GUI: Error getting(reading) Code value
63 63 * Bug fixed in GUI: Flip option always needs channelList field
64 64 * Bug fixed in jrodata: when one branch modified a value in "dataOut" (example: dataOut.code), the value
65 65 was modified for every branch (because it was a reference). Fixed in data.copy().
66 66 * Bug fixed in jroproc_voltage.profileSelector(): rangeList replaces profileRangeList.
67 67
68 ### 2.2.2:
68 ## 2.2.2:
69 69 * VoltageProc: ProfileSelector, Reshape, and Decoder with nTxs!=1 and getblock=True were tested
70 70 * Rawdata and testRawdata.py added to Signal Chain project
71 71
72 ### 2.2.1:
72 ## 2.2.1:
73 73 * Bugs fixed in GUI
74 74 * Views were improved in GUI
75 75 * Support for MST-ISR experiments
76 76 * Bug fixed getting noise using Hildebrand-Sekhon (minimum number of points > 20%)
77 77 * handleError added to jroplotter.py
78 78
79 ### 2.2.0:
79 ## 2.2.0:
80 80 * GUI: use of external plotter
81 81 * Compatible with matplotlib 1.5.0
82 82
83 ### 2.1.5:
83 ## 2.1.5:
84 84 * serializer module added to Signal Chain
85 85 * jroplotter.py added to Signal Chain
86 86
87 ### 2.1.4.2:
87 ## 2.1.4.2:
88 88 * A new Plotter Class was added
89 89 * Project.start() does not accept filename as a parameter anymore
90 90
91 ### 2.1.4.1:
91 ## 2.1.4.1:
92 92 * Send notifications when an error different to ValueError is detected
93 93
94 ### 2.1.4:
94 ## 2.1.4:
95 95 * Sending error notifications to signal chain administrator
96 96 * Login to email server added
97 97
98 ### 2.1.3.3:
98 ## 2.1.3.3:
99 99 * Colored Button Icons were added to GUI
100 100
101 ### 2.1.3.2:
101 ## 2.1.3.2:
102 102 * GUI: user interaction enhanced
103 103 * controller_api.py: Safe access to ControllerThead
104 104
105 ### 2.1.3.1:
105 ## 2.1.3.1:
106 106 * GUI: every icon was resized
107 107 * jroproc_voltage.py: Print a message when the "Read from code" option is selected and the code is not defined inside the data file
108 108
109 ### 2.1.3:
109 ## 2.1.3:
110 110 * jroplot_heispectra.py: SpectraHeisScope was not showing the right channels
111 111 * jroproc_voltage.py: Bug fixed selecting profiles (self.nProfiles took a wrong value),
112 112 Bug fixed selecting heights by block (selecting profiles instead of heights)
113 113 * jroproc_voltage.py: New feature added: decoding data by block using FFT.
114 114 * jroIO_heispectra.py: Bug fixed in FitsReader. Using local Fits instance instead of schainpy.model.data.jrodata.Fits.
115 115 * jroIO_heispectra.py: Channel index list does not exist.
116 116
117 ### 2.1.2:
117 ## 2.1.2:
118 118 * jroutils_ftp.py: Bug fixed: any error sending a file stopped the server thread.
119 119 The server thread now opens and closes the remote server each time the file list is sent.
120 120 * jroplot_spectra.py: Noise path was not being created when noise data was saved.
121 121 * jroIO_base.py: startTime can be greater than endTime. Example: SpreadF [18:00 - 07:00]
\ No newline at end of file
@@ -1,1396 +1,1397 @@
1 1 '''
2 2
3 3 $Author: murco $
4 4 $Id: JROData.py 173 2012-11-20 15:06:21Z murco $
5 5 '''
6 6
7 7 import copy
8 8 import numpy
9 9 import datetime
10 10 import json
11 11
12 12 import schainpy.admin
13 13 from schainpy.utils import log
14 14 from .jroheaderIO import SystemHeader, RadarControllerHeader
15 15 from schainpy.model.data import _noise
16 16
17 17
18 18 def getNumpyDtype(dataTypeCode):
19 19
20 20 if dataTypeCode == 0:
21 21 numpyDtype = numpy.dtype([('real', '<i1'), ('imag', '<i1')])
22 22 elif dataTypeCode == 1:
23 23 numpyDtype = numpy.dtype([('real', '<i2'), ('imag', '<i2')])
24 24 elif dataTypeCode == 2:
25 25 numpyDtype = numpy.dtype([('real', '<i4'), ('imag', '<i4')])
26 26 elif dataTypeCode == 3:
27 27 numpyDtype = numpy.dtype([('real', '<i8'), ('imag', '<i8')])
28 28 elif dataTypeCode == 4:
29 29 numpyDtype = numpy.dtype([('real', '<f4'), ('imag', '<f4')])
30 30 elif dataTypeCode == 5:
31 31 numpyDtype = numpy.dtype([('real', '<f8'), ('imag', '<f8')])
32 32 else:
33 33 raise ValueError('dataTypeCode was not defined')
34 34
35 35 return numpyDtype
36 36
37 37
38 38 def getDataTypeCode(numpyDtype):
39 39
40 40 if numpyDtype == numpy.dtype([('real', '<i1'), ('imag', '<i1')]):
41 41 datatype = 0
42 42 elif numpyDtype == numpy.dtype([('real', '<i2'), ('imag', '<i2')]):
43 43 datatype = 1
44 44 elif numpyDtype == numpy.dtype([('real', '<i4'), ('imag', '<i4')]):
45 45 datatype = 2
46 46 elif numpyDtype == numpy.dtype([('real', '<i8'), ('imag', '<i8')]):
47 47 datatype = 3
48 48 elif numpyDtype == numpy.dtype([('real', '<f4'), ('imag', '<f4')]):
49 49 datatype = 4
50 50 elif numpyDtype == numpy.dtype([('real', '<f8'), ('imag', '<f8')]):
51 51 datatype = 5
52 52 else:
53 53 datatype = None
54 54
55 55 return datatype
56 56
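getNumpyDtype and getDataTypeCode above are inverse mappings between the integer dataTypeCode and a little-endian (real, imag) structured dtype. An equivalent table-driven sketch (the helper names are hypothetical) makes the round-trip explicit:

```python
import numpy

# Same mapping as getNumpyDtype/getDataTypeCode: codes 0-3 are signed
# integers of 1/2/4/8 bytes, codes 4-5 are 4/8-byte floats, all stored
# as little-endian (real, imag) pairs.
_DTYPES = {
    0: numpy.dtype([('real', '<i1'), ('imag', '<i1')]),
    1: numpy.dtype([('real', '<i2'), ('imag', '<i2')]),
    2: numpy.dtype([('real', '<i4'), ('imag', '<i4')]),
    3: numpy.dtype([('real', '<i8'), ('imag', '<i8')]),
    4: numpy.dtype([('real', '<f4'), ('imag', '<f4')]),
    5: numpy.dtype([('real', '<f8'), ('imag', '<f8')]),
}

def numpy_dtype(code):
    # Mirrors getNumpyDtype, including the error on unknown codes.
    try:
        return _DTYPES[code]
    except KeyError:
        raise ValueError('dataTypeCode was not defined')

def datatype_code(dtype):
    # Mirrors getDataTypeCode: returns None for unknown dtypes.
    for code, dt in _DTYPES.items():
        if dtype == dt:
            return code
    return None
```

The two directions round-trip: `datatype_code(numpy_dtype(n)) == n` for every defined code.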
57 57
58 58 def hildebrand_sekhon(data, navg):
59 59 """
60 60 This method is for the objective determination of the noise level in Doppler spectra. This
61 61 implementation technique is based on the fact that the standard deviation of the spectral
62 62 densities is equal to the mean spectral density for white Gaussian noise
63 63
64 64 Inputs:
65 65 data : heights
66 66 navg : number of averages
67 67
68 68 Return:
69 69 mean : noise level
70 70 """
71 71
72 72 sortdata = numpy.sort(data, axis=None)
73 73 '''
74 74 lenOfData = len(sortdata)
75 75 nums_min = lenOfData*0.2
76 76
77 77 if nums_min <= 5:
78 78
79 79 nums_min = 5
80 80
81 81 sump = 0.
82 82 sumq = 0.
83 83
84 84 j = 0
85 85 cont = 1
86 86
87 87 while((cont == 1)and(j < lenOfData)):
88 88
89 89 sump += sortdata[j]
90 90 sumq += sortdata[j]**2
91 91
92 92 if j > nums_min:
93 93 rtest = float(j)/(j-1) + 1.0/navg
94 94 if ((sumq*j) > (rtest*sump**2)):
95 95 j = j - 1
96 96 sump = sump - sortdata[j]
97 97 sumq = sumq - sortdata[j]**2
98 98 cont = 0
99 99
100 100 j += 1
101 101
102 102 lnoise = sump / j
103 103 '''
104 104 return _noise.hildebrand_sekhon(sortdata, navg)
105 105
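The commented-out block above is the pure-Python reference for the algorithm now delegated to the C extension _noise.hildebrand_sekhon. A runnable sketch (the name is hypothetical, and the threshold step here subtracts the just-added point, where the commented code subtracts the previous element) could read:

```python
import numpy

def hildebrand_sekhon_py(data, navg):
    """Pure-Python sketch of the commented-out reference implementation.

    Walks the sorted spectral densities and keeps the largest prefix whose
    spread is still consistent with white noise for navg averages; the mean
    of that prefix is the noise level.
    """
    sortdata = numpy.sort(data, axis=None)
    lenOfData = len(sortdata)
    nums_min = lenOfData * 0.2
    if nums_min <= 5:
        nums_min = 5

    sump = 0.0  # running sum of the accepted densities
    sumq = 0.0  # running sum of their squares
    j = 0
    cont = 1
    while cont == 1 and j < lenOfData:
        sump += sortdata[j]
        sumq += sortdata[j] ** 2
        if j > nums_min:
            rtest = float(j) / (j - 1) + 1.0 / navg
            if sumq * j > rtest * sump ** 2:
                # Threshold exceeded: discard the point just added and stop.
                sump -= sortdata[j]
                sumq -= sortdata[j] ** 2
                cont = 0
                j -= 1
        j += 1
    return sump / j
```

On pure noise the estimate is simply the mean of the data; a strong signal tail in the sorted densities is excluded by the threshold test.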
106 106
107 107 class Beam:
108 108
109 109 def __init__(self):
110 110 self.codeList = []
111 111 self.azimuthList = []
112 112 self.zenithList = []
113 113
114 114
115 115 class GenericData(object):
116 116
117 117 flagNoData = True
118 118
119 119 def copy(self, inputObj=None):
120 120
121 121 if inputObj == None:
122 122 return copy.deepcopy(self)
123 123
124 124 for key in list(inputObj.__dict__.keys()):
125 125
126 126 attribute = inputObj.__dict__[key]
127 127
128 128 # If this attribute is a tuple or list
129 129 if type(inputObj.__dict__[key]) in (tuple, list):
130 130 self.__dict__[key] = attribute[:]
131 131 continue
132 132
133 133 # If this attribute is another object or instance
134 134 if hasattr(attribute, '__dict__'):
135 135 self.__dict__[key] = attribute.copy()
136 136 continue
137 137
138 138 self.__dict__[key] = inputObj.__dict__[key]
139 139
140 140 def deepcopy(self):
141 141
142 142 return copy.deepcopy(self)
143 143
144 144 def isEmpty(self):
145 145
146 146 return self.flagNoData
147 147
148 148 def isReady(self):
149 149
150 150 return not self.flagNoData
151 151
152 152
153 153 class JROData(GenericData):
154 154
155 155 # m_BasicHeader = BasicHeader()
156 156 # m_ProcessingHeader = ProcessingHeader()
157 157
158 158 systemHeaderObj = SystemHeader()
159 159 radarControllerHeaderObj = RadarControllerHeader()
160 160 # data = None
161 161 type = None
162 162 datatype = None # dtype but in string
163 163 # dtype = None
164 164 # nChannels = None
165 165 # nHeights = None
166 166 nProfiles = None
167 167 heightList = None
168 168 channelList = None
169 169 flagDiscontinuousBlock = False
170 170 useLocalTime = False
171 171 utctime = None
172 172 timeZone = None
173 173 dstFlag = None
174 174 errorCount = None
175 175 blocksize = None
176 176 # nCode = None
177 177 # nBaud = None
178 178 # code = None
179 179 flagDecodeData = False # assume the data has not been decoded
180 180 flagDeflipData = False # assume the data has not been deflipped
181 181 flagShiftFFT = False
182 182 # ippSeconds = None
183 183 # timeInterval = None
184 184 nCohInt = None
185 185 # noise = None
186 186 windowOfFilter = 1
187 187 # Speed of light
188 188 C = 3e8
189 189 frequency = 49.92e6
190 190 realtime = False
191 191 beacon_heiIndexList = None
192 192 last_block = None
193 193 blocknow = None
194 194 azimuth = None
195 195 zenith = None
196 196 beam = Beam()
197 197 profileIndex = None
198 198 error = None
199 199 data = None
200 200 nmodes = None
201 201
202 202 def __str__(self):
203 203
204 204 return '{} - {}'.format(self.type, self.getDatatime())
205 205
206 206 def getNoise(self):
207 207
208 208 raise NotImplementedError
209 209
210 210 def getNChannels(self):
211 211
212 212 return len(self.channelList)
213 213
214 214 def getChannelIndexList(self):
215 215
216 216 return list(range(self.nChannels))
217 217
218 218 def getNHeights(self):
219 219
220 220 return len(self.heightList)
221 221
222 222 def getHeiRange(self, extrapoints=0):
223 223
224 224 heis = self.heightList
225 225 # deltah = self.heightList[1] - self.heightList[0]
226 226 #
227 227 # heis.append(self.heightList[-1])
228 228
229 229 return heis
230 230
231 231 def getDeltaH(self):
232 232
233 233 delta = self.heightList[1] - self.heightList[0]
234 234
235 235 return delta
236 236
237 237 def getltctime(self):
238 238
239 239 if self.useLocalTime:
240 240 return self.utctime - self.timeZone * 60
241 241
242 242 return self.utctime
243 243
244 244 def getDatatime(self):
245 245
246 246 datatimeValue = datetime.datetime.utcfromtimestamp(self.ltctime)
247 247 return datatimeValue
248 248
249 249 def getTimeRange(self):
250 250
251 251 datatime = []
252 252
253 253 datatime.append(self.ltctime)
254 254 datatime.append(self.ltctime + self.timeInterval + 1)
255 255
256 256 datatime = numpy.array(datatime)
257 257
258 258 return datatime
259 259
260 260 def getFmaxTimeResponse(self):
261 261
262 262 period = (10**-6) * self.getDeltaH() / (0.15)
263 263
264 264 PRF = 1. / (period * self.nCohInt)
265 265
266 266 fmax = PRF
267 267
268 268 return fmax
269 269
270 270 def getFmax(self):
271 271 PRF = 1. / (self.ippSeconds * self.nCohInt)
272 272
273 273 fmax = PRF
274 274 return fmax
275 275
276 276 def getVmax(self):
277 277
278 278 _lambda = self.C / self.frequency
279 279
280 280 vmax = self.getFmax() * _lambda / 2
281 281
282 282 return vmax
283 283
284 284 def get_ippSeconds(self):
285 285 '''
286 286 '''
287 287 return self.radarControllerHeaderObj.ippSeconds
288 288
289 289 def set_ippSeconds(self, ippSeconds):
290 290 '''
291 291 '''
292 292
293 293 self.radarControllerHeaderObj.ippSeconds = ippSeconds
294 294
295 295 return
296 296
297 297 def get_dtype(self):
298 298 '''
299 299 '''
300 300 return getNumpyDtype(self.datatype)
301 301
302 302 def set_dtype(self, numpyDtype):
303 303 '''
304 304 '''
305 305
306 306 self.datatype = getDataTypeCode(numpyDtype)
307 307
308 308 def get_code(self):
309 309 '''
310 310 '''
311 311 return self.radarControllerHeaderObj.code
312 312
313 313 def set_code(self, code):
314 314 '''
315 315 '''
316 316 self.radarControllerHeaderObj.code = code
317 317
318 318 return
319 319
320 320 def get_ncode(self):
321 321 '''
322 322 '''
323 323 return self.radarControllerHeaderObj.nCode
324 324
325 325 def set_ncode(self, nCode):
326 326 '''
327 327 '''
328 328 self.radarControllerHeaderObj.nCode = nCode
329 329
330 330 return
331 331
332 332 def get_nbaud(self):
333 333 '''
334 334 '''
335 335 return self.radarControllerHeaderObj.nBaud
336 336
337 337 def set_nbaud(self, nBaud):
338 338 '''
339 339 '''
340 340 self.radarControllerHeaderObj.nBaud = nBaud
341 341
342 342 return
343 343
344 344 nChannels = property(getNChannels, "I'm the 'nChannels' property.")
345 345 channelIndexList = property(
346 346 getChannelIndexList, "I'm the 'channelIndexList' property.")
347 347 nHeights = property(getNHeights, "I'm the 'nHeights' property.")
348 348 #noise = property(getNoise, "I'm the 'nHeights' property.")
349 349 datatime = property(getDatatime, "I'm the 'datatime' property")
350 350 ltctime = property(getltctime, "I'm the 'ltctime' property")
351 351 ippSeconds = property(get_ippSeconds, set_ippSeconds)
352 352 dtype = property(get_dtype, set_dtype)
353 353 # timeInterval = property(getTimeInterval, "I'm the 'timeInterval' property")
354 354 code = property(get_code, set_code)
355 355 nCode = property(get_ncode, set_ncode)
356 356 nBaud = property(get_nbaud, set_nbaud)
357 357
358 358
359 359 class Voltage(JROData):
360 360
361 361 # data es un numpy array de 2 dmensiones (canales, alturas)
362 362 data = None
363 363 data_intensity = None
364 364 data_velocity = None
365 365 data_specwidth = None
366 366 def __init__(self):
367 367 '''
368 368 Constructor
369 369 '''
370 370
371 371 self.useLocalTime = True
372 372 self.radarControllerHeaderObj = RadarControllerHeader()
373 373 self.systemHeaderObj = SystemHeader()
374 374 self.type = "Voltage"
375 375 self.data = None
376 376 # self.dtype = None
377 377 # self.nChannels = 0
378 378 # self.nHeights = 0
379 379 self.nProfiles = None
380 380 self.heightList = None
381 381 self.channelList = None
382 382 # self.channelIndexList = None
383 383 self.flagNoData = True
384 384 self.flagDiscontinuousBlock = False
385 385 self.utctime = None
386 386 self.timeZone = None
387 387 self.dstFlag = None
388 388 self.errorCount = None
389 389 self.nCohInt = None
390 390 self.blocksize = None
391 self.flagCohInt = False
391 392 self.flagDecodeData = False # assume the data has not been decoded
392 393 self.flagDeflipData = False # assume the data has not been deflipped
393 394 self.flagShiftFFT = False
394 395 self.flagDataAsBlock = False # assume the data is read profile by profile
395 396 self.profileIndex = 0
396 397
397 398 def getNoisebyHildebrand(self, channel=None):
398 399 """
399 400 Determine the noise level using the Hildebrand-Sekhon method
400 401
401 402 Return:
402 403 noiselevel
403 404 """
404 405
405 406 if channel != None:
406 407 data = self.data[channel]
407 408 nChannels = 1
408 409 else:
409 410 data = self.data
410 411 nChannels = self.nChannels
411 412
412 413 noise = numpy.zeros(nChannels)
413 414 power = data * numpy.conjugate(data)
414 415
415 416 for thisChannel in range(nChannels):
416 417 if nChannels == 1:
417 418 daux = power[:].real
418 419 else:
419 420 daux = power[thisChannel, :].real
420 421 noise[thisChannel] = hildebrand_sekhon(daux, self.nCohInt)
421 422
422 423 return noise
423 424
424 425 def getNoise(self, type=1, channel=None):
425 426
426 427 if type == 1:
427 428 noise = self.getNoisebyHildebrand(channel)
428 429
429 430 return noise
430 431
431 432 def getPower(self, channel=None):
432 433
433 434 if channel != None:
434 435 data = self.data[channel]
435 436 else:
436 437 data = self.data
437 438
438 439 power = data * numpy.conjugate(data)
439 440 powerdB = 10 * numpy.log10(power.real)
440 441 powerdB = numpy.squeeze(powerdB)
441 442
442 443 return powerdB
443 444
444 445 def getTimeInterval(self):
445 446
446 447 timeInterval = self.ippSeconds * self.nCohInt
447 448
448 449 return timeInterval
449 450
450 451 noise = property(getNoise, "I'm the 'noise' property.")
451 452 timeInterval = property(getTimeInterval, "I'm the 'timeInterval' property")
452 453
453 454
454 455 class Spectra(JROData):
455 456
456 457 # data_spc is a 3-dimensional numpy array (channels, profiles, heights)
457 458 data_spc = None
458 459 # data_cspc is a 3-dimensional numpy array (pairs, profiles, heights)
459 460 data_cspc = None
460 461 # data_dc is a 2-dimensional numpy array (channels, heights)
461 462 data_dc = None
462 463 # data power
463 464 data_pwr = None
464 465 nFFTPoints = None
465 466 # nPairs = None
466 467 pairsList = None
467 468 nIncohInt = None
468 469 wavelength = None # needed to compute the velocity range from the frequency
469 470 nCohInt = None # required to determine the value of timeInterval
470 471 ippFactor = None
471 472 profileIndex = 0
472 473 plotting = "spectra"
473 474
474 475 def __init__(self):
475 476 '''
476 477 Constructor
477 478 '''
478 479
479 480 self.useLocalTime = True
480 481 self.radarControllerHeaderObj = RadarControllerHeader()
481 482 self.systemHeaderObj = SystemHeader()
482 483 self.type = "Spectra"
483 484 # self.data = None
484 485 # self.dtype = None
485 486 # self.nChannels = 0
486 487 # self.nHeights = 0
487 488 self.nProfiles = None
488 489 self.heightList = None
489 490 self.channelList = None
490 491 # self.channelIndexList = None
491 492 self.pairsList = None
492 493 self.flagNoData = True
493 494 self.flagDiscontinuousBlock = False
494 495 self.utctime = None
495 496 self.nCohInt = None
496 497 self.nIncohInt = None
497 498 self.blocksize = None
498 499 self.nFFTPoints = None
499 500 self.wavelength = None
500 501 self.flagDecodeData = False # assume the data has not been decoded
501 502 self.flagDeflipData = False # assume the data has not been deflipped
502 503 self.flagShiftFFT = False
503 504 self.ippFactor = 1
504 505 #self.noise = None
505 506 self.beacon_heiIndexList = []
506 507 self.noise_estimation = None
507 508
508 509 def getNoisebyHildebrand(self, xmin_index=None, xmax_index=None, ymin_index=None, ymax_index=None):
509 510 """
510 511 Determine the noise level using the Hildebrand-Sekhon method
511 512
512 513 Return:
513 514 noiselevel
514 515 """
515 516
516 517 noise = numpy.zeros(self.nChannels)
517 518
518 519 for channel in range(self.nChannels):
519 520 daux = self.data_spc[channel,
520 521 xmin_index:xmax_index, ymin_index:ymax_index]
521 522 noise[channel] = hildebrand_sekhon(daux, self.nIncohInt)
522 523
523 524 return noise
524 525
525 526 def getNoise(self, xmin_index=None, xmax_index=None, ymin_index=None, ymax_index=None):
526 527
527 528 if self.noise_estimation is not None:
528 529 # this was estimated by getNoise Operation defined in jroproc_spectra.py
529 530 return self.noise_estimation
530 531 else:
531 532 noise = self.getNoisebyHildebrand(
532 533 xmin_index, xmax_index, ymin_index, ymax_index)
533 534 return noise
534 535
535 536 def getFreqRangeTimeResponse(self, extrapoints=0):
536 537
537 538 deltafreq = self.getFmaxTimeResponse() / (self.nFFTPoints * self.ippFactor)
538 539 freqrange = deltafreq * (numpy.arange(self.nFFTPoints + extrapoints) - self.nFFTPoints / 2.) - deltafreq / 2
539 540
540 541 return freqrange
541 542
542 543 def getAcfRange(self, extrapoints=0):
543 544
544 545 deltafreq = 10. / (self.getFmax() / (self.nFFTPoints * self.ippFactor))
545 546 freqrange = deltafreq * (numpy.arange(self.nFFTPoints + extrapoints) -self.nFFTPoints / 2.) - deltafreq / 2
546 547
547 548 return freqrange
548 549
549 550 def getFreqRange(self, extrapoints=0):
550 551
551 552 deltafreq = self.getFmax() / (self.nFFTPoints * self.ippFactor)
552 553 freqrange = deltafreq * (numpy.arange(self.nFFTPoints + extrapoints) -self.nFFTPoints / 2.) - deltafreq / 2
553 554
554 555 return freqrange
555 556
556 557 def getVelRange(self, extrapoints=0):
557 558
558 559 deltav = self.getVmax() / (self.nFFTPoints * self.ippFactor)
559 560 velrange = deltav * (numpy.arange(self.nFFTPoints + extrapoints) - self.nFFTPoints / 2.)
560 561
561 562 if self.nmodes:
562 563 return velrange/self.nmodes
563 564 else:
564 565 return velrange
565 566
566 567 def getNPairs(self):
567 568
568 569 return len(self.pairsList)
569 570
570 571 def getPairsIndexList(self):
571 572
572 573 return list(range(self.nPairs))
573 574
574 575 def getNormFactor(self):
575 576
576 577 pwcode = 1
577 578
578 579 if self.flagDecodeData:
579 580 pwcode = numpy.sum(self.code[0]**2)
580 581 #normFactor = min(self.nFFTPoints,self.nProfiles)*self.nIncohInt*self.nCohInt*pwcode*self.windowOfFilter
581 582 normFactor = self.nProfiles * self.nIncohInt * self.nCohInt * pwcode * self.windowOfFilter
582 583
583 584 return normFactor
584 585
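Numerically, the normalization factor above is just the product of the integration counts, the code power (when decoding was applied), and the filter window; with illustrative values (the code sequence is hypothetical):

```python
import numpy

# Illustrative inputs mirroring getNormFactor.
nProfiles, nIncohInt, nCohInt, windowOfFilter = 128, 10, 2, 1
flagDecodeData = True
code = numpy.array([[1, -1, 1, 1]])    # hypothetical code sequence

pwcode = 1
if flagDecodeData:
    pwcode = numpy.sum(code[0] ** 2)   # power of the transmitted code

normFactor = nProfiles * nIncohInt * nCohInt * pwcode * windowOfFilter
```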
585 586 def getFlagCspc(self):
586 587
587 588 if self.data_cspc is None:
588 589 return True
589 590
590 591 return False
591 592
592 593 def getFlagDc(self):
593 594
594 595 if self.data_dc is None:
595 596 return True
596 597
597 598 return False
598 599
599 600 def getTimeInterval(self):
600 601
601 602 timeInterval = self.ippSeconds * self.nCohInt * self.nIncohInt * self.nProfiles * self.ippFactor
602 603 if self.nmodes:
603 604 return self.nmodes*timeInterval
604 605 else:
605 606 return timeInterval
606 607
607 608 def getPower(self):
608 609
609 610 factor = self.normFactor
610 611 z = self.data_spc / factor
611 612 z = numpy.where(numpy.isfinite(z), z, numpy.NAN)
612 613 avg = numpy.average(z, axis=1)
613 614
614 615 return 10 * numpy.log10(avg)
615 616
616 617 def getCoherence(self, pairsList=None, phase=False):
617 618
618 619 z = []
619 620 if pairsList is None:
620 621 pairsIndexList = self.pairsIndexList
621 622 else:
622 623 pairsIndexList = []
623 624 for pair in pairsList:
624 625 if pair not in self.pairsList:
625 626 raise ValueError("Pair %s is not in dataOut.pairsList" % (
626 627 pair))
627 628 pairsIndexList.append(self.pairsList.index(pair))
628 629 for i in range(len(pairsIndexList)):
629 630 pair = self.pairsList[pairsIndexList[i]]
630 631 ccf = numpy.average(self.data_cspc[pairsIndexList[i], :, :], axis=0)
631 632 powa = numpy.average(self.data_spc[pair[0], :, :], axis=0)
632 633 powb = numpy.average(self.data_spc[pair[1], :, :], axis=0)
633 634 avgcoherenceComplex = ccf / numpy.sqrt(powa * powb)
634 635 if phase:
635 636 data = numpy.arctan2(avgcoherenceComplex.imag,
636 637 avgcoherenceComplex.real) * 180 / numpy.pi
637 638 else:
638 639 data = numpy.abs(avgcoherenceComplex)
639 640
640 641 z.append(data)
641 642
642 643 return numpy.array(z)
643 644
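The loop in getCoherence above reduces to: average the cross-spectrum over profiles, then normalize by the geometric mean of the two averaged auto-spectra. A standalone sketch for a single pair (hypothetical helper, same numpy steps):

```python
import numpy

def coherence(cspc_pair, spc_a, spc_b, phase=False):
    # cspc_pair: cross-spectrum of the pair; spc_a/spc_b: auto-spectra.
    # All arrays are shaped (profiles, heights) for one channel pair.
    ccf = numpy.average(cspc_pair, axis=0)
    powa = numpy.average(spc_a, axis=0)
    powb = numpy.average(spc_b, axis=0)
    coh = ccf / numpy.sqrt(powa * powb)
    if phase:
        return numpy.arctan2(coh.imag, coh.real) * 180 / numpy.pi
    return numpy.abs(coh)
```

A channel paired with itself gives coherence 1 and phase 0; a sign-flipped copy gives coherence 1 and phase 180.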
644 645 def setValue(self, value):
645 646
646 647 print("This property should not be initialized")
647 648
648 649 return
649 650
650 651 nPairs = property(getNPairs, setValue, "I'm the 'nPairs' property.")
651 652 pairsIndexList = property(
652 653 getPairsIndexList, setValue, "I'm the 'pairsIndexList' property.")
653 654 normFactor = property(getNormFactor, setValue,
654 655 "I'm the 'getNormFactor' property.")
655 656 flag_cspc = property(getFlagCspc, setValue)
656 657 flag_dc = property(getFlagDc, setValue)
657 658 noise = property(getNoise, setValue, "I'm the 'noise' property.")
658 659 timeInterval = property(getTimeInterval, setValue,
659 660 "I'm the 'timeInterval' property")
660 661
661 662
662 663 class SpectraHeis(Spectra):
663 664
664 665 data_spc = None
665 666 data_cspc = None
666 667 data_dc = None
667 668 nFFTPoints = None
668 669 # nPairs = None
669 670 pairsList = None
670 671 nCohInt = None
671 672 nIncohInt = None
672 673
673 674 def __init__(self):
674 675
675 676 self.radarControllerHeaderObj = RadarControllerHeader()
676 677
677 678 self.systemHeaderObj = SystemHeader()
678 679
679 680 self.type = "SpectraHeis"
680 681
681 682 # self.dtype = None
682 683
683 684 # self.nChannels = 0
684 685
685 686 # self.nHeights = 0
686 687
687 688 self.nProfiles = None
688 689
689 690 self.heightList = None
690 691
691 692 self.channelList = None
692 693
693 694 # self.channelIndexList = None
694 695
695 696 self.flagNoData = True
696 697
697 698 self.flagDiscontinuousBlock = False
698 699
699 700 # self.nPairs = 0
700 701
701 702 self.utctime = None
702 703
703 704 self.blocksize = None
704 705
705 706 self.profileIndex = 0
706 707
707 708 self.nCohInt = 1
708 709
709 710 self.nIncohInt = 1
710 711
711 712 def getNormFactor(self):
712 713 pwcode = 1
713 714 if self.flagDecodeData:
714 715 pwcode = numpy.sum(self.code[0]**2)
715 716
716 717 normFactor = self.nIncohInt * self.nCohInt * pwcode
717 718
718 719 return normFactor
719 720
720 721 def getTimeInterval(self):
721 722
722 723 timeInterval = self.ippSeconds * self.nCohInt * self.nIncohInt
723 724
724 725 return timeInterval
725 726
726 727 normFactor = property(getNormFactor, "I'm the 'getNormFactor' property.")
727 728 timeInterval = property(getTimeInterval, "I'm the 'timeInterval' property")
728 729
729 730
730 731 class Fits(JROData):
731 732
732 733 heightList = None
733 734 channelList = None
734 735 flagNoData = True
735 736 flagDiscontinuousBlock = False
736 737 useLocalTime = False
737 738 utctime = None
738 739 timeZone = None
739 740 # ippSeconds = None
740 741 # timeInterval = None
741 742 nCohInt = None
742 743 nIncohInt = None
743 744 noise = None
744 745 windowOfFilter = 1
745 746 # Speed of light
746 747 C = 3e8
747 748 frequency = 49.92e6
748 749 realtime = False
749 750
750 751 def __init__(self):
751 752
752 753 self.type = "Fits"
753 754
754 755 self.nProfiles = None
755 756
756 757 self.heightList = None
757 758
758 759 self.channelList = None
759 760
760 761 # self.channelIndexList = None
761 762
762 763 self.flagNoData = True
763 764
764 765 self.utctime = None
765 766
766 767 self.nCohInt = 1
767 768
768 769 self.nIncohInt = 1
769 770
770 771 self.useLocalTime = True
771 772
772 773 self.profileIndex = 0
773 774
774 775 # self.utctime = None
775 776 # self.timeZone = None
776 777 # self.ltctime = None
777 778 # self.timeInterval = None
778 779 # self.header = None
779 780 # self.data_header = None
780 781 # self.data = None
781 782 # self.datatime = None
782 783 # self.flagNoData = False
783 784 # self.expName = ''
784 785 # self.nChannels = None
785 786 # self.nSamples = None
786 787 # self.dataBlocksPerFile = None
787 788 # self.comments = ''
788 789 #
789 790
790 791 def getltctime(self):
791 792
792 793 if self.useLocalTime:
793 794 return self.utctime - self.timeZone * 60
794 795
795 796 return self.utctime
796 797
797 798 def getDatatime(self):
798 799
799 800 datatime = datetime.datetime.utcfromtimestamp(self.ltctime)
800 801 return datatime
801 802
802 803 def getTimeRange(self):
803 804
804 805 datatime = []
805 806
806 807 datatime.append(self.ltctime)
807 808 datatime.append(self.ltctime + self.timeInterval)
808 809
809 810 datatime = numpy.array(datatime)
810 811
811 812 return datatime
812 813
813 814 def getHeiRange(self):
814 815
815 816 heis = self.heightList
816 817
817 818 return heis
818 819
819 820 def getNHeights(self):
820 821
821 822 return len(self.heightList)
822 823
823 824 def getNChannels(self):
824 825
825 826 return len(self.channelList)
826 827
827 828 def getChannelIndexList(self):
828 829
829 830 return list(range(self.nChannels))
830 831
831 832 def getNoise(self, type=1):
832 833
833 834 #noise = numpy.zeros(self.nChannels)
834 835
835 836 if type == 1:
836 837 noise = self.getNoisebyHildebrand()
837 838
838 839 if type == 2:
839 840 noise = self.getNoisebySort()
840 841
841 842 if type == 3:
842 843 noise = self.getNoisebyWindow()
843 844
844 845 return noise
845 846
846 847 def getTimeInterval(self):
847 848
848 849 timeInterval = self.ippSeconds * self.nCohInt * self.nIncohInt
849 850
850 851 return timeInterval
851 852
852 853 def get_ippSeconds(self):
853 854 '''
854 855 '''
855 856 return self.ipp_sec
856 857
857 858
858 859 datatime = property(getDatatime, "I'm the 'datatime' property")
859 860 nHeights = property(getNHeights, "I'm the 'nHeights' property.")
860 861 nChannels = property(getNChannels, "I'm the 'nChannels' property.")
861 862 channelIndexList = property(
862 863 getChannelIndexList, "I'm the 'channelIndexList' property.")
863 864 noise = property(getNoise, "I'm the 'noise' property.")
864 865
865 866 ltctime = property(getltctime, "I'm the 'ltctime' property")
866 867 timeInterval = property(getTimeInterval, "I'm the 'timeInterval' property")
867 868 ippSeconds = property(get_ippSeconds, '')
868 869
869 870 class Correlation(JROData):
870 871
871 872 noise = None
872 873 SNR = None
873 874 #--------------------------------------------------
874 875 mode = None
875 876 split = False
876 877 data_cf = None
877 878 lags = None
878 879 lagRange = None
879 880 pairsList = None
880 881 normFactor = None
881 882 #--------------------------------------------------
882 883 # calculateVelocity = None
883 884 nLags = None
884 885 nPairs = None
885 886 nAvg = None
886 887
887 888 def __init__(self):
888 889 '''
889 890 Constructor
890 891 '''
891 892 self.radarControllerHeaderObj = RadarControllerHeader()
892 893
893 894 self.systemHeaderObj = SystemHeader()
894 895
895 896 self.type = "Correlation"
896 897
897 898 self.data = None
898 899
899 900 self.dtype = None
900 901
901 902 self.nProfiles = None
902 903
903 904 self.heightList = None
904 905
905 906 self.channelList = None
906 907
907 908 self.flagNoData = True
908 909
909 910 self.flagDiscontinuousBlock = False
910 911
911 912 self.utctime = None
912 913
913 914 self.timeZone = None
914 915
915 916 self.dstFlag = None
916 917
917 918 self.errorCount = None
918 919
919 920 self.blocksize = None
920 921
921 922 self.flagDecodeData = False # assume the data is not decoded
922 923
923 924 self.flagDeflipData = False # assume the data has not been deflipped
924 925
925 926 self.pairsList = None
926 927
927 928 self.nPoints = None
928 929
929 930 def getPairsList(self):
930 931
931 932 return self.pairsList
932 933
933 934 def getNoise(self, mode=2):
934 935
935 936 indR = numpy.where(self.lagR == 0)[0][0]
936 937 indT = numpy.where(self.lagT == 0)[0][0]
937 938
938 939 jspectra0 = self.data_corr[:, :, indR, :]
939 940 jspectra = copy.copy(jspectra0)
940 941
941 942 num_chan = jspectra.shape[0]
942 943 num_hei = jspectra.shape[2]
943 944
944 945 freq_dc = jspectra.shape[1] // 2 # integer division: freq_dc is used as an array index
945 946 ind_vel = numpy.array([-2, -1, 1, 2]) + freq_dc
946 947
947 948 if ind_vel[0] < 0:
948 949 ind_vel[list(range(0, 1))] = ind_vel[list(
949 950 range(0, 1))] + self.num_prof
950 951
951 952 if mode == 1:
952 953 jspectra[:, freq_dc, :] = (
953 954 jspectra[:, ind_vel[1], :] + jspectra[:, ind_vel[2], :]) / 2 # correction
954 955
955 956 if mode == 2:
956 957
957 958 vel = numpy.array([-2, -1, 1, 2])
958 959 xx = numpy.zeros([4, 4])
959 960
960 961 for fil in range(4):
961 962 xx[fil, :] = vel[fil]**numpy.asarray(list(range(4)))
962 963
963 964 xx_inv = numpy.linalg.inv(xx)
964 965 xx_aux = xx_inv[0, :]
965 966
966 967 for ich in range(num_chan):
967 968 yy = jspectra[ich, ind_vel, :]
968 969 jspectra[ich, freq_dc, :] = numpy.dot(xx_aux, yy)
969 970
970 971 junkid = jspectra[ich, freq_dc, :] <= 0
971 972 cjunkid = sum(junkid)
972 973
973 974 if cjunkid.any():
974 975 jspectra[ich, freq_dc, junkid.nonzero()] = (
975 976 jspectra[ich, ind_vel[1], junkid] + jspectra[ich, ind_vel[2], junkid]) / 2
976 977
977 978 noise = jspectra0[:, freq_dc, :] - jspectra[:, freq_dc, :]
978 979
979 980 return noise
980 981
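The mode-2 branch of `getNoise` above estimates the DC line by fitting a cubic through the spectrum at velocity bins ±1 and ±2 and evaluating the fit at zero velocity; the first row of the inverted Vandermonde matrix performs exactly that evaluation. A self-contained check with synthetic values (not radar data):

```python
import numpy

vel = numpy.array([-2, -1, 1, 2])
# Vandermonde matrix: row i is [1, v, v**2, v**3] for v = vel[i]
xx = numpy.array([v ** numpy.arange(4) for v in vel], dtype=float)
xx_aux = numpy.linalg.inv(xx)[0, :]  # row that extracts the value at v = 0

# For samples drawn from an exact cubic, the dot product recovers f(0)
f = lambda v: 1 + 2 * v + 3 * v ** 2 + v ** 3
yy = f(vel.astype(float))
dc_estimate = numpy.dot(xx_aux, yy)  # ~ f(0) = 1.0
```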
981 982 def getTimeInterval(self):
982 983
983 984 timeInterval = self.ippSeconds * self.nCohInt * self.nProfiles
984 985
985 986 return timeInterval
986 987
987 988 def splitFunctions(self):
988 989
989 990 pairsList = self.pairsList
990 991 ccf_pairs = []
991 992 acf_pairs = []
992 993 ccf_ind = []
993 994 acf_ind = []
994 995 for l in range(len(pairsList)):
995 996 chan0 = pairsList[l][0]
996 997 chan1 = pairsList[l][1]
997 998
998 999 # Collect autocorrelation pairs
999 1000 if chan0 == chan1:
1000 1001 acf_pairs.append(chan0)
1001 1002 acf_ind.append(l)
1002 1003 else:
1003 1004 ccf_pairs.append(pairsList[l])
1004 1005 ccf_ind.append(l)
1005 1006
1006 1007 data_acf = self.data_cf[acf_ind]
1007 1008 data_ccf = self.data_cf[ccf_ind]
1008 1009
1009 1010 return acf_ind, ccf_ind, acf_pairs, ccf_pairs, data_acf, data_ccf
1010 1011
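The pair-splitting logic of `splitFunctions` can be sketched as a standalone helper (hypothetical name `split_pairs`; the real method also slices `data_cf` with the resulting indices):

```python
def split_pairs(pairs_list):
    """Separate auto-correlation channels from cross-correlation pairs,
    keeping the original indices so data rows can be selected later."""
    acf_pairs, ccf_pairs, acf_ind, ccf_ind = [], [], [], []
    for i, (ch0, ch1) in enumerate(pairs_list):
        if ch0 == ch1:
            acf_pairs.append(ch0)
            acf_ind.append(i)
        else:
            ccf_pairs.append((ch0, ch1))
            ccf_ind.append(i)
    return acf_ind, ccf_ind, acf_pairs, ccf_pairs

result = split_pairs([(0, 0), (0, 1), (1, 1)])
```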
1011 1012 def getNormFactor(self):
1012 1013 acf_ind, ccf_ind, acf_pairs, ccf_pairs, data_acf, data_ccf = self.splitFunctions()
1013 1014 acf_pairs = numpy.array(acf_pairs)
1014 1015 normFactor = numpy.zeros((self.nPairs, self.nHeights))
1015 1016
1016 1017 for p in range(self.nPairs):
1017 1018 pair = self.pairsList[p]
1018 1019
1019 1020 ch0 = pair[0]
1020 1021 ch1 = pair[1]
1021 1022
1022 1023 ch0_max = numpy.max(data_acf[acf_pairs == ch0, :, :], axis=1)
1023 1024 ch1_max = numpy.max(data_acf[acf_pairs == ch1, :, :], axis=1)
1024 1025 normFactor[p, :] = numpy.sqrt(ch0_max * ch1_max)
1025 1026
1026 1027 return normFactor
1027 1028
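The normalization factor computed above for each cross-correlation pair is the geometric mean of the two channels' autocorrelation maxima; a minimal sketch (the helper name `norm_factor` is ours):

```python
import numpy

def norm_factor(acf_max_ch0, acf_max_ch1):
    # Geometric mean of the per-height autocorrelation maxima of the
    # two channels in the pair
    return numpy.sqrt(acf_max_ch0 * acf_max_ch1)

nf = norm_factor(numpy.array([4.0, 9.0]), numpy.array([9.0, 4.0]))
```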
1028 1029 timeInterval = property(getTimeInterval, "I'm the 'timeInterval' property")
1029 1030 normFactor = property(getNormFactor, "I'm the 'normFactor' property")
1030 1031
1031 1032
1032 1033 class Parameters(Spectra):
1033 1034
1034 1035 experimentInfo = None # Information about the experiment
1035 1036 # Information from previous data
1036 1037 inputUnit = None # Type of data to be processed
1037 1038 operation = None # Type of operation to parametrize
1038 1039 # normFactor = None #Normalization Factor
1039 1040 groupList = None # List of Pairs, Groups, etc
1040 1041 # Parameters
1041 1042 data_param = None # Parameters obtained
1042 1043 data_pre = None # Data Pre Parametrization
1043 1044 data_SNR = None # Signal to Noise Ratio
1044 1045 # heightRange = None #Heights
1045 1046 abscissaList = None # Abscissa, can be velocities, lags or time
1046 1047 # noise = None #Noise Potency
1047 1048 utctimeInit = None # Initial UTC time
1048 1049 paramInterval = None # Time interval to calculate Parameters in seconds
1049 1050 useLocalTime = True
1050 1051 # Fitting
1051 1052 data_error = None # Error of the estimation
1052 1053 constants = None
1053 1054 library = None
1054 1055 # Output signal
1055 1056 outputInterval = None # Time interval to calculate output signal in seconds
1056 1057 data_output = None # Out signal
1057 1058 nAvg = None
1058 1059 noise_estimation = None
1059 1060 GauSPC = None # Fit gaussian SPC
1060 1061
1061 1062 def __init__(self):
1062 1063 '''
1063 1064 Constructor
1064 1065 '''
1065 1066 self.radarControllerHeaderObj = RadarControllerHeader()
1066 1067
1067 1068 self.systemHeaderObj = SystemHeader()
1068 1069
1069 1070 self.type = "Parameters"
1070 1071
1071 1072 def getTimeRange1(self, interval):
1072 1073
1073 1074 datatime = []
1074 1075
1075 1076 if self.useLocalTime:
1076 1077 time1 = self.utctimeInit - self.timeZone * 60
1077 1078 else:
1078 1079 time1 = self.utctimeInit
1079 1080
1080 1081 datatime.append(time1)
1081 1082 datatime.append(time1 + interval)
1082 1083 datatime = numpy.array(datatime)
1083 1084
1084 1085 return datatime
1085 1086
1086 1087 def getTimeInterval(self):
1087 1088
1088 1089 if hasattr(self, 'timeInterval1'):
1089 1090 return self.timeInterval1
1090 1091 else:
1091 1092 return self.paramInterval
1092 1093
1093 1094 def setValue(self, value):
1094 1095
1095 1096 print("This property should not be initialized")
1096 1097
1097 1098 return
1098 1099
1099 1100 def getNoise(self):
1100 1101
1101 1102 return self.spc_noise
1102 1103
1103 1104 timeInterval = property(getTimeInterval)
1104 1105 noise = property(getNoise, setValue, "I'm the 'Noise' property.")
1105 1106
1106 1107
1107 1108 class PlotterData(object):
1108 1109 '''
1109 1110 Object to hold data to be plotted
1110 1111 '''
1111 1112
1112 1113 MAXNUMX = 200
1113 1114 MAXNUMY = 200
1114 1115
1115 1116 def __init__(self, code, throttle_value, exp_code, buffering=True, snr=False):
1116 1117
1117 1118 self.key = code
1118 1119 self.throttle = throttle_value
1119 1120 self.exp_code = exp_code
1120 1121 self.buffering = buffering
1121 1122 self.ready = False
1122 1123 self.flagNoData = False
1123 1124 self.localtime = False
1124 1125 self.data = {}
1125 1126 self.meta = {}
1126 1127 self.__heights = []
1127 1128
1128 1129 if 'snr' in code:
1129 1130 self.plottypes = ['snr']
1130 1131 elif code == 'spc':
1131 1132 self.plottypes = ['spc', 'noise', 'rti']
1132 1133 elif code == 'cspc':
1133 1134 self.plottypes = ['cspc', 'spc', 'noise', 'rti']
1134 1135 elif code == 'rti':
1135 1136 self.plottypes = ['noise', 'rti']
1136 1137 else:
1137 1138 self.plottypes = [code]
1138 1139
1139 1140 if 'snr' not in self.plottypes and snr:
1140 1141 self.plottypes.append('snr')
1141 1142
1142 1143 for plot in self.plottypes:
1143 1144 self.data[plot] = {}
1144 1145
1145 1146
1146 1147 def __str__(self):
1147 1148 dum = ['{}{}'.format(key, self.shape(key)) for key in self.data]
1148 1149 return 'Data[{}][{}]'.format(';'.join(dum), len(self.times))
1149 1150
1150 1151 def __len__(self):
1151 1152 return len(self.data[self.key])
1152 1153
1153 1154 def __getitem__(self, key):
1154 1155
1155 1156 if key not in self.data:
1156 1157 raise KeyError(log.error('Missing key: {}'.format(key)))
1157 1158 if 'spc' in key or not self.buffering:
1158 1159 ret = self.data[key][self.tm]
1159 1160 elif 'scope' in key:
1160 1161 ret = numpy.array(self.data[key][float(self.tm)])
1161 1162 else:
1162 1163 ret = numpy.array([self.data[key][x] for x in self.times])
1163 1164 if ret.ndim > 1:
1164 1165 ret = numpy.swapaxes(ret, 0, 1)
1165 1166 return ret
1166 1167
1167 1168 def __contains__(self, key):
1168 1169 return key in self.data
1169 1170
1170 1171 def setup(self):
1171 1172 '''
1172 1173 Configure object
1173 1174 '''
1174 1175 self.type = ''
1175 1176 self.ready = False
1176 1177 del self.data
1177 1178 self.data = {}
1178 1179 self.__heights = []
1179 1180 self.__all_heights = set()
1180 1181 for plot in self.plottypes:
1181 1182 if 'snr' in plot:
1182 1183 plot = 'snr'
1183 1184 elif 'spc_moments' == plot:
1184 1185 plot = 'moments'
1185 1186 self.data[plot] = {}
1186 1187
1187 1188 if 'spc' in self.data or 'rti' in self.data or 'cspc' in self.data or 'moments' in self.data:
1188 1189 self.data['noise'] = {}
1189 1190 self.data['rti'] = {}
1190 1191 if 'noise' not in self.plottypes:
1191 1192 self.plottypes.append('noise')
1192 1193 if 'rti' not in self.plottypes:
1193 1194 self.plottypes.append('rti')
1194 1195
1195 1196 def shape(self, key):
1196 1197 '''
1197 1198 Get the shape of the one-element data for the given key
1198 1199 '''
1199 1200
1200 1201 if len(self.data[key]):
1201 1202 if 'spc' in key or not self.buffering:
1202 1203 return self.data[key].shape
1203 1204 return self.data[key][self.times[0]].shape
1204 1205 return (0,)
1205 1206
1206 1207 def update(self, dataOut, tm):
1207 1208 '''
1208 1209 Update data object with new dataOut
1209 1210 '''
1210 1211
1211 1212 self.profileIndex = dataOut.profileIndex
1212 1213 self.tm = tm
1213 1214 self.type = dataOut.type
1214 1215 self.parameters = getattr(dataOut, 'parameters', [])
1215 1216
1216 1217 if hasattr(dataOut, 'meta'):
1217 1218 self.meta.update(dataOut.meta)
1218 1219
1219 1220 if hasattr(dataOut, 'pairsList'):
1220 1221 self.pairs = dataOut.pairsList
1221 1222
1222 1223 self.interval = dataOut.getTimeInterval()
1223 1224 self.localtime = dataOut.useLocalTime
1224 1225 if any('spc' in ptype for ptype in self.plottypes):
1225 1226 self.xrange = (dataOut.getFreqRange(1)/1000.,
1226 1227 dataOut.getAcfRange(1), dataOut.getVelRange(1))
1227 1228 self.__heights.append(dataOut.heightList)
1228 1229 self.__all_heights.update(dataOut.heightList)
1229 1230
1230 1231 for plot in self.plottypes:
1231 1232 if plot in ('spc', 'spc_moments', 'spc_cut'):
1232 1233 z = dataOut.data_spc/dataOut.normFactor
1233 1234 buffer = 10*numpy.log10(z)
1234 1235 if plot == 'cspc':
1235 1236 buffer = (dataOut.data_spc, dataOut.data_cspc)
1236 1237 if plot == 'noise':
1237 1238 buffer = 10*numpy.log10(dataOut.getNoise()/dataOut.normFactor)
1238 1239 if plot in ('rti', 'spcprofile'):
1239 1240 buffer = dataOut.getPower()
1240 1241 if plot == 'snr_db':
1241 1242 buffer = dataOut.data_SNR
1242 1243 if plot == 'snr':
1243 1244 buffer = 10*numpy.log10(dataOut.data_SNR)
1244 1245 if plot == 'dop':
1245 1246 buffer = dataOut.data_DOP
1246 1247 if plot == 'pow':
1247 1248 buffer = 10*numpy.log10(dataOut.data_POW)
1248 1249 if plot == 'width':
1249 1250 buffer = dataOut.data_WIDTH
1250 1251 if plot == 'coh':
1251 1252 buffer = dataOut.getCoherence()
1252 1253 if plot == 'phase':
1253 1254 buffer = dataOut.getCoherence(phase=True)
1254 1255 if plot == 'output':
1255 1256 buffer = dataOut.data_output
1256 1257 if plot == 'param':
1257 1258 buffer = dataOut.data_param
1258 1259 if plot == 'scope':
1259 1260 buffer = dataOut.data
1260 1261 self.flagDataAsBlock = dataOut.flagDataAsBlock
1261 1262 self.nProfiles = dataOut.nProfiles
1262 1263 if plot == 'pp_power':
1263 1264 buffer = dataOut.data_intensity
1264 1265 self.flagDataAsBlock = dataOut.flagDataAsBlock
1265 1266 self.nProfiles = dataOut.nProfiles
1266 1267 if plot == 'pp_velocity':
1267 1268 buffer = dataOut.data_velocity
1268 1269 self.flagDataAsBlock = dataOut.flagDataAsBlock
1269 1270 self.nProfiles = dataOut.nProfiles
1270 1271 if plot == 'pp_specwidth':
1271 1272 buffer = dataOut.data_specwidth
1272 1273 self.flagDataAsBlock = dataOut.flagDataAsBlock
1273 1274 self.nProfiles = dataOut.nProfiles
1274 1275
1275 1276 if plot == 'spc':
1276 1277 self.data['spc'][tm] = buffer
1277 1278 elif plot == 'cspc':
1278 1279 self.data['cspc'][tm] = buffer
1279 1280 elif plot == 'spc_moments':
1280 1281 self.data['spc'][tm] = buffer
1281 1282 self.data['moments'][tm] = dataOut.moments
1282 1283 else:
1283 1284 # the buffering and non-buffering branches stored the same value;
1284 1285 # buffering only affects how the data is retrieved later
1285 1286 self.data[plot][tm] = buffer
1287 1288
1288 1289 if dataOut.channelList is None:
1289 1290 self.channels = range(buffer.shape[0])
1290 1291 else:
1291 1292 self.channels = dataOut.channelList
1292 1293
1293 1294 if buffer is None:
1294 1295 self.flagNoData = True
1295 1296 raise schainpy.admin.SchainWarning('Attribute data_{} is empty'.format(self.key))
1296 1297
1297 1298 def normalize_heights(self):
1298 1299 '''
1299 1300 Ensure the data have the same dimensions across different heightList values
1300 1301 '''
1301 1302
1302 1303 H = numpy.array(list(self.__all_heights))
1303 1304 H.sort()
1304 1305 for key in self.data:
1305 1306 shape = self.shape(key)[:-1] + H.shape
1306 1307 for tm, obj in list(self.data[key].items()):
1307 1308 h = self.__heights[self.times.index(tm)]
1308 1309 if H.size == h.size:
1309 1310 continue
1310 1311 index = numpy.where(numpy.in1d(H, h))[0]
1311 1312 dummy = numpy.zeros(shape) + numpy.nan
1312 1313 if len(shape) == 2:
1313 1314 dummy[:, index] = obj
1314 1315 else:
1315 1316 dummy[index] = obj
1316 1317 self.data[key][tm] = dummy
1317 1318
1318 1319 self.__heights = [H for tm in self.times]
1319 1320
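The alignment step above can be exercised in isolation: values sampled at a partial height grid are placed onto the common grid, with missing heights filled with NaN (the helper name `align_heights` is ours; `numpy.isin` mirrors the `numpy.in1d` call used above):

```python
import numpy

def align_heights(values, h, H):
    """Place `values` (sampled at heights `h`) onto the common grid `H`,
    filling missing heights with NaN."""
    index = numpy.where(numpy.isin(H, h))[0]
    dummy = numpy.full(H.shape, numpy.nan)
    dummy[index] = values
    return dummy

out = align_heights(numpy.array([1.0, 3.0]),
                    numpy.array([100., 300.]),
                    numpy.array([100., 200., 300.]))
# out -> [1., nan, 3.]
```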
1320 1321 def jsonify(self, tm, plot_name, plot_type, decimate=False):
1321 1322 '''
1322 1323 Convert data to json
1323 1324 '''
1324 1325
1325 1326 dy = int(self.heights.size/self.MAXNUMY) + 1
1326 1327 if self.key in ('spc', 'cspc'):
1327 1328 dx = int(self.data[self.key][tm].shape[1]/self.MAXNUMX) + 1
1328 1329 data = self.roundFloats(
1329 1330 self.data[self.key][tm][::, ::dx, ::dy].tolist())
1330 1331 else:
1331 1332 if self.key == 'noise':
1332 1333 data = [[x] for x in self.roundFloats(self.data[self.key][tm].tolist())]
1333 1334 else:
1334 1335 data = self.roundFloats(self.data[self.key][tm][::, ::dy].tolist())
1335 1336
1336 1337 meta = {}
1337 1338 ret = {
1338 1339 'plot': plot_name,
1339 1340 'code': self.exp_code,
1340 1341 'time': float(tm),
1341 1342 'data': data,
1342 1343 }
1343 1344 meta['type'] = plot_type
1344 1345 meta['interval'] = float(self.interval)
1345 1346 meta['localtime'] = self.localtime
1346 1347 meta['yrange'] = self.roundFloats(self.heights[::dy].tolist())
1347 1348 if 'spc' in self.data or 'cspc' in self.data:
1348 1349 meta['xrange'] = self.roundFloats(self.xrange[2][::dx].tolist())
1349 1350 else:
1350 1351 meta['xrange'] = []
1351 1352
1352 1353 meta.update(self.meta)
1353 1354 ret['metadata'] = meta
1354 1355 return json.dumps(ret)
1355 1356
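The decimation strides used in `jsonify` above (`dx`, `dy`) follow the pattern `int(size / MAX) + 1`, which guarantees at most `MAX` samples survive; a quick sketch (helper name ours):

```python
def decimation_stride(n_points, max_points):
    # Smallest integer stride of this form that keeps at most
    # `max_points` samples when slicing with [::stride]
    return int(n_points / max_points) + 1

stride = decimation_stride(450, 200)   # 3
kept = len(range(0, 450, stride))      # 150, which is <= 200
```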
1356 1357 @property
1357 1358 def times(self):
1358 1359 '''
1359 1360 Return the list of times of the current data
1360 1361 '''
1361 1362
1362 1363 ret = numpy.array([*self.data[self.key]])
1363 1364 if self:
1364 1365 ret.sort()
1365 1366 return ret
1366 1367
1367 1368 @property
1368 1369 def min_time(self):
1369 1370 '''
1370 1371 Return the minimum time value
1371 1372 '''
1372 1373
1373 1374 return self.times[0]
1374 1375
1375 1376 @property
1376 1377 def max_time(self):
1377 1378 '''
1378 1379 Return the maximum time value
1379 1380 '''
1380 1381
1381 1382 return self.times[-1]
1382 1383
1383 1384 @property
1384 1385 def heights(self):
1385 1386 '''
1386 1387 Return the list of heights of the current data
1387 1388 '''
1388 1389
1389 1390 return numpy.array(self.__heights[-1])
1390 1391
1391 1392 @staticmethod
1392 1393 def roundFloats(obj):
1393 1394 if isinstance(obj, list):
1394 1395 return list(map(PlotterData.roundFloats, obj))
1395 1396 elif isinstance(obj, float):
1396 1397 return round(obj, 2)
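The recursion above can be tried standalone; this sketch adds an explicit pass-through for non-float, non-list values (the original relies on code outside this excerpt for that case):

```python
def round_floats(obj):
    # Recursively round floats to 2 decimals, leaving other types intact
    if isinstance(obj, list):
        return [round_floats(x) for x in obj]
    if isinstance(obj, float):
        return round(obj, 2)
    return obj

rounded = round_floats([3.14159, [2.71828, 'label']])
```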
@@ -1,1580 +1,1575
1 1 """
2 2 Created on Jul 2, 2014
3 3
4 4 @author: roj-idl71
5 5 """
6 6 import os
7 7 import sys
8 8 import glob
9 9 import time
10 10 import numpy
11 11 import fnmatch
12 12 import inspect
13 13 import time
14 14 import datetime
15 15 import zmq
16 16
17 17 from schainpy.model.proc.jroproc_base import Operation, MPDecorator
18 18 from schainpy.model.data.jroheaderIO import PROCFLAG, BasicHeader, SystemHeader, RadarControllerHeader, ProcessingHeader
19 19 from schainpy.model.data.jroheaderIO import get_dtype_index, get_numpy_dtype, get_procflag_dtype, get_dtype_width
20 20 from schainpy.utils import log
21 21 import schainpy.admin
22 22
23 23 LOCALTIME = True
24 24 DT_DIRECTIVES = {
25 25 '%Y': 4,
26 26 '%y': 2,
27 27 '%m': 2,
28 28 '%d': 2,
29 29 '%j': 3,
30 30 '%H': 2,
31 31 '%M': 2,
32 32 '%S': 2,
33 33 '%f': 6
34 34 }
35 35
36 36
37 37 def isNumber(cad):
38 38 """
39 39 Check whether the characters composing a string can be converted to a number.
40 40
41 41 Exceptions:
42 42 If the given string cannot be converted to a number
43 43 Input:
44 44 cad : string to analyze to determine whether it is convertible to a number
45 45
46 46 Return:
47 47 True : the string is numeric
48 48 False : the string is not numeric
49 49 """
50 50 try:
51 51 float(cad)
52 52 return True
53 53 except:
54 54 return False
55 55
56 56
57 57 def isFileInEpoch(filename, startUTSeconds, endUTSeconds):
58 58 """
59 59 This function determines whether a data file lies within the specified date range.
60 60
61 61 Inputs:
62 62 filename : full name of the data file in Jicamarca format (.r)
63 63
64 64 startUTSeconds : start date of the selected range, given in
65 65 seconds counted since 01/01/1970.
66 66 endUTSeconds : end date of the selected range, given in
67 67 seconds counted since 01/01/1970.
68 68
69 69 Return:
70 70 Boolean : True if the data file contains data within the
71 71 specified date range, otherwise False.
72 72
73 73 Exceptions:
74 74 If the file does not exist or cannot be opened
75 75 If the header cannot be read.
76 76
77 77 """
78 78 basicHeaderObj = BasicHeader(LOCALTIME)
79 79
80 80 try:
81 81 fp = open(filename, 'rb')
82 82 except IOError:
83 83 print("The file %s can't be opened" % (filename))
84 84 return 0
85 85
86 86 sts = basicHeaderObj.read(fp)
87 87 fp.close()
88 88
89 89 if not(sts):
90 90 print("Skipping the file %s because it does not have a valid header" % (filename))
91 91 return 0
92 92
93 93 if not ((startUTSeconds <= basicHeaderObj.utc) and (endUTSeconds > basicHeaderObj.utc)):
94 94 return 0
95 95
96 96 return 1
97 97
98 98
99 99 def isTimeInRange(thisTime, startTime, endTime):
100 100 if endTime >= startTime:
101 101 if (thisTime < startTime) or (thisTime > endTime):
102 102 return 0
103 103 return 1
104 104 else:
105 105 if (thisTime < startTime) and (thisTime > endTime):
106 106 return 0
107 107 return 1
108 108
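The `else` branch of `isTimeInRange` above handles ranges that wrap past midnight (endTime earlier than startTime belongs to the next day). A compact restatement of the same logic, checked against a night-time range:

```python
import datetime

def is_time_in_range(this_time, start_time, end_time):
    if end_time >= start_time:
        return not (this_time < start_time or this_time > end_time)
    # Wrap-around range, e.g. 22:00 -> 06:00 of the next day
    return not (this_time < start_time and this_time > end_time)

night = (datetime.time(22, 0), datetime.time(6, 0))
in_range = is_time_in_range(datetime.time(1, 0), *night)    # inside the wrap
out_range = is_time_in_range(datetime.time(12, 0), *night)  # outside the wrap
```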
109 109
110 110 def isFileInTimeRange(filename, startDate, endDate, startTime, endTime):
111 111 """
112 112 Returns 1 if the data file lies within the specified time range.
113 113
114 114 Inputs:
115 115 filename : full name of the data file in Jicamarca format (.r)
116 116
117 117 startDate : start date of the selected range, as datetime.date
118 118
119 119 endDate : end date of the selected range, as datetime.date
120 120
121 121 startTime : start time of the selected range, as datetime.time
122 122
123 123 endTime : end time of the selected range, as datetime.time
124 124
125 125 Return:
126 126 Boolean : True if the data file contains data within the
127 127 specified range, otherwise False.
128 128
129 129 Exceptions:
130 130 If the file does not exist or cannot be opened
131 131 If the header cannot be read.
132 132
133 133 """
134 134
135 135 try:
136 136 fp = open(filename, 'rb')
137 137 except IOError:
138 138 print("The file %s can't be opened" % (filename))
139 139 return None
140 140
141 141 firstBasicHeaderObj = BasicHeader(LOCALTIME)
142 142 systemHeaderObj = SystemHeader()
143 143 radarControllerHeaderObj = RadarControllerHeader()
144 144 processingHeaderObj = ProcessingHeader()
145 145
146 146 lastBasicHeaderObj = BasicHeader(LOCALTIME)
147 147
148 148 sts = firstBasicHeaderObj.read(fp)
149 149
150 150 if not(sts):
151 151 print("[Reading] Skipping the file %s because it does not have a valid header" % (filename))
152 152 return None
153 153
154 154 if not systemHeaderObj.read(fp):
155 155 return None
156 156
157 157 if not radarControllerHeaderObj.read(fp):
158 158 return None
159 159
160 160 if not processingHeaderObj.read(fp):
161 161 return None
162 162
163 163 filesize = os.path.getsize(filename)
164 164
165 165 offset = processingHeaderObj.blockSize + 24 # header size
166 166
167 167 if filesize <= offset:
168 168 print("[Reading] %s: This file has not enough data" % filename)
169 169 return None
170 170
171 171 fp.seek(-offset, 2)
172 172
173 173 sts = lastBasicHeaderObj.read(fp)
174 174
175 175 fp.close()
176 176
177 177 thisDatetime = lastBasicHeaderObj.datatime
178 178 thisTime_last_block = thisDatetime.time()
179 179
180 180 thisDatetime = firstBasicHeaderObj.datatime
181 181 thisDate = thisDatetime.date()
182 182 thisTime_first_block = thisDatetime.time()
183 183
184 184 # General case
185 185 # o>>>>>>>>>>>>>><<<<<<<<<<<<<<o
186 186 #-----------o----------------------------o-----------
187 187 # startTime endTime
188 188
189 189 if endTime >= startTime:
190 190 if (thisTime_last_block < startTime) or (thisTime_first_block > endTime):
191 191 return None
192 192
193 193 return thisDatetime
194 194
195 195 # If endTime < startTime then endTime belongs to the next day
196 196
197 197 #<<<<<<<<<<<o o>>>>>>>>>>>
198 198 #-----------o----------------------------o-----------
199 199 # endTime startTime
200 200
201 201 if (thisDate == startDate) and (thisTime_last_block < startTime):
202 202 return None
203 203
204 204 if (thisDate == endDate) and (thisTime_first_block > endTime):
205 205 return None
206 206
207 207 if (thisTime_last_block < startTime) and (thisTime_first_block > endTime):
208 208 return None
209 209
210 210 return thisDatetime
211 211
212 212
213 213 def isFolderInDateRange(folder, startDate=None, endDate=None):
214 214 """
215 215 Returns 1 if the folder lies within the specified date range.
216 216
217 217 Inputs:
218 218 folder : full name of the directory.
219 219 Its format should be "/path_root/?YYYYDDD"
220 220
221 221 where:
222 222 YYYY : year (e.g. 2015)
223 223 DDD : day of year (e.g. 305)
224 224
225 225 startDate : start date of the selected range, as datetime.date
226 226
227 227 endDate : end date of the selected range, as datetime.date
228 228
229 229 Return:
230 230 Boolean : True if the folder contains data within the
231 231 specified date range, otherwise False.
232 232 Exceptions:
233 233 If the directory does not have the expected format
234 234 """
235 235
236 236 basename = os.path.basename(folder)
237 237
238 238 if not isRadarFolder(basename):
239 239 print("The folder %s does not have the right format" % folder)
240 240 return 0
241 241
242 242 if startDate and endDate:
243 243 thisDate = getDateFromRadarFolder(basename)
244 244
245 245 if thisDate < startDate:
246 246 return 0
247 247
248 248 if thisDate > endDate:
249 249 return 0
250 250
251 251 return 1
252 252
253 253
254 254 def isFileInDateRange(filename, startDate=None, endDate=None):
255 255 """
256 256 Returns 1 if the data file lies within the specified date range.
257 257
258 258 Inputs:
259 259 filename : full name of the data file in Jicamarca format (.r)
260 260
261 261 Its format should be "?YYYYDDDsss"
262 262
263 263 where:
264 264 YYYY : year (e.g. 2015)
265 265 DDD : day of year (e.g. 305)
266 266 sss : set
267 267
268 268 startDate : start date of the selected range, as datetime.date
269 269
270 270 endDate : end date of the selected range, as datetime.date
271 271
272 272 Return:
273 273 Boolean : True if the data file contains data within the
274 274 specified date range, otherwise False.
275 275 Exceptions:
276 276 If the file does not have the expected format
277 277 """
278 278
279 279 basename = os.path.basename(filename)
280 280
281 281 if not isRadarFile(basename):
282 282 print("The filename %s does not have the right format" % filename)
283 283 return 0
284 284
285 285 if startDate and endDate:
286 286 thisDate = getDateFromRadarFile(basename)
287 287
288 288 if thisDate < startDate:
289 289 return 0
290 290
291 291 if thisDate > endDate:
292 292 return 0
293 293
294 294 return 1
295 295
296 296
297 297 def getFileFromSet(path, ext, set):
298 298 validFilelist = []
299 299 fileList = os.listdir(path)
300 300
301 301 # 0 1234 567 89A BCDE
302 302 # H YYYY DDD SSS .ext
303 303
304 304 for thisFile in fileList:
305 305 try:
306 306 year = int(thisFile[1:5])
307 307 doy = int(thisFile[5:8])
308 308 except:
309 309 continue
310 310
311 311 if (os.path.splitext(thisFile)[-1].lower() != ext.lower()):
312 312 continue
313 313
314 314 validFilelist.append(thisFile)
315 315
316 316 myfile = fnmatch.filter(
317 317 validFilelist, '*%4.4d%3.3d%3.3d*' % (year, doy, set))
318 318
319 319 if len(myfile) != 0:
320 320 return myfile[0]
321 321 else:
322 322 filename = '*%4.4d%3.3d%3.3d%s' % (year, doy, set, ext.lower())
323 323 print('the filename %s does not exist' % filename)
324 324 print('...going to the last file: ')
325 325
326 326 if validFilelist:
327 327 validFilelist = sorted(validFilelist, key=str.lower)
328 328 return validFilelist[-1]
329 329
330 330 return None
331 331
332 332
333 333 def getlastFileFromPath(path, ext):
334 334 """
335 335 Filters the file list, keeping only names that match the "PYYYYDDDSSS.ext" format,
336 336 and returns the last file of the remaining list.
337 337
338 338 Input:
339 339 path : folder whose files (without path) will be scanned
340 340 ext : extension of the files contained in the folder
341 341
342 342 Return:
343 343 The last file of the given folder, without the path.
344 344 """
345 345 validFilelist = []
346 346 fileList = os.listdir(path)
347 347
348 348 # 0 1234 567 89A BCDE
349 349 # H YYYY DDD SSS .ext
350 350
351 351 for thisFile in fileList:
352 352
353 353 year = thisFile[1:5]
354 354 if not isNumber(year):
355 355 continue
356 356
357 357 doy = thisFile[5:8]
358 358 if not isNumber(doy):
359 359 continue
360 360
361 361 year = int(year)
362 362 doy = int(doy)
363 363
364 364 if (os.path.splitext(thisFile)[-1].lower() != ext.lower()):
365 365 continue
366 366
367 367 validFilelist.append(thisFile)
368 368
369 369 if validFilelist:
370 370 validFilelist = sorted(validFilelist, key=str.lower)
371 371 return validFilelist[-1]
372 372
373 373 return None
374 374
375 375
376 376 def isRadarFolder(folder):
377 377 try:
378 378 year = int(folder[1:5])
379 379 doy = int(folder[5:8])
380 380 except:
381 381 return 0
382 382
383 383 return 1
384 384
385 385
386 386 def isRadarFile(file):
387 387 try:
388 388 year = int(file[1:5])
389 389 doy = int(file[5:8])
390 390 set = int(file[8:11])
391 391 except:
392 392 return 0
393 393
394 394 return 1
395 395
396 396
397 397 def getDateFromRadarFile(file):
398 398 try:
399 399 year = int(file[1:5])
400 400 doy = int(file[5:8])
401 401 set = int(file[8:11])
402 402 except:
403 403 return None
404 404
405 405 thisDate = datetime.date(year, 1, 1) + datetime.timedelta(doy - 1)
406 406 return thisDate
407 407
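The `?YYYYDDD` decoding used by `getDateFromRadarFile` and `getDateFromRadarFolder` can be exercised on its own; this sketch bundles the slicing and the day-of-year arithmetic (helper name ours):

```python
import datetime

def date_from_radar_name(name):
    """Decode the ?YYYYDDD prefix used by Jicamarca file/folder names."""
    year = int(name[1:5])
    doy = int(name[5:8])
    return datetime.date(year, 1, 1) + datetime.timedelta(doy - 1)

d = date_from_radar_name('d2021305001.r')  # day 305 of 2021
```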
408 408
409 409 def getDateFromRadarFolder(folder):
410 410 try:
411 411 year = int(folder[1:5])
412 412 doy = int(folder[5:8])
413 413 except:
414 414 return None
415 415
416 416 thisDate = datetime.date(year, 1, 1) + datetime.timedelta(doy - 1)
417 417 return thisDate
418 418
419 419 def parse_format(s, fmt):
420 420
421 421 for i in range(fmt.count('%')):
422 422 x = fmt.index('%')
423 423 d = DT_DIRECTIVES[fmt[x:x+2]]
424 424 fmt = fmt.replace(fmt[x:x+2], s[x:x+d])
425 425 return fmt
426 426
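`parse_format` above replaces each strptime directive in `fmt` with the fixed-width slice of the input string at the same position, which effectively trims any trailing characters so the result can be fed back to `strptime`. Exercising it (the `DT_DIRECTIVES` table is copied from above):

```python
import datetime

DT_DIRECTIVES = {'%Y': 4, '%y': 2, '%m': 2, '%d': 2, '%j': 3,
                 '%H': 2, '%M': 2, '%S': 2, '%f': 6}

def parse_format(s, fmt):
    # Replace each directive with the same-width slice of `s`
    for _ in range(fmt.count('%')):
        x = fmt.index('%')
        d = DT_DIRECTIVES[fmt[x:x + 2]]
        fmt = fmt.replace(fmt[x:x + 2], s[x:x + d])
    return fmt

trimmed = parse_format('d2021305_online', 'd%Y%j')        # 'd2021305'
dt = datetime.datetime.strptime(trimmed, 'd%Y%j').date()  # 2021-11-01
```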
427 427 class Reader(object):
428 428
429 429 c = 3E8
430 430 isConfig = False
431 431 dtype = None
432 432 pathList = []
433 433 filenameList = []
434 434 datetimeList = []
435 435 filename = None
436 436 ext = None
437 437 flagIsNewFile = 1
438 438 flagDiscontinuousBlock = 0
439 439 flagIsNewBlock = 0
440 440 flagNoMoreFiles = 0
441 441 fp = None
442 442 firstHeaderSize = 0
443 443 basicHeaderSize = 24
444 444 versionFile = 1103
445 445 fileSize = None
446 446 fileSizeByHeader = None
447 447 fileIndex = -1
448 448 profileIndex = None
449 449 blockIndex = 0
450 450 nTotalBlocks = 0
451 451 maxTimeStep = 30
452 452 lastUTTime = None
453 453 datablock = None
454 454 dataOut = None
455 455 getByBlock = False
456 456 path = None
457 457 startDate = None
458 458 endDate = None
459 459 startTime = datetime.time(0, 0, 0)
460 460 endTime = datetime.time(23, 59, 59)
461 461 set = None
462 462 expLabel = ""
463 463 online = False
464 464 delay = 60
465 465 nTries = 3 # number of retries
466 466 nFiles = 3 # number of files for searching
467 467 walk = True
468 468 getblock = False
469 469 nTxs = 1
470 470 realtime = False
471 471 blocksize = 0
472 472 blocktime = None
473 473 warnings = True
474 474 verbose = True
475 475 server = None
476 476 format = None
477 477 oneDDict = None
478 478 twoDDict = None
479 479 independentParam = None
480 480 filefmt = None
481 481 folderfmt = None
482 482 open_file = open
483 483 open_mode = 'rb'
484 484
485 485 def run(self):
486 486
487 487 raise NotImplementedError
488 488
489 489 def getAllowedArgs(self):
490 490 if hasattr(self, '__attrs__'):
491 491 return self.__attrs__
492 492 else:
493 493 return inspect.getargspec(self.run).args
494 494
495 495 def set_kwargs(self, **kwargs):
496 496
497 497 for key, value in kwargs.items():
498 498 setattr(self, key, value)
499 499
500 500 def find_folders(self, path, startDate, endDate, folderfmt, last=False):
501 501
502 502 folders = [x for f in path.split(',')
503 503 for x in os.listdir(f) if os.path.isdir(os.path.join(f, x))]
504 504 folders.sort()
505 505
506 506 if last:
507 507 folders = [folders[-1]]
508 508
509 509 for folder in folders:
510 510 try:
511 511 dt = datetime.datetime.strptime(parse_format(folder, folderfmt), folderfmt).date()
512 512 if dt >= startDate and dt <= endDate:
513 513 yield os.path.join(path, folder)
514 514 else:
515 515 log.log('Skipping folder {}'.format(folder), self.name)
516 516 except Exception as e:
517 517 log.log('Skipping folder {}'.format(folder), self.name)
518 518 continue
519 519 return
520 520
521 521 def find_files(self, folders, ext, filefmt, startDate=None, endDate=None,
522 522 expLabel='', last=False):
523 523
524 524 for path in folders:
525 525 files = glob.glob1(path, '*{}'.format(ext))
526 526 files.sort()
527 527 if last:
528 528 if files:
529 529 fo = files[-1]
530 530 try:
531 531 dt = datetime.datetime.strptime(parse_format(fo, filefmt), filefmt).date()
532 532 yield os.path.join(path, expLabel, fo)
533 533 except Exception as e:
534 534 pass
535 535 return
536 536 else:
537 537 return
538 538
539 539 for fo in files:
540 540 try:
541 541 dt = datetime.datetime.strptime(parse_format(fo, filefmt), filefmt).date()
542 542 if dt >= startDate and dt <= endDate:
543 543 yield os.path.join(path, expLabel, fo)
544 544 else:
545 545 log.log('Skipping file {}'.format(fo), self.name)
546 546 except Exception as e:
547 547 log.log('Skipping file {}'.format(fo), self.name)
548 548 continue
549 549
550 550 def searchFilesOffLine(self, path, startDate, endDate,
551 551 expLabel, ext, walk,
552 552 filefmt, folderfmt):
553 553 """Search files in offline mode for the given arguments
554 554
555 555 Return:
556 556 Generator of files
557 557 """
558 558
559 559 if walk:
560 560 folders = self.find_folders(
561 561 path, startDate, endDate, folderfmt)
562 562 else:
563 563 folders = path.split(',')
564 564
565 565 return self.find_files(
566 566 folders, ext, filefmt, startDate, endDate, expLabel)
567 567
568 568 def searchFilesOnLine(self, path, startDate, endDate,
569 569 expLabel, ext, walk,
570 570 filefmt, folderfmt):
571 571 """Search for the last file of the last folder
572 572
573 573 Arguments:
574 574 path : folder that contains the data files
575 575 expLabel : name of the sub-experiment (subfolder)
576 576 ext : file extension
577 577 walk : if enabled, search inside the date subfolders (doypath)
578 578
579 579 Return:
580 580 generator with the full path of last filename
581 581 """
582 582
583 583 if walk:
584 584 folders = self.find_folders(
585 585 path, startDate, endDate, folderfmt, last=True)
586 586 else:
587 587 folders = path.split(',')
588 588
589 589 return self.find_files(
590 590 folders, ext, filefmt, startDate, endDate, expLabel, last=True)
591 591
592 592 def setNextFile(self):
593 593 """Set the next file to be read, open it and parse the file header"""
594 594
595 595 while True:
596 596 if self.fp != None:
597 597 self.fp.close()
598 598
599 599 if self.online:
600 600 newFile = self.setNextFileOnline()
601 601 else:
602 602 newFile = self.setNextFileOffline()
603 603
604 604 if not(newFile):
605 605 if self.online:
606 606 raise schainpy.admin.SchainError('Time to wait for new files reached')
607 607 else:
608 608 if self.fileIndex == -1:
609 609 raise schainpy.admin.SchainWarning('No files found in the given path')
610 610 else:
611 611 raise schainpy.admin.SchainWarning('No more files to read')
612 612
613 613 if self.verifyFile(self.filename):
614 614 break
615 615
616 616 log.log('Opening file: %s' % self.filename, self.name)
617 617
618 618 self.readFirstHeader()
619 619 self.nReadBlocks = 0
620 620
621 621 def setNextFileOnline(self):
622 622 """Check for the next file to be read in online mode.
623 623
624 624 Set:
625 625 self.filename
626 626 self.fp
627 627 self.filesize
628 628
629 629 Return:
630 630 boolean
631 631
632 632 """
633 633 nextFile = True
634 634 nextDay = False
635 635
636 636 for nFiles in range(self.nFiles+1):
637 637 for nTries in range(self.nTries):
638 638 fullfilename, filename = self.checkForRealPath(nextFile, nextDay)
639 639 if fullfilename is not None:
640 640 break
641 641 log.warning(
642 642 "Waiting %0.2f sec for the next file: \"%s\" , try %02d ..." % (self.delay, filename, nTries + 1),
643 643 self.name)
644 644 time.sleep(self.delay)
645 645 nextFile = False
646 646 continue
647 647
648 648 if fullfilename is not None:
649 649 break
650 650
651 651 self.nTries = 1
652 652 nextFile = True
653 653
654 654 if nFiles == (self.nFiles - 1):
655 655 log.log('Trying with next day...', self.name)
656 656 nextDay = True
657 657 self.nTries = 3
658 658
659 659 if fullfilename:
660 660 self.fileSize = os.path.getsize(fullfilename)
661 661 self.filename = fullfilename
662 662 self.flagIsNewFile = 1
663 663 if self.fp != None:
664 664 self.fp.close()
665 665 self.fp = self.open_file(fullfilename, self.open_mode)
666 666 self.flagNoMoreFiles = 0
667 667 self.fileIndex += 1
668 668 return 1
669 669 else:
670 670 return 0
671 671
672 672 def setNextFileOffline(self):
673 673 """Open the next file to be read in offline mode"""
674 674
675 675 try:
676 676 filename = next(self.filenameList)
677 677 self.fileIndex +=1
678 678 except StopIteration:
679 679 self.flagNoMoreFiles = 1
680 680 return 0
681 681
682 682 self.filename = filename
683 683 self.fileSize = os.path.getsize(filename)
684 684 self.fp = self.open_file(filename, self.open_mode)
685 685 self.flagIsNewFile = 1
686 686
687 687 return 1
688 688
689 689 @staticmethod
690 690 def isDateTimeInRange(dt, startDate, endDate, startTime, endTime):
691 691 """Check if the given datetime is in range"""
692 692
693 693 if startDate <= dt.date() <= endDate:
694 694 if startTime <= dt.time() <= endTime:
695 695 return True
696 696 return False
697 697
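The date and time windows in `isDateTimeInRange` are checked independently, so a datetime passes only when both its date and its time-of-day fall inside their respective ranges. A self-contained sketch of the same logic:

```python
import datetime

def is_datetime_in_range(dt, startDate, endDate, startTime, endTime):
    # chained comparisons: the date window first, then the time-of-day window
    if startDate <= dt.date() <= endDate:
        if startTime <= dt.time() <= endTime:
            return True
    return False

dt = datetime.datetime(2009, 11, 3, 14, 30)
print(is_datetime_in_range(dt,
                           datetime.date(2009, 11, 1), datetime.date(2009, 11, 30),
                           datetime.time(8, 0), datetime.time(18, 0)))  # True
```

Because the time window is checked independently of the date, an overnight window such as 22:00 to 02:00 matches nothing; `startTime` must be earlier than `endTime`.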
698 698 def verifyFile(self, filename):
699 699 """Check for a valid file
700 700
701 701 Arguments:
702 702 filename -- full path filename
703 703
704 704 Return:
705 705 boolean
706 706 """
707 707
708 708 return True
709 709
710 710 def checkForRealPath(self, nextFile, nextDay):
711 711 """Check if the next file to be read exists"""
712 712
713 713 raise NotImplementedError
714 714
715 715 def readFirstHeader(self):
716 716 """Parse the file header"""
717 717
718 718 pass
719 719
720 720 class JRODataReader(Reader):
721 721
722 722 utc = 0
723 723 nReadBlocks = 0
724 724 foldercounter = 0
725 725 firstHeaderSize = 0
726 726 basicHeaderSize = 24
727 727 __isFirstTimeOnline = 1
728 728 filefmt = "*%Y%j***"
729 729 folderfmt = "*%Y%j"
730 730 __attrs__ = ['path', 'startDate', 'endDate', 'startTime', 'endTime', 'online', 'delay', 'walk']
731 731
732 732 def getDtypeWidth(self):
733 733
734 734 dtype_index = get_dtype_index(self.dtype)
735 735 dtype_width = get_dtype_width(dtype_index)
736 736
737 737 return dtype_width
738 738
739 739 def checkForRealPath(self, nextFile, nextDay):
740 740 """Check if the next file to be read exists.
741 741
742 742 Example:
743 743 the correct filename is .../.../D2009307/P2009307367.ext
744 744
745 745 The method then tries the following combinations:
746 746 .../.../y2009307367.ext
747 747 .../.../Y2009307367.ext
748 748 .../.../x2009307/y2009307367.ext
749 749 .../.../x2009307/Y2009307367.ext
750 750 .../.../X2009307/y2009307367.ext
751 751 .../.../X2009307/Y2009307367.ext
752 752 for this case, the last letter combination matches the file being searched for
753 753
754 754 Return:
755 755 str -- fullpath of the file
756 756 """
757 757
758 758
759 759 if nextFile:
760 760 self.set += 1
761 761 if nextDay:
762 762 self.set = 0
763 763 self.doy += 1
764 764 foldercounter = 0
765 765 prefixDirList = [None, 'd', 'D']
766 766 if self.ext.lower() == ".r": # voltage
767 767 prefixFileList = ['d', 'D']
768 768 elif self.ext.lower() == ".pdata": # spectra
769 769 prefixFileList = ['p', 'P']
770 770
771 771 # sweep over the possible prefix combinations
772 772 for prefixDir in prefixDirList:
773 773 thispath = self.path
774 774 if prefixDir != None:
775 775 # build the folder name xYYYYDDD (x=d or x=D)
776 776 if foldercounter == 0:
777 777 thispath = os.path.join(self.path, "%s%04d%03d" %
778 778 (prefixDir, self.year, self.doy))
779 779 else:
780 780 thispath = os.path.join(self.path, "%s%04d%03d_%02d" % (
781 781 prefixDir, self.year, self.doy, foldercounter))
782 782 for prefixFile in prefixFileList: # sweep over the two possible case variants
783 783 # build the filename xYYYYDDDSSS.ext
784 784 filename = "%s%04d%03d%03d%s" % (prefixFile, self.year, self.doy, self.set, self.ext)
785 785 fullfilename = os.path.join(
786 786 thispath, filename)
787 787
788 788 if os.path.exists(fullfilename):
789 789 return fullfilename, filename
790 790
791 791 return None, filename
792 792
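`checkForRealPath` probes upper/lower-case folder and file prefix combinations until a candidate path exists on disk. A runnable sketch of the candidate-name generation (the `dYYYYDDD` folder and `xYYYYDDDSSS.ext` filename conventions are taken from the docstring above; the function name is illustrative):

```python
import os

def candidate_paths(base, year, doy, fileset, ext='.r'):
    """Yield every folder/file prefix combination that could hold the file."""
    # '.r' (voltage) files use d/D prefixes, '.pdata' (spectra) use p/P
    prefix_files = ['d', 'D'] if ext.lower() == '.r' else ['p', 'P']
    for prefix_dir in (None, 'd', 'D'):
        folder = base if prefix_dir is None else os.path.join(
            base, '%s%04d%03d' % (prefix_dir, year, doy))
        for prefix_file in prefix_files:
            yield os.path.join(folder, '%s%04d%03d%03d%s' % (
                prefix_file, year, doy, fileset, ext))

paths = list(candidate_paths('/data', 2009, 307, 367))
print(paths[0])  # /data/d2009307367.r
```

The real method returns the first candidate for which `os.path.exists` is true, or `None` when the sweep is exhausted.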
793 793 def __waitNewBlock(self):
794 794 """
795 795 Return 1 if a new data block was found, 0 otherwise.
796 796
797 797 In offline mode it always returns 0.
798 798 """
799 799 if not self.online:
800 800 return 0
801 801
802 802 if (self.nReadBlocks >= self.processingHeaderObj.dataBlocksPerFile):
803 803 return 0
804 804
805 805 currentPointer = self.fp.tell()
806 806
807 807 neededSize = self.processingHeaderObj.blockSize + self.basicHeaderSize
808 808
809 809 for nTries in range(self.nTries):
810 810
811 811 self.fp.close()
812 812 self.fp = open(self.filename, 'rb')
813 813 self.fp.seek(currentPointer)
814 814
815 815 self.fileSize = os.path.getsize(self.filename)
816 816 currentSize = self.fileSize - currentPointer
817 817
818 818 if (currentSize >= neededSize):
819 819 self.basicHeaderObj.read(self.fp)
820 820 return 1
821 821
822 822 if self.fileSize == self.fileSizeByHeader:
823 823 # self.flagEoF = True
824 824 return 0
825 825
826 826 print("[Reading] Waiting %0.2f seconds for the next block, try %03d ..." % (self.delay, nTries + 1))
827 827 time.sleep(self.delay)
828 828
829 829 return 0
830 830
831 831 def waitDataBlock(self, pointer_location, blocksize=None):
832 832
833 833 currentPointer = pointer_location
834 834 if blocksize is None:
835 835 neededSize = self.processingHeaderObj.blockSize # + self.basicHeaderSize
836 836 else:
837 837 neededSize = blocksize
838 838
839 839 for nTries in range(self.nTries):
840 840 self.fp.close()
841 841 self.fp = open(self.filename, 'rb')
842 842 self.fp.seek(currentPointer)
843 843
844 844 self.fileSize = os.path.getsize(self.filename)
845 845 currentSize = self.fileSize - currentPointer
846 846
847 847 if (currentSize >= neededSize):
848 848 return 1
849 849
850 850 log.warning(
851 851 "Waiting %0.2f seconds for the next block, try %03d ..." % (self.delay, nTries + 1),
852 852 self.name
853 853 )
854 854 time.sleep(self.delay)
855 855
856 856 return 0
857 857
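Both `__waitNewBlock` and `waitDataBlock` follow the same polling pattern: re-stat the file, compare the bytes available past the read pointer against the bytes needed, and sleep between retries. A generic sketch of that retry loop (the file size is supplied by a callable so the sketch can run without real files):

```python
import time

def wait_for_bytes(get_size, pointer, needed, tries=3, delay=0.0):
    """Return True once get_size() - pointer >= needed, retrying up to `tries` times."""
    for attempt in range(tries):
        if get_size() - pointer >= needed:
            return True
        time.sleep(delay)  # in the reader this is self.delay seconds
    return False

# simulate a file that grows by 100 bytes per poll
sizes = iter([100, 200, 300])
print(wait_for_bytes(lambda: next(sizes), pointer=0, needed=250))  # True
```

In the reader, `needed` is `processingHeaderObj.blockSize` (plus `basicHeaderSize` in `__waitNewBlock`), and the file is reopened on each retry so the size reflects data appended by the acquisition process.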
858 858 def __setNewBlock(self):
859 859
860 860 if self.fp == None:
861 861 return 0
862 862
863 863 if self.flagIsNewFile:
864 864 self.lastUTTime = self.basicHeaderObj.utc
865 865 return 1
866 866
867 867 if self.realtime:
868 868 self.flagDiscontinuousBlock = 1
869 869 if not(self.setNextFile()):
870 870 return 0
871 871 else:
872 872 return 1
873 873
874 874 currentSize = self.fileSize - self.fp.tell()
875 875 neededSize = self.processingHeaderObj.blockSize + self.basicHeaderSize
876 876
877 877 if (currentSize >= neededSize):
878 878 self.basicHeaderObj.read(self.fp)
879 879 self.lastUTTime = self.basicHeaderObj.utc
880 880 return 1
881 881
882 882 if self.__waitNewBlock():
883 883 self.lastUTTime = self.basicHeaderObj.utc
884 884 return 1
885 885
886 886 if not(self.setNextFile()):
887 887 return 0
888 888
889 889 deltaTime = self.basicHeaderObj.utc - self.lastUTTime
890 890 self.lastUTTime = self.basicHeaderObj.utc
891 891
892 892 self.flagDiscontinuousBlock = 0
893 893
894 894 if deltaTime > self.maxTimeStep:
895 895 self.flagDiscontinuousBlock = 1
896 896
897 897 return 1
898 898
899 899 def readNextBlock(self):
900 900
901 901 while True:
902 self.__setNewBlock()
902 if not(self.__setNewBlock()):
903 continue
903 904
904 905 if not(self.readBlock()):
905 906 return 0
906 907
907 908 self.getBasicHeader()
908 909
909 910 if not self.isDateTimeInRange(self.dataOut.datatime, self.startDate, self.endDate, self.startTime, self.endTime):
910 911 print("[Reading] Block No. %d/%d -> %s [Skipping]" % (self.nReadBlocks,
911 912 self.processingHeaderObj.dataBlocksPerFile,
912 913 self.dataOut.datatime.ctime()))
913 914 continue
914 915
915 916 break
916 917
917 918 if self.verbose:
918 919 print("[Reading] Block No. %d/%d -> %s" % (self.nReadBlocks,
919 920 self.processingHeaderObj.dataBlocksPerFile,
920 921 self.dataOut.datatime.ctime()))
921 922 return 1
922 923
923 924 def readFirstHeader(self):
924 925
925 926 self.basicHeaderObj.read(self.fp)
926 927 self.systemHeaderObj.read(self.fp)
927 928 self.radarControllerHeaderObj.read(self.fp)
928 929 self.processingHeaderObj.read(self.fp)
929 930 self.firstHeaderSize = self.basicHeaderObj.size
930 931
931 932 datatype = int(numpy.log2((self.processingHeaderObj.processFlags &
932 933 PROCFLAG.DATATYPE_MASK)) - numpy.log2(PROCFLAG.DATATYPE_CHAR))
933 934 if datatype == 0:
934 935 datatype_str = numpy.dtype([('real', '<i1'), ('imag', '<i1')])
935 936 elif datatype == 1:
936 937 datatype_str = numpy.dtype([('real', '<i2'), ('imag', '<i2')])
937 938 elif datatype == 2:
938 939 datatype_str = numpy.dtype([('real', '<i4'), ('imag', '<i4')])
939 940 elif datatype == 3:
940 941 datatype_str = numpy.dtype([('real', '<i8'), ('imag', '<i8')])
941 942 elif datatype == 4:
942 943 datatype_str = numpy.dtype([('real', '<f4'), ('imag', '<f4')])
943 944 elif datatype == 5:
944 945 datatype_str = numpy.dtype([('real', '<f8'), ('imag', '<f8')])
945 946 else:
946 947 raise ValueError('Data type was not defined')
947 948
948 949 self.dtype = datatype_str
949 950 #self.ippSeconds = 2 * 1000 * self.radarControllerHeaderObj.ipp / self.c
950 951 self.fileSizeByHeader = self.processingHeaderObj.dataBlocksPerFile * self.processingHeaderObj.blockSize + \
951 952 self.firstHeaderSize + self.basicHeaderSize * \
952 953 (self.processingHeaderObj.dataBlocksPerFile - 1)
953 954 # self.dataOut.channelList = numpy.arange(self.systemHeaderObj.numChannels)
954 955 # self.dataOut.channelIndexList = numpy.arange(self.systemHeaderObj.numChannels)
955 956 self.getBlockDimension()
956 957
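The datatype index decoded from `processFlags` in `readFirstHeader` selects one of six complex sample dtypes, each a `(real, imag)` pair of the same width. The `elif` chain can be summarized as a lookup table (a sketch; widths are per component, in bytes):

```python
# datatype code -> (little-endian field type, bytes per component)
DATATYPES = {
    0: ('<i1', 1),  # int8
    1: ('<i2', 2),  # int16
    2: ('<i4', 4),  # int32
    3: ('<i8', 8),  # int64
    4: ('<f4', 4),  # float32
    5: ('<f8', 8),  # float64
}

def dtype_for(code):
    if code not in DATATYPES:
        raise ValueError('Data type was not defined')
    kind, width = DATATYPES[code]
    # each complex sample is a (real, imag) pair: 2 * width bytes total
    return [('real', kind), ('imag', kind)], 2 * width

print(dtype_for(4))  # ([('real', '<f4'), ('imag', '<f4')], 8)
```

The reader builds the same pairs as `numpy.dtype` objects; the total sample width is what `fileSizeByHeader` and the block-size arithmetic below depend on.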
957 def verifyFile(self, filename, msgFlag=True):
958 def verifyFile(self, filename):
958 959
959 msg = None
960 flag = True
960 961
961 962 try:
962 963 fp = open(filename, 'rb')
963 964 except IOError:
964
965 if msgFlag:
966 print("[Reading] File %s can't be opened" % (filename))
967
965 log.error("File {} can't be opened".format(filename), self.name)
968 966 return False
969 967
970 if self.waitDataBlock(0):
971 basicHeaderObj = BasicHeader(LOCALTIME)
972 systemHeaderObj = SystemHeader()
973 radarControllerHeaderObj = RadarControllerHeader()
974 processingHeaderObj = ProcessingHeader()
975
976 if not(basicHeaderObj.read(fp)):
977 fp.close()
978 return False
979
980 if not(systemHeaderObj.read(fp)):
981 fp.close()
982 return False
983
984 if not(radarControllerHeaderObj.read(fp)):
985 fp.close()
986 return False
987
988 if not(processingHeaderObj.read(fp)):
989 fp.close()
990 return False
991
992 if not self.online:
993 dt1 = basicHeaderObj.datatime
994 fp.seek(self.fileSize-processingHeaderObj.blockSize-24)
968 if self.online and self.waitDataBlock(0):
969 pass
970
971 basicHeaderObj = BasicHeader(LOCALTIME)
972 systemHeaderObj = SystemHeader()
973 radarControllerHeaderObj = RadarControllerHeader()
974 processingHeaderObj = ProcessingHeader()
975
976 if not(basicHeaderObj.read(fp)):
977 flag = False
978 if not(systemHeaderObj.read(fp)):
979 flag = False
980 if not(radarControllerHeaderObj.read(fp)):
981 flag = False
982 if not(processingHeaderObj.read(fp)):
983 flag = False
984 if not self.online:
985 dt1 = basicHeaderObj.datatime
986 pos = self.fileSize-processingHeaderObj.blockSize-24
987 if pos<0:
988 flag = False
989 log.error('Invalid size for file: {}'.format(self.filename), self.name)
990 else:
991 fp.seek(pos)
995 992 if not(basicHeaderObj.read(fp)):
996 fp.close()
997 return False
998 dt2 = basicHeaderObj.datatime
999 if not self.isDateTimeInRange(dt1, self.startDate, self.endDate, self.startTime, self.endTime) and not \
1000 self.isDateTimeInRange(dt2, self.startDate, self.endDate, self.startTime, self.endTime):
1001 return False
993 flag = False
994 dt2 = basicHeaderObj.datatime
995 if not self.isDateTimeInRange(dt1, self.startDate, self.endDate, self.startTime, self.endTime) and not \
996 self.isDateTimeInRange(dt2, self.startDate, self.endDate, self.startTime, self.endTime):
997 flag = False
1002 998
1003 999 fp.close()
1004
1005 return True
1000 return flag
1006 1001
1007 1002 def findDatafiles(self, path, startDate=None, endDate=None, expLabel='', ext='.r', walk=True, include_path=False):
1008 1003
1009 1004 path_empty = True
1010 1005
1011 1006 dateList = []
1012 1007 pathList = []
1013 1008
1014 1009 multi_path = path.split(',')
1015 1010
1016 1011 if not walk:
1017 1012
1018 1013 for single_path in multi_path:
1019 1014
1020 1015 if not os.path.isdir(single_path):
1021 1016 continue
1022 1017
1023 1018 fileList = glob.glob1(single_path, "*" + ext)
1024 1019
1025 1020 if not fileList:
1026 1021 continue
1027 1022
1028 1023 path_empty = False
1029 1024
1030 1025 fileList.sort()
1031 1026
1032 1027 for thisFile in fileList:
1033 1028
1034 1029 if not os.path.isfile(os.path.join(single_path, thisFile)):
1035 1030 continue
1036 1031
1037 1032 if not isRadarFile(thisFile):
1038 1033 continue
1039 1034
1040 1035 if not isFileInDateRange(thisFile, startDate, endDate):
1041 1036 continue
1042 1037
1043 1038 thisDate = getDateFromRadarFile(thisFile)
1044 1039
1045 1040 if thisDate in dateList or single_path in pathList:
1046 1041 continue
1047 1042
1048 1043 dateList.append(thisDate)
1049 1044 pathList.append(single_path)
1050 1045
1051 1046 else:
1052 1047 for single_path in multi_path:
1053 1048
1054 1049 if not os.path.isdir(single_path):
1055 1050 continue
1056 1051
1057 1052 dirList = []
1058 1053
1059 1054 for thisPath in os.listdir(single_path):
1060 1055
1061 1056 if not os.path.isdir(os.path.join(single_path, thisPath)):
1062 1057 continue
1063 1058
1064 1059 if not isRadarFolder(thisPath):
1065 1060 continue
1066 1061
1067 1062 if not isFolderInDateRange(thisPath, startDate, endDate):
1068 1063 continue
1069 1064
1070 1065 dirList.append(thisPath)
1071 1066
1072 1067 if not dirList:
1073 1068 continue
1074 1069
1075 1070 dirList.sort()
1076 1071
1077 1072 for thisDir in dirList:
1078 1073
1079 1074 datapath = os.path.join(single_path, thisDir, expLabel)
1080 1075 fileList = glob.glob1(datapath, "*" + ext)
1081 1076
1082 1077 if not fileList:
1083 1078 continue
1084 1079
1085 1080 path_empty = False
1086 1081
1087 1082 thisDate = getDateFromRadarFolder(thisDir)
1088 1083
1089 1084 pathList.append(datapath)
1090 1085 dateList.append(thisDate)
1091 1086
1092 1087 dateList.sort()
1093 1088
1094 1089 if walk:
1095 1090 pattern_path = os.path.join(multi_path[0], "[dYYYYDDD]", expLabel)
1096 1091 else:
1097 1092 pattern_path = multi_path[0]
1098 1093
1099 1094 if path_empty:
1100 1095 raise schainpy.admin.SchainError("[Reading] No *%s files in %s for %s to %s" % (ext, pattern_path, startDate, endDate))
1101 1096 else:
1102 1097 if not dateList:
1103 1098 raise schainpy.admin.SchainError("[Reading] Date range selected invalid [%s - %s]: No *%s files in %s)" % (startDate, endDate, ext, path))
1104 1099
1105 1100 if include_path:
1106 1101 return dateList, pathList
1107 1102
1108 1103 return dateList
1109 1104
1110 1105 def setup(self, **kwargs):
1111 1106
1112 1107 self.set_kwargs(**kwargs)
1113 1108 if not self.ext.startswith('.'):
1114 1109 self.ext = '.{}'.format(self.ext)
1115 1110
1116 1111 if self.server is not None:
1117 1112 if 'tcp://' in self.server:
1118 1113 address = self.server
1119 1114 else:
1120 1115 address = 'ipc:///tmp/%s' % self.server
1121 1116 self.server = address
1122 1117 self.context = zmq.Context()
1123 1118 self.receiver = self.context.socket(zmq.PULL)
1124 1119 self.receiver.connect(self.server)
1125 1120 time.sleep(0.5)
1126 1121 print('[Starting] ReceiverData from {}'.format(self.server))
1127 1122 else:
1128 1123 self.server = None
1129 1124 if self.path == None:
1130 1125 raise ValueError("[Reading] The path is not valid")
1131 1126
1132 1127 if self.online:
1133 1128 log.log("[Reading] Searching files in online mode...", self.name)
1134 1129
1135 1130 for nTries in range(self.nTries):
1136 1131 fullpath = self.searchFilesOnLine(self.path, self.startDate,
1137 1132 self.endDate, self.expLabel, self.ext, self.walk,
1138 1133 self.filefmt, self.folderfmt)
1139 1134
1140 1135 try:
1141 1136 fullpath = next(fullpath)
1142 1137 except:
1143 1138 fullpath = None
1144 1139
1145 1140 if fullpath:
1146 1141 break
1147 1142
1148 1143 log.warning(
1149 1144 'Waiting {} sec for a valid file in {}: try {} ...'.format(
1150 1145 self.delay, self.path, nTries + 1),
1151 1146 self.name)
1152 1147 time.sleep(self.delay)
1153 1148
1154 1149 if not(fullpath):
1155 1150 raise schainpy.admin.SchainError(
1156 1151 'There isn\'t any valid file in {}'.format(self.path))
1157 1152
1158 1153 pathname, filename = os.path.split(fullpath)
1159 1154 self.year = int(filename[1:5])
1160 1155 self.doy = int(filename[5:8])
1161 1156 self.set = int(filename[8:11]) - 1
1162 1157 else:
1163 1158 log.log("Searching files in {}".format(self.path), self.name)
1164 1159 self.filenameList = self.searchFilesOffLine(self.path, self.startDate,
1165 1160 self.endDate, self.expLabel, self.ext, self.walk, self.filefmt, self.folderfmt)
1166 1161
1167 1162 self.setNextFile()
1168 1163
1169 1164 return
1170 1165
1171 1166 def getBasicHeader(self):
1172 1167
1173 1168 self.dataOut.utctime = self.basicHeaderObj.utc + self.basicHeaderObj.miliSecond / \
1174 1169 1000. + self.profileIndex * self.radarControllerHeaderObj.ippSeconds
1175 1170
1176 1171 self.dataOut.flagDiscontinuousBlock = self.flagDiscontinuousBlock
1177 1172
1178 1173 self.dataOut.timeZone = self.basicHeaderObj.timeZone
1179 1174
1180 1175 self.dataOut.dstFlag = self.basicHeaderObj.dstFlag
1181 1176
1182 1177 self.dataOut.errorCount = self.basicHeaderObj.errorCount
1183 1178
1184 1179 self.dataOut.useLocalTime = self.basicHeaderObj.useLocalTime
1185 1180
1186 1181 self.dataOut.ippSeconds = self.radarControllerHeaderObj.ippSeconds / self.nTxs
1187 1182
1188 1183 # self.dataOut.nProfiles = self.processingHeaderObj.profilesPerBlock*self.nTxs
1189 1184
1190 1185 def getFirstHeader(self):
1191 1186
1192 1187 raise NotImplementedError
1193 1188
1194 1189 def getData(self):
1195 1190
1196 1191 raise NotImplementedError
1197 1192
1198 1193 def hasNotDataInBuffer(self):
1199 1194
1200 1195 raise NotImplementedError
1201 1196
1202 1197 def readBlock(self):
1203 1198
1204 1199 raise NotImplementedError
1205 1200
1206 1201 def isEndProcess(self):
1207 1202
1208 1203 return self.flagNoMoreFiles
1209 1204
1210 1205 def printReadBlocks(self):
1211 1206
1212 1207 print("[Reading] Number of read blocks per file %04d" % self.nReadBlocks)
1213 1208
1214 1209 def printTotalBlocks(self):
1215 1210
1216 1211 print("[Reading] Number of read blocks %04d" % self.nTotalBlocks)
1217 1212
1218 1213 def run(self, **kwargs):
1219 1214 """
1220 1215
1221 1216 Arguments:
1222 1217 path :
1223 1218 startDate :
1224 1219 endDate :
1225 1220 startTime :
1226 1221 endTime :
1227 1222 set :
1228 1223 expLabel :
1229 1224 ext :
1230 1225 online :
1231 1226 delay :
1232 1227 walk :
1233 1228 getblock :
1234 1229 nTxs :
1235 1230 realtime :
1236 1231 blocksize :
1237 1232 blocktime :
1238 1233 skip :
1239 1234 cursor :
1240 1235 warnings :
1241 1236 server :
1242 1237 verbose :
1243 1238 format :
1244 1239 oneDDict :
1245 1240 twoDDict :
1246 1241 independentParam :
1247 1242 """
1248 1243
1249 1244 if not(self.isConfig):
1250 1245 self.setup(**kwargs)
1251 1246 self.isConfig = True
1252 1247 if self.server is None:
1253 1248 self.getData()
1254 1249 else:
1255 1250 self.getFromServer()
1256 1251
1257 1252
1258 1253 class JRODataWriter(Reader):
1259 1254
1260 1255 """
1261 1256 This class writes data to processed files (.r or .pdata). Data is always
1262 1257 written in blocks.
1263 1258 """
1264 1259
1265 1260 setFile = None
1266 1261 profilesPerBlock = None
1267 1262 blocksPerFile = None
1268 1263 nWriteBlocks = 0
1269 1264 fileDate = None
1270 1265
1271 1266 def __init__(self, dataOut=None):
1272 1267 raise NotImplementedError
1273 1268
1274 1269 def hasAllDataInBuffer(self):
1275 1270 raise NotImplementedError
1276 1271
1277 1272 def setBlockDimension(self):
1278 1273 raise NotImplementedError
1279 1274
1280 1275 def writeBlock(self):
1281 1276 raise NotImplementedError
1282 1277
1283 1278 def putData(self):
1284 1279 raise NotImplementedError
1285 1280
1286 1281 def getDtypeWidth(self):
1287 1282
1288 1283 dtype_index = get_dtype_index(self.dtype)
1289 1284 dtype_width = get_dtype_width(dtype_index)
1290 1285
1291 1286 return dtype_width
1292 1287
1293 1288 def getProcessFlags(self):
1294 1289
1295 1290 processFlags = 0
1296 1291
1297 1292 dtype_index = get_dtype_index(self.dtype)
1298 1293 procflag_dtype = get_procflag_dtype(dtype_index)
1299 1294
1300 1295 processFlags += procflag_dtype
1301 1296
1302 1297 if self.dataOut.flagDecodeData:
1303 1298 processFlags += PROCFLAG.DECODE_DATA
1304 1299
1305 1300 if self.dataOut.flagDeflipData:
1306 1301 processFlags += PROCFLAG.DEFLIP_DATA
1307 1302
1308 1303 if self.dataOut.code is not None:
1309 1304 processFlags += PROCFLAG.DEFINE_PROCESS_CODE
1310 1305
1311 1306 if self.dataOut.nCohInt > 1:
1312 1307 processFlags += PROCFLAG.COHERENT_INTEGRATION
1313 1308
1314 1309 if self.dataOut.type == "Spectra":
1315 1310 if self.dataOut.nIncohInt > 1:
1316 1311 processFlags += PROCFLAG.INCOHERENT_INTEGRATION
1317 1312
1318 1313 if self.dataOut.data_dc is not None:
1319 1314 processFlags += PROCFLAG.SAVE_CHANNELS_DC
1320 1315
1321 1316 if self.dataOut.flagShiftFFT:
1322 1317 processFlags += PROCFLAG.SHIFT_FFT_DATA
1323 1318
1324 1319 return processFlags
1325 1320
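`getProcessFlags` accumulates a bitmask, adding one `PROCFLAG` constant per processing step applied to the data. A sketch with hypothetical flag values (the real constants live in `schainpy.model.data.jroheaderIO.PROCFLAG`):

```python
# hypothetical single-bit flag values, for illustration only
DECODE_DATA = 0x01
COHERENT_INTEGRATION = 0x02
INCOHERENT_INTEGRATION = 0x04
SHIFT_FFT_DATA = 0x08

def process_flags(decoded, n_coh_int, n_incoh_int, fft_shifted):
    flags = 0
    if decoded:
        flags |= DECODE_DATA
    if n_coh_int > 1:
        flags |= COHERENT_INTEGRATION
    if n_incoh_int > 1:
        flags |= INCOHERENT_INTEGRATION
    if fft_shifted:
        flags |= SHIFT_FFT_DATA
    return flags

flags = process_flags(decoded=True, n_coh_int=4, n_incoh_int=1, fft_shifted=False)
print(flags & COHERENT_INTEGRATION != 0)  # True
```

The method above uses `+=`, which is equivalent here because each flag bit is added at most once; `|=` makes the bitmask intent explicit.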
1326 1321 def setBasicHeader(self):
1327 1322
1328 1323 self.basicHeaderObj.size = self.basicHeaderSize # bytes
1329 1324 self.basicHeaderObj.version = self.versionFile
1330 1325 self.basicHeaderObj.dataBlock = self.nTotalBlocks
1331 1326 utc = numpy.floor(self.dataOut.utctime)
1332 1327 milisecond = (self.dataOut.utctime - utc) * 1000.0
1333 1328 self.basicHeaderObj.utc = utc
1334 1329 self.basicHeaderObj.miliSecond = milisecond
1335 1330 self.basicHeaderObj.timeZone = self.dataOut.timeZone
1336 1331 self.basicHeaderObj.dstFlag = self.dataOut.dstFlag
1337 1332 self.basicHeaderObj.errorCount = self.dataOut.errorCount
1338 1333
1339 1334 def setFirstHeader(self):
1340 1335 """
1341 1336 Get a copy of the First Header
1342 1337
1343 1338 Affected:
1344 1339
1345 1340 self.basicHeaderObj
1346 1341 self.systemHeaderObj
1347 1342 self.radarControllerHeaderObj
1348 1343 self.processingHeaderObj
1349 1344
1350 1345 Return:
1351 1346 None
1352 1347 """
1353 1348
1354 1349 raise NotImplementedError
1355 1350
1356 1351 def __writeFirstHeader(self):
1357 1352 """
1358 1353 Write the file's first header, i.e. the Basic header plus the Long header (SystemHeader, RadarControllerHeader, ProcessingHeader)
1359 1354
1360 1355 Affected:
1361 1356 __dataType
1362 1357
1363 1358 Return:
1364 1359 None
1365 1360 """
1366 1361
1367 1362 # compute parameters
1368 1363
1369 1364 sizeLongHeader = self.systemHeaderObj.size + \
1370 1365 self.radarControllerHeaderObj.size + self.processingHeaderObj.size
1371 1366 self.basicHeaderObj.size = self.basicHeaderSize + sizeLongHeader
1372 1367
1373 1368 self.basicHeaderObj.write(self.fp)
1374 1369 self.systemHeaderObj.write(self.fp)
1375 1370 self.radarControllerHeaderObj.write(self.fp)
1376 1371 self.processingHeaderObj.write(self.fp)
1377 1372
1378 1373 def __setNewBlock(self):
1379 1374 """
1380 1375 If this is a new file, write the First Header; otherwise write only the Basic Header.
1381 1376
1382 1377 Return:
1383 1378 0 : if nothing could be written
1384 1379 1 : if the Basic or the First Header was written
1385 1380 """
1386 1381 if self.fp == None:
1387 1382 self.setNextFile()
1388 1383
1389 1384 if self.flagIsNewFile:
1390 1385 return 1
1391 1386
1392 1387 if self.blockIndex < self.processingHeaderObj.dataBlocksPerFile:
1393 1388 self.basicHeaderObj.write(self.fp)
1394 1389 return 1
1395 1390
1396 1391 if not(self.setNextFile()):
1397 1392 return 0
1398 1393
1399 1394 return 1
1400 1395
1401 1396 def writeNextBlock(self):
1402 1397 """
1403 1398 Select the next data block and write it to a file
1404 1399
1405 1400 Return:
1406 1401 0 : if the data block could not be written
1407 1402 1 : if the data block was written
1408 1403 """
1409 1404 if not(self.__setNewBlock()):
1410 1405 return 0
1411 1406
1412 1407 self.writeBlock()
1413 1408
1414 1409 print("[Writing] Block No. %d/%d" % (self.blockIndex,
1415 1410 self.processingHeaderObj.dataBlocksPerFile))
1416 1411
1417 1412 return 1
1418 1413
1419 1414 def setNextFile(self):
1420 1415 """Determine the next file to be written
1421 1416
1422 1417 Affected:
1423 1418 self.filename
1424 1419 self.subfolder
1425 1420 self.fp
1426 1421 self.setFile
1427 1422 self.flagIsNewFile
1428 1423
1429 1424 Return:
1430 1425 0 : if the file cannot be written
1431 1426 1 : if the file is ready to be written
1432 1427 """
1433 1428 ext = self.ext
1434 1429 path = self.path
1435 1430
1436 1431 if self.fp != None:
1437 1432 self.fp.close()
1438 1433
1439 1434 if not os.path.exists(path):
1440 1435 os.mkdir(path)
1441 1436
1442 1437 timeTuple = time.localtime(self.dataOut.utctime)
1443 1438 subfolder = 'd%4.4d%3.3d' % (timeTuple.tm_year, timeTuple.tm_yday)
1444 1439
1445 1440 fullpath = os.path.join(path, subfolder)
1446 1441 setFile = self.setFile
1447 1442
1448 1443 if not(os.path.exists(fullpath)):
1449 1444 os.mkdir(fullpath)
1450 1445 setFile = -1 # initialize the set counter
1451 1446 else:
1452 1447 filesList = os.listdir(fullpath)
1453 1448 if len(filesList) > 0:
1454 1449 filesList = sorted(filesList, key=str.lower)
1455 1450 filen = filesList[-1]
1456 1451 # the filename must have the following format
1457 1452 # 0 1234 567 89A BCDE (hex)
1458 1453 # x YYYY DDD SSS .ext
1459 1454 if isNumber(filen[8:11]):
1460 1455 # initialize the set counter with the last file's set number
1461 1456 setFile = int(filen[8:11])
1462 1457 else:
1463 1458 setFile = -1
1464 1459 else:
1465 1460 setFile = -1 # initialize the set counter
1466 1461
1467 1462 setFile += 1
1468 1463
1469 1464 # If this is a new day it resets some values
1470 1465 if self.dataOut.datatime.date() > self.fileDate:
1471 1466 setFile = 0
1472 1467 self.nTotalBlocks = 0
1473 1468
1474 1469 filen = '{}{:04d}{:03d}{:03d}{}'.format(
1475 1470 self.optchar, timeTuple.tm_year, timeTuple.tm_yday, setFile, ext)
1476 1471
1477 1472 filename = os.path.join(path, subfolder, filen)
1478 1473
1479 1474 fp = open(filename, 'wb')
1480 1475
1481 1476 self.blockIndex = 0
1482 1477 self.filename = filename
1483 1478 self.subfolder = subfolder
1484 1479 self.fp = fp
1485 1480 self.setFile = setFile
1486 1481 self.flagIsNewFile = 1
1487 1482 self.fileDate = self.dataOut.datatime.date()
1488 1483 self.setFirstHeader()
1489 1484
1490 1485 print('[Writing] Opening file: %s' % self.filename)
1491 1486
1492 1487 self.__writeFirstHeader()
1493 1488
1494 1489 return 1
1495 1490
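`setNextFile` builds the output path from the `dYYYYDDD` subfolder and the `xYYYYDDDSSS.ext` file pattern, incrementing the three-digit set counter and resetting it on a new day. The formatting itself, as a self-contained sketch (the writer uses `time.localtime`; `time.gmtime` is used here only to keep the example reproducible):

```python
import os
import time

def output_filename(path, utctime, setFile, optchar='d', ext='.r'):
    t = time.gmtime(utctime)
    subfolder = 'd%4.4d%3.3d' % (t.tm_year, t.tm_yday)  # dYYYYDDD
    # xYYYYDDDSSS.ext: prefix char, year, day-of-year, zero-padded set number
    filen = '{}{:04d}{:03d}{:03d}{}'.format(
        optchar, t.tm_year, t.tm_yday, setFile, ext)
    return os.path.join(path, subfolder, filen)

# 2009-11-03 00:00:00 UTC is day 307 of 2009
print(output_filename('/out', 1257206400, 7))  # /out/d2009307/d2009307007.r
```

The zero-padded set field is what `setNextFile` reads back with `filen[8:11]` when resuming into an existing folder.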
1496 1491 def setup(self, dataOut, path, blocksPerFile, profilesPerBlock=64, set=None, ext=None, datatype=4):
1497 1492 """
1498 1493 Set the format in which the data will be saved and write the First Header
1499 1494
1500 1495 Inputs:
1501 1496 path : directory where data will be saved
1502 1497 profilesPerBlock : number of profiles per block
1503 1498 set : initial file set
1504 1499 datatype : An integer number that defines data type:
1505 1500 0 : int8 (1 byte)
1506 1501 1 : int16 (2 bytes)
1507 1502 2 : int32 (4 bytes)
1508 1503 3 : int64 (8 bytes)
1509 1504 4 : float32 (4 bytes)
1510 1505 5 : double64 (8 bytes)
1511 1506
1512 1507 Return:
1513 1508 0 : if setup failed
1514 1509 1 : if setup succeeded
1515 1510 """
1516 1511
1517 1512 if ext == None:
1518 1513 ext = self.ext
1519 1514
1520 1515 self.ext = ext.lower()
1521 1516
1522 1517 self.path = path
1523 1518
1524 1519 if set is None:
1525 1520 self.setFile = -1
1526 1521 else:
1527 1522 self.setFile = set - 1
1528 1523
1529 1524 self.blocksPerFile = blocksPerFile
1530 1525 self.profilesPerBlock = profilesPerBlock
1531 1526 self.dataOut = dataOut
1532 1527 self.fileDate = self.dataOut.datatime.date()
1533 1528 self.dtype = self.dataOut.dtype
1534 1529
1535 1530 if datatype is not None:
1536 1531 self.dtype = get_numpy_dtype(datatype)
1537 1532
1538 1533 if not(self.setNextFile()):
1539 1534 print("[Writing] There isn't a next file")
1540 1535 return 0
1541 1536
1542 1537 self.setBlockDimension()
1543 1538
1544 1539 return 1
1545 1540
1546 1541 def run(self, dataOut, path, blocksPerFile=100, profilesPerBlock=64, set=None, ext=None, datatype=4, **kwargs):
1547 1542
1548 1543 if not(self.isConfig):
1549 1544
1550 1545 self.setup(dataOut, path, blocksPerFile, profilesPerBlock=profilesPerBlock,
1551 1546 set=set, ext=ext, datatype=datatype, **kwargs)
1552 1547 self.isConfig = True
1553 1548
1554 1549 self.dataOut = dataOut
1555 1550 self.putData()
1556 1551 return self.dataOut
1557 1552
1558 1553 @MPDecorator
1559 1554 class printInfo(Operation):
1560 1555
1561 1556 def __init__(self):
1562 1557
1563 1558 Operation.__init__(self)
1564 1559 self.__printInfo = True
1565 1560
1566 1561 def run(self, dataOut, headers = ['systemHeaderObj', 'radarControllerHeaderObj', 'processingHeaderObj']):
1567 1562 if not self.__printInfo:
1568 1563 return
1569 1564
1570 1565 for header in headers:
1571 1566 if hasattr(dataOut, header):
1572 1567 obj = getattr(dataOut, header)
1573 1568 if hasattr(obj, 'printInfo'):
1574 1569 obj.printInfo()
1575 1570 else:
1576 1571 print(obj)
1577 1572 else:
1578 1573 log.warning('Header {} Not found in object'.format(header))
1579 1574
1580 1575 self.__printInfo = False
@@ -1,527 +1,527
1 1 '''
2 2 Created on Jul 2, 2014
3 3
4 4 @author: roj-idl71
5 5 '''
6 6 import numpy
7 7
8 8 from schainpy.model.io.jroIO_base import LOCALTIME, JRODataReader, JRODataWriter
9 9 from schainpy.model.proc.jroproc_base import ProcessingUnit, Operation, MPDecorator
10 10 from schainpy.model.data.jroheaderIO import PROCFLAG, BasicHeader, SystemHeader, RadarControllerHeader, ProcessingHeader
11 11 from schainpy.model.data.jrodata import Spectra
12 12 from schainpy.utils import log
13 13
14 14
15 15 class SpectraReader(JRODataReader, ProcessingUnit):
16 16 """
17 17 This class reads spectra data from processed files (.pdata). Data is always
18 18 read in blocks. The data read (a 3-dimensional array) is stored in three
19 19 buffers: one each for the Self Spectra, the Cross Spectra and the DC Channels.
20 20
21 21 equalChannelPairs * heights * profiles (Self Spectra)
22 22 differentChannelPairs * heights * profiles (Cross Spectra)
23 23 channels * heights (DC Channels)
24 24
25 25 This class holds instances (objects) of the BasicHeader, SystemHeader,
26 26 RadarControllerHeader and Spectra classes. The first three store the data
27 27 header information (metadata), and the fourth (Spectra) fetches and stores a
28 28 block of data from the "buffer" every time the "getData" method is executed.
29 29
30 30 Example:
31 31 dpath = "/home/myuser/data"
32 32
33 33 startTime = datetime.datetime(2010,1,20,0,0,0,0,0,0)
34 34
35 35 endTime = datetime.datetime(2010,1,21,23,59,59,0,0,0)
36 36
37 37 readerObj = SpectraReader()
38 38
39 39 readerObj.setup(dpath, startTime, endTime)
40 40
41 41 while(True):
42 42
43 43 readerObj.getData()
44 44
45 45 print(readerObj.data_spc)
46 46 
47 47 print(readerObj.data_cspc)
48 48 
49 49 print(readerObj.data_dc)
50 50
51 51 if readerObj.flagNoMoreFiles:
52 52 break
53 53
54 54 """
55 55
56 56 def __init__(self):#, **kwargs):
57 57 """
58 58 Initializer of the SpectraReader class for reading spectra data.
59 59
60 60 Inputs:
61 61 dataOut : Spectra class object. This object is used to store a data
62 62 profile every time a request (getData) is made. The profile is
63 63 obtained from the data buffer; if the buffer is empty, a new
64 64 data-block read is performed.
65 65 
66 66 If this parameter is not passed, one is created internally.
67 67
68 68 Affected:
69 69 self.dataOut
70 70
71 71 Return : None
72 72 """
73 73
74 74 ProcessingUnit.__init__(self)
75 75
76 76 self.pts2read_SelfSpectra = 0
77 77 self.pts2read_CrossSpectra = 0
78 78 self.pts2read_DCchannels = 0
79 79 self.ext = ".pdata"
80 80 self.optchar = "P"
81 81 self.basicHeaderObj = BasicHeader(LOCALTIME)
82 82 self.systemHeaderObj = SystemHeader()
83 83 self.radarControllerHeaderObj = RadarControllerHeader()
84 84 self.processingHeaderObj = ProcessingHeader()
85 85 self.lastUTTime = 0
86 86 self.maxTimeStep = 30
87 87 self.dataOut = Spectra()
88 88 self.profileIndex = 1
89 89 self.nRdChannels = None
90 90 self.nRdPairs = None
91 91 self.rdPairList = []
92 92
93 93 def createObjByDefault(self):
94 94
95 95 dataObj = Spectra()
96 96
97 97 return dataObj
98 98
99 99 def __hasNotDataInBuffer(self):
100 100 return 1
101 101
102 102
103 103 def getBlockDimension(self):
104 104 """
105 105 Gets the number of points to read per data block
106 106
107 107 Affected:
108 108 self.nRdChannels
109 109 self.nRdPairs
110 110 self.pts2read_SelfSpectra
111 111 self.pts2read_CrossSpectra
112 112 self.pts2read_DCchannels
113 113 self.blocksize
114 114 self.dataOut.nChannels
115 115 self.dataOut.nPairs
116 116
117 117 Return:
118 118 None
119 119 """
120 120 self.nRdChannels = 0
121 121 self.nRdPairs = 0
122 122 self.rdPairList = []
123 123
124 124 for i in range(0, self.processingHeaderObj.totalSpectra*2, 2):
125 125 if self.processingHeaderObj.spectraComb[i] == self.processingHeaderObj.spectraComb[i+1]:
126 126 self.nRdChannels = self.nRdChannels + 1 #pair of equal channels (self spectra)
127 127 else:
128 128 self.nRdPairs = self.nRdPairs + 1 #pair of different channels (cross spectra)
129 129 self.rdPairList.append((self.processingHeaderObj.spectraComb[i], self.processingHeaderObj.spectraComb[i+1]))
130 130
131 131 pts2read = self.processingHeaderObj.nHeights * self.processingHeaderObj.profilesPerBlock
132 132
133 133 self.pts2read_SelfSpectra = int(self.nRdChannels * pts2read)
134 134 self.blocksize = self.pts2read_SelfSpectra
135 135
136 136 if self.processingHeaderObj.flag_cspc:
137 137 self.pts2read_CrossSpectra = int(self.nRdPairs * pts2read)
138 138 self.blocksize += self.pts2read_CrossSpectra
139 139
140 140 if self.processingHeaderObj.flag_dc:
141 141 self.pts2read_DCchannels = int(self.systemHeaderObj.nChannels * self.processingHeaderObj.nHeights)
142 142 self.blocksize += self.pts2read_DCchannels
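The block-size bookkeeping in `getBlockDimension()` can be sketched with concrete numbers. All dimension values below are illustrative assumptions, not values from a real processing header:

```python
# Worked example of getBlockDimension(): points to read per block.
nHeights, profilesPerBlock = 100, 64   # assumed header values
nRdChannels, nRdPairs = 2, 1           # self-spectra channels, cross pairs
flag_cspc, flag_dc = True, True

pts2read = nHeights * profilesPerBlock          # points per spectrum
pts2read_SelfSpectra = nRdChannels * pts2read
blocksize = pts2read_SelfSpectra

if flag_cspc:
    pts2read_CrossSpectra = nRdPairs * pts2read
    blocksize += pts2read_CrossSpectra

if flag_dc:
    pts2read_DCchannels = nRdChannels * nHeights
    blocksize += pts2read_DCchannels

print(blocksize)  # 2*6400 + 1*6400 + 2*100 = 19400
```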
143 143
144 144 def readBlock(self):
145 145 """
146 146 Reads the data block from the current file pointer position (self.fp)
147 147 and updates all parameters related to the data block (metadata + data).
148 148 The data read is stored in the buffer and the buffer counter is
149 149 reset to 0
150 150
151 151 Return: None
152 152
153 153 Affected:
154 154
155 155 self.flagIsNewFile
156 156 self.flagIsNewBlock
157 157 self.nTotalBlocks
158 158 self.data_spc
159 159 self.data_cspc
160 160 self.data_dc
161 161
162 162 Exceptions:
163 163 If a read block is not a valid block
164 164 """
165 165
166 166 fpointer = self.fp.tell()
167 167
168 168 spc = numpy.fromfile( self.fp, self.dtype[0], self.pts2read_SelfSpectra )
169 169 spc = spc.reshape( (self.nRdChannels, self.processingHeaderObj.nHeights, self.processingHeaderObj.profilesPerBlock) ) #reshape to a 3D array
170 170
171 171 if self.processingHeaderObj.flag_cspc:
172 172 cspc = numpy.fromfile( self.fp, self.dtype, self.pts2read_CrossSpectra )
173 173 cspc = cspc.reshape( (self.nRdPairs, self.processingHeaderObj.nHeights, self.processingHeaderObj.profilesPerBlock) ) #reshape to a 3D array
174 174
175 175 if self.processingHeaderObj.flag_dc:
176 176 dc = numpy.fromfile( self.fp, self.dtype, self.pts2read_DCchannels ) #int(self.processingHeaderObj.nHeights*self.systemHeaderObj.nChannels) )
177 177 dc = dc.reshape( (self.systemHeaderObj.nChannels, self.processingHeaderObj.nHeights) ) #reshape to a 2D array
178 178
179 179 if not self.processingHeaderObj.shif_fft:
180 180 #shift right along axis 2 by half the profiles (manual fftshift)
181 181 shift = int(self.processingHeaderObj.profilesPerBlock/2)
182 182 spc = numpy.roll( spc, shift , axis=2 )
183 183
184 184 if self.processingHeaderObj.flag_cspc:
185 185 #shift right along axis 2 by half the profiles (manual fftshift)
186 186 cspc = numpy.roll( cspc, shift, axis=2 )
187 187
188 188 #Dimensions : nChannels, nProfiles, nSamples
189 189 spc = numpy.transpose( spc, (0,2,1) )
190 190 self.data_spc = spc
191 191
192 192 if self.processingHeaderObj.flag_cspc:
193 193 cspc = numpy.transpose( cspc, (0,2,1) )
194 194 self.data_cspc = cspc['real'] + cspc['imag']*1j
195 195 else:
196 196 self.data_cspc = None
197 197
198 198 if self.processingHeaderObj.flag_dc:
199 199 self.data_dc = dc['real'] + dc['imag']*1j
200 200 else:
201 201 self.data_dc = None
202 202
203 203 self.flagIsNewFile = 0
204 204 self.flagIsNewBlock = 1
205 205
206 206 self.nTotalBlocks += 1
207 207 self.nReadBlocks += 1
208 208
209 209 return 1
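When `shif_fft` is not set, `readBlock()` centers the spectrum by rolling the profile axis by half the profiles per block, i.e. a manual `fftshift`. A one-dimensional sketch of that roll (the real code applies it on `axis=2` of the 3D block; the array here is a stand-in):

```python
import numpy

# Manual fftshift as done in readBlock(): roll by profilesPerBlock/2.
profilesPerBlock = 8
spc = numpy.arange(profilesPerBlock)      # stand-in for one profile axis
shift = int(profilesPerBlock / 2)
shifted = numpy.roll(spc, shift)

print(shifted)  # [4 5 6 7 0 1 2 3]
```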
210 210
211 211 def getFirstHeader(self):
212 212
213 213 self.getBasicHeader()
214 214 self.dataOut.systemHeaderObj = self.systemHeaderObj.copy()
215 215 self.dataOut.radarControllerHeaderObj = self.radarControllerHeaderObj.copy()
216 216 self.dataOut.dtype = self.dtype
217 217 self.dataOut.pairsList = self.rdPairList
218 218 self.dataOut.nProfiles = self.processingHeaderObj.profilesPerBlock
219 219 self.dataOut.nFFTPoints = self.processingHeaderObj.profilesPerBlock
220 220 self.dataOut.nCohInt = self.processingHeaderObj.nCohInt
221 221 self.dataOut.nIncohInt = self.processingHeaderObj.nIncohInt
222 222 xf = self.processingHeaderObj.firstHeight + self.processingHeaderObj.nHeights*self.processingHeaderObj.deltaHeight
223 223 self.dataOut.heightList = numpy.arange(self.processingHeaderObj.firstHeight, xf, self.processingHeaderObj.deltaHeight)
224 224 self.dataOut.channelList = list(range(self.systemHeaderObj.nChannels))
225 225 self.dataOut.flagShiftFFT = True #Data is always shifted
226 226 self.dataOut.flagDecodeData = self.processingHeaderObj.flag_decode #assume the data is not decoded
227 227 self.dataOut.flagDeflipData = self.processingHeaderObj.flag_deflip #assume the data is not deflipped
228 228
229 229 def getData(self):
230 230 """
231 231 First method to execute before "RUN" is called.
232 232
233 233 Copies the read buffer into the "Spectra" class, together with all its
234 234 associated parameters (metadata). When there is no data in the read
235 235 buffer, a new data-block read is performed using "readNextBlock"
236 236
237 237 Return:
238 238 0 : if no more files are available
239 239 1 : if the buffer was copied successfully
240 240
241 241 Affected:
242 242 self.dataOut
243 243 self.flagDiscontinuousBlock
244 244 self.flagIsNewBlock
245 245 """
246 246
247 247 if self.flagNoMoreFiles:
248 248 self.dataOut.flagNoData = True
249 249 return 0
250 250
251 251 self.flagDiscontinuousBlock = 0
252 252 self.flagIsNewBlock = 0
253 253
254 254 if self.__hasNotDataInBuffer():
255 255
256 256 if not( self.readNextBlock() ):
257 257 self.dataOut.flagNoData = True
258 258 return 0
259 259
260 260 #data is a 3-dimensional numpy array (profiles, heights and channels)
261 261
262 262 if self.data_spc is None:
263 263 self.dataOut.flagNoData = True
264 264 return 0
265 265
266 266 self.getBasicHeader()
267 267 self.getFirstHeader()
268 268 self.dataOut.data_spc = self.data_spc
269 269 self.dataOut.data_cspc = self.data_cspc
270 270 self.dataOut.data_dc = self.data_dc
271 271 self.dataOut.flagNoData = False
272 272 self.dataOut.realtime = self.online
273 273
274 274 return self.dataOut.data_spc
275 275
276 276
277 277 @MPDecorator
278 278 class SpectraWriter(JRODataWriter, Operation):
279 279
280 280 """
281 281 This class writes spectra data to processed files (.pdata). Data is
282 282 always written in blocks.
283 283 """
284 284
285 285 def __init__(self):
286 286 """
287 287 Initializer of the SpectraWriter class for writing spectra data.
288 288
289 289 Affected:
290 290 self.dataOut
291 291 self.basicHeaderObj
292 292 self.systemHeaderObj
293 293 self.radarControllerHeaderObj
294 294 self.processingHeaderObj
295 295
296 296 Return: None
297 297 """
298 298
299 299 Operation.__init__(self)
300 300
301 301 self.ext = ".pdata"
302 302 self.optchar = "P"
303 303 self.shape_spc_Buffer = None
304 304 self.shape_cspc_Buffer = None
305 305 self.shape_dc_Buffer = None
306 306 self.data_spc = None
307 307 self.data_cspc = None
308 308 self.data_dc = None
309 309 self.setFile = None
310 310 self.noMoreFiles = 0
311 311 self.basicHeaderObj = BasicHeader(LOCALTIME)
312 312 self.systemHeaderObj = SystemHeader()
313 313 self.radarControllerHeaderObj = RadarControllerHeader()
314 314 self.processingHeaderObj = ProcessingHeader()
315 315
316 316 def hasAllDataInBuffer(self):
317 317 return 1
318 318
319 319
320 320 def setBlockDimension(self):
321 321 """
322 322 Gets the dimensional shapes of the data sub-blocks that make up a block
323 323
324 324 Affected:
325 325 self.shape_spc_Buffer
326 326 self.shape_cspc_Buffer
327 327 self.shape_dc_Buffer
328 328
329 329 Return: None
330 330 """
331 331 self.shape_spc_Buffer = (self.dataOut.nChannels,
332 332 self.processingHeaderObj.nHeights,
333 333 self.processingHeaderObj.profilesPerBlock)
334 334
335 335 self.shape_cspc_Buffer = (self.dataOut.nPairs,
336 336 self.processingHeaderObj.nHeights,
337 337 self.processingHeaderObj.profilesPerBlock)
338 338
339 339 self.shape_dc_Buffer = (self.dataOut.nChannels,
340 340 self.processingHeaderObj.nHeights)
341 341
342 342
343 343 def writeBlock(self):
344 344 """
345 345 Writes the buffer to the designated file
346 346
347 347 Affected:
348 348 self.data_spc
349 349 self.data_cspc
350 350 self.data_dc
351 351 self.flagIsNewFile
352 352 self.flagIsNewBlock
353 353 self.nTotalBlocks
354 354 self.nWriteBlocks
355 355
356 356 Return: None
357 357 """
358 358
359 359 spc = numpy.transpose( self.data_spc, (0,2,1) )
360 360 if not self.processingHeaderObj.shif_fft:
361 spc = numpy.roll( spc, self.processingHeaderObj.profilesPerBlock/2, axis=2 ) #desplaza a la derecha en el eje 2 determinadas posiciones
361 spc = numpy.roll( spc, int(self.processingHeaderObj.profilesPerBlock/2), axis=2 ) #desplaza a la derecha en el eje 2 determinadas posiciones
362 362 data = spc.reshape((-1))
363 363 data = data.astype(self.dtype[0])
364 364 data.tofile(self.fp)
365 365
366 366 if self.data_cspc is not None:
367 367
368 368 cspc = numpy.transpose( self.data_cspc, (0,2,1) )
369 369 data = numpy.zeros( numpy.shape(cspc), self.dtype )
370 370 #print 'data.shape', self.shape_cspc_Buffer
371 371 if not self.processingHeaderObj.shif_fft:
372 cspc = numpy.roll( cspc, self.processingHeaderObj.profilesPerBlock/2, axis=2 ) #desplaza a la derecha en el eje 2 determinadas posiciones
372 cspc = numpy.roll( cspc, int(self.processingHeaderObj.profilesPerBlock/2), axis=2 ) #desplaza a la derecha en el eje 2 determinadas posiciones
373 373 data['real'] = cspc.real
374 374 data['imag'] = cspc.imag
375 375 data = data.reshape((-1))
376 376 data.tofile(self.fp)
377 377
378 378 if self.data_dc is not None:
379 379
380 380 dc = self.data_dc
381 381 data = numpy.zeros( numpy.shape(dc), self.dtype )
382 382 data['real'] = dc.real
383 383 data['imag'] = dc.imag
384 384 data = data.reshape((-1))
385 385 data.tofile(self.fp)
386 386
387 387 # self.data_spc.fill(0)
388 388 #
389 389 # if self.data_dc is not None:
390 390 # self.data_dc.fill(0)
391 391 #
392 392 # if self.data_cspc is not None:
393 393 # self.data_cspc.fill(0)
394 394
395 395 self.flagIsNewFile = 0
396 396 self.flagIsNewBlock = 1
397 397 self.nTotalBlocks += 1
398 398 self.nWriteBlocks += 1
399 399 self.blockIndex += 1
400 400
401 401 # print "[Writing] Block = %d04" %self.blockIndex
402 402
403 403 def putData(self):
404 404 """
405 405 Sets a data block and then writes it to a file
406 406
407 407 Affected:
408 408 self.data_spc
409 409 self.data_cspc
410 410 self.data_dc
411 411
412 412 Return:
413 413 0 : if there is no data or no more files can be written
414 414 1 : if a block of data was written to a file
415 415 """
416 416
417 417 if self.dataOut.flagNoData:
418 418 return 0
419 419
420 420 self.flagIsNewBlock = 0
421 421
422 422 if self.dataOut.flagDiscontinuousBlock:
423 423 self.data_spc.fill(0)
424 424 if self.dataOut.data_cspc is not None:
425 425 self.data_cspc.fill(0)
426 426 if self.dataOut.data_dc is not None:
427 427 self.data_dc.fill(0)
428 428 self.setNextFile()
429 429
430 430 if self.flagIsNewFile == 0:
431 431 self.setBasicHeader()
432 432
433 433 self.data_spc = self.dataOut.data_spc.copy()
434 434
435 435 if self.dataOut.data_cspc is not None:
436 436 self.data_cspc = self.dataOut.data_cspc.copy()
437 437
438 438 if self.dataOut.data_dc is not None:
439 439 self.data_dc = self.dataOut.data_dc.copy()
440 440
441 441 # #self.processingHeaderObj.dataBlocksPerFile)
442 442 if self.hasAllDataInBuffer():
443 443 # self.setFirstHeader()
444 444 self.writeNextBlock()
445 445
446 446 def __getBlockSize(self):
447 447 '''
448 448 Determines the number of bytes of a Spectra-type data block
449 449 '''
450 450
451 451 dtype_width = self.getDtypeWidth()
452 452
453 453 pts2write = self.dataOut.nHeights * self.dataOut.nFFTPoints
454 454
455 455 pts2write_SelfSpectra = int(self.dataOut.nChannels * pts2write)
456 456 blocksize = (pts2write_SelfSpectra*dtype_width)
457 457
458 458 if self.dataOut.data_cspc is not None:
459 459 pts2write_CrossSpectra = int(self.dataOut.nPairs * pts2write)
460 460 blocksize += (pts2write_CrossSpectra*dtype_width*2)
461 461
462 462 if self.dataOut.data_dc is not None:
463 463 pts2write_DCchannels = int(self.dataOut.nChannels * self.dataOut.nHeights)
464 464 blocksize += (pts2write_DCchannels*dtype_width*2)
465 465
466 466 # blocksize = blocksize #* datatypeValue * 2 #CORREGIR ESTO
467 467
468 468 return blocksize
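`__getBlockSize()` converts the point counts into bytes, doubling the width for the complex (real/imag) cross-spectra and DC samples. A worked example with illustrative assumed dimensions (float32 samples, 2 channels, 1 cross pair):

```python
# Worked example of __getBlockSize(): bytes per Spectra block.
dtype_width = 4                     # assumed float32 samples
nHeights, nFFTPoints = 100, 64      # assumed dimensions
nChannels, nPairs = 2, 1

pts2write = nHeights * nFFTPoints
blocksize = nChannels * pts2write * dtype_width        # self spectra (real)
blocksize += nPairs * pts2write * dtype_width * 2      # cross spectra (complex)
blocksize += nChannels * nHeights * dtype_width * 2    # DC channels (complex)

print(blocksize)  # 51200 + 51200 + 1600 = 104000
```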
469 469
470 470 def setFirstHeader(self):
471 471
472 472 """
473 473 Gets a copy of the First Header
474 474
475 475 Affected:
476 476 self.systemHeaderObj
477 477 self.radarControllerHeaderObj
478 478 self.dtype
479 479
480 480 Return:
481 481 None
482 482 """
483 483
484 484 self.systemHeaderObj = self.dataOut.systemHeaderObj.copy()
485 485 self.systemHeaderObj.nChannels = self.dataOut.nChannels
486 486 self.radarControllerHeaderObj = self.dataOut.radarControllerHeaderObj.copy()
487 487
488 488 self.processingHeaderObj.dtype = 1 # Spectra
489 489 self.processingHeaderObj.blockSize = self.__getBlockSize()
490 490 self.processingHeaderObj.profilesPerBlock = self.dataOut.nFFTPoints
491 491 self.processingHeaderObj.dataBlocksPerFile = self.blocksPerFile
492 492 self.processingHeaderObj.nWindows = 1 #could be 1 or self.dataOut.processingHeaderObj.nWindows
493 493 self.processingHeaderObj.nCohInt = self.dataOut.nCohInt # required to determine the value of timeInterval
494 494 self.processingHeaderObj.nIncohInt = self.dataOut.nIncohInt
495 495 self.processingHeaderObj.totalSpectra = self.dataOut.nPairs + self.dataOut.nChannels
496 496 self.processingHeaderObj.shif_fft = self.dataOut.flagShiftFFT
497 497
498 498 if self.processingHeaderObj.totalSpectra > 0:
499 499 channelList = []
500 500 for channel in range(self.dataOut.nChannels):
501 501 channelList.append(channel)
502 502 channelList.append(channel)
503 503
504 504 pairsList = []
505 505 if self.dataOut.nPairs > 0:
506 506 for pair in self.dataOut.pairsList:
507 507 pairsList.append(pair[0])
508 508 pairsList.append(pair[1])
509 509
510 510 spectraComb = channelList + pairsList
511 511 spectraComb = numpy.array(spectraComb, dtype="u1")
512 512 self.processingHeaderObj.spectraComb = spectraComb
513 513
514 514 if self.dataOut.code is not None:
515 515 self.processingHeaderObj.code = self.dataOut.code
516 516 self.processingHeaderObj.nCode = self.dataOut.nCode
517 517 self.processingHeaderObj.nBaud = self.dataOut.nBaud
518 518
519 519 if self.processingHeaderObj.nWindows != 0:
520 520 self.processingHeaderObj.firstHeight = self.dataOut.heightList[0]
521 521 self.processingHeaderObj.deltaHeight = self.dataOut.heightList[1] - self.dataOut.heightList[0]
522 522 self.processingHeaderObj.nHeights = self.dataOut.nHeights
523 523 self.processingHeaderObj.samplesWin = self.dataOut.nHeights
524 524
525 525 self.processingHeaderObj.processFlags = self.getProcessFlags()
526 526
527 527 self.setBasicHeader() No newline at end of file
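`setFirstHeader()` encodes the channel/pair layout into `spectraComb`: each channel index is repeated (a self-spectra pair of equal channels), then the cross pairs are flattened. This is the same convention `SpectraReader.getBlockDimension()` decodes (equal pair = channel, unequal pair = cross pair). A small sketch with assumed example values (3 channels, pairs (0,1) and (0,2)):

```python
import numpy

# How spectraComb is built: doubled channels, then flattened pairs.
nChannels = 3
pairsList = [(0, 1), (0, 2)]

channelList = []
for channel in range(nChannels):
    channelList.append(channel)      # equal pair -> self spectrum
    channelList.append(channel)

pairsFlat = []
for pair in pairsList:
    pairsFlat.append(pair[0])        # unequal pair -> cross spectrum
    pairsFlat.append(pair[1])

spectraComb = numpy.array(channelList + pairsFlat, dtype="u1")
print(spectraComb)  # [0 0 1 1 2 2 0 1 0 2]
```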
@@ -1,207 +1,207
1 1 '''
2 2 Base classes to create Processing Units and Operations; the MPDecorator
3 3 must be used in plotting and writing operations to allow them to run as an
4 4 external process.
5 5 '''
6 6
7 7 import inspect
8 8 import zmq
9 9 import time
10 10 import pickle
11 11 import traceback
12 12 try:
13 13 from queue import Queue
14 14 except:
15 15 from Queue import Queue
16 16 from threading import Thread
17 17 from multiprocessing import Process, Queue
18 18 from schainpy.utils import log
19 19
20 20
21 21 class ProcessingUnit(object):
22 22 '''
23 23 Base class to create Signal Chain Units
24 24 '''
25 25
26 26 proc_type = 'processing'
27 27
28 28 def __init__(self):
29 29
30 30 self.dataIn = None
31 31 self.dataOut = None
32 32 self.isConfig = False
33 33 self.operations = []
34 34
35 35 def setInput(self, unit):
36 36
37 37 self.dataIn = unit.dataOut
38 38
39 39 def getAllowedArgs(self):
40 40 if hasattr(self, '__attrs__'):
41 41 return self.__attrs__
42 42 else:
43 43 return inspect.getargspec(self.run).args
44 44
45 45 def addOperation(self, conf, operation):
46 46 '''
47 47 '''
48 48
49 49 self.operations.append((operation, conf.type, conf.getKwargs()))
50 50
51 51 def getOperationObj(self, objId):
52 52
53 53 if objId not in list(self.operations.keys()):
54 54 return None
55 55
56 56 return self.operations[objId]
57 57
58 58 def call(self, **kwargs):
59 59 '''
60 60 '''
61 61
62 62 try:
63 63 if self.dataIn is not None and self.dataIn.flagNoData and not self.dataIn.error:
64 64 return self.dataIn.isReady()
65 65 elif self.dataIn is None or not self.dataIn.error:
66 66 self.run(**kwargs)
67 67 elif self.dataIn.error:
68 68 self.dataOut.error = self.dataIn.error
69 69 self.dataOut.flagNoData = True
70 70 except:
71 71 err = traceback.format_exc()
72 72 if 'SchainWarning' in err:
73 73 log.warning(err.split('SchainWarning:')[-1].split('\n')[0].strip(), self.name)
74 74 elif 'SchainError' in err:
75 75 log.error(err.split('SchainError:')[-1].split('\n')[0].strip(), self.name)
76 76 else:
77 log.error(err.split('\n')[-2], self.name)
77 log.error(err, self.name)
78 78 self.dataOut.error = True
79 79
80 80 for op, optype, opkwargs in self.operations:
81 81 if optype == 'other' and not self.dataOut.flagNoData:
82 82 self.dataOut = op.run(self.dataOut, **opkwargs)
83 83 elif optype == 'external' and not self.dataOut.flagNoData:
84 84 op.queue.put(self.dataOut)
85 85 elif optype == 'external' and self.dataOut.error:
86 86 op.queue.put(self.dataOut)
87 87
88 88 return 'Error' if self.dataOut.error else self.dataOut.isReady()
89 89
90 90 def setup(self):
91 91
92 92 raise NotImplementedError
93 93
94 94 def run(self):
95 95
96 96 raise NotImplementedError
97 97
98 98 def close(self):
99 99
100 100 return
101 101
102 102
103 103 class Operation(object):
104 104
105 105 '''
106 106 '''
107 107
108 108 proc_type = 'operation'
109 109
110 110 def __init__(self):
111 111
112 112 self.id = None
113 113 self.isConfig = False
114 114
115 115 if not hasattr(self, 'name'):
116 116 self.name = self.__class__.__name__
117 117
118 118 def getAllowedArgs(self):
119 119 if hasattr(self, '__attrs__'):
120 120 return self.__attrs__
121 121 else:
122 122 return inspect.getargspec(self.run).args
123 123
124 124 def setup(self):
125 125
126 126 self.isConfig = True
127 127
128 128 raise NotImplementedError
129 129
130 130 def run(self, dataIn, **kwargs):
131 131 """
132 132 Performs the necessary operations on dataIn.data and updates the
133 133 attributes of the dataIn object.
134 134
135 135 Input:
136 136
137 137 dataIn : JROData-type object
138 138
139 139 Return:
140 140
141 141 None
142 142
143 143 Affected:
144 144 __buffer : data reception buffer.
145 145
146 146 """
147 147 if not self.isConfig:
148 148 self.setup(**kwargs)
149 149
150 150 raise NotImplementedError
151 151
152 152 def close(self):
153 153
154 154 return
155 155
156 156
157 157 def MPDecorator(BaseClass):
158 158 """
159 159 Multiprocessing class decorator
160 160
161 161 This function adds multiprocessing features to a BaseClass.
162 162 """
163 163
164 164 class MPClass(BaseClass, Process):
165 165
166 166 def __init__(self, *args, **kwargs):
167 167 super(MPClass, self).__init__()
168 168 Process.__init__(self)
169 169
170 170 self.args = args
171 171 self.kwargs = kwargs
172 172 self.t = time.time()
173 173 self.op_type = 'external'
174 174 self.name = BaseClass.__name__
175 175 self.__doc__ = BaseClass.__doc__
176 176
177 177 if 'plot' in self.name.lower() and not self.name.endswith('_'):
178 178 self.name = '{}{}'.format(self.CODE.upper(), 'Plot')
179 179
180 180 self.start_time = time.time()
181 181 self.err_queue = args[3]
182 182 self.queue = Queue(maxsize=1)
183 183 self.myrun = BaseClass.run
184 184
185 185 def run(self):
186 186
187 187 while True:
188 188
189 189 dataOut = self.queue.get()
190 190
191 191 if not dataOut.error:
192 192 try:
193 193 BaseClass.run(self, dataOut, **self.kwargs)
194 194 except:
195 195 err = traceback.format_exc()
196 log.error(err.split('\n')[-2], self.name)
196 log.error(err, self.name)
197 197 else:
198 198 break
199 199
200 200 self.close()
201 201
202 202 def close(self):
203 203
204 204 BaseClass.close(self)
205 205 log.success('Done...(Time:{:4.2f} secs)'.format(time.time()-self.start_time), self.name)
206 206
207 207 return MPClass
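The decorator above mixes a `Process` into the wrapped class and drives it from a queue. A simplified, self-contained sketch of the same pattern (the names `mp_decorator`/`Echo` are illustrative, not the schainpy API; a `None` sentinel stands in for `dataOut.error`, and the queue is unbounded here where schainpy bounds it at 1 to throttle producers):

```python
from multiprocessing import Process, Queue

# Simplified MPDecorator pattern: a class decorator that wraps an
# operation class in a Process consuming from a queue.
def mp_decorator(base):

    class MPClass(base, Process):

        def __init__(self, **kwargs):
            base.__init__(self)
            Process.__init__(self)
            self.kwargs = kwargs
            self.queue = Queue()

        def run(self):
            while True:
                data = self.queue.get()
                if data is None:              # sentinel instead of dataOut.error
                    break
                base.run(self, data, **self.kwargs)

    return MPClass

@mp_decorator
class Echo(object):
    def __init__(self):
        self.seen = []
    def run(self, data):
        self.seen.append(data)
```

In real use the wrapped operation is started with `op.start()` so `run()` executes in a child process; calling `op.run()` directly (as the test below does) exercises the same loop in the current process.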
@@ -1,1604 +1,1604
1 1 import sys
2 2 import numpy,math
3 3 from scipy import interpolate
4 4 from schainpy.model.proc.jroproc_base import ProcessingUnit, Operation, MPDecorator
5 5 from schainpy.model.data.jrodata import Voltage
6 6 from schainpy.utils import log
7 7 from time import time
8 8
9 9
10 10
11 11 class VoltageProc(ProcessingUnit):
12 12
13 13 def __init__(self):
14 14
15 15 ProcessingUnit.__init__(self)
16 16
17 17 self.dataOut = Voltage()
18 18 self.flip = 1
19 19 self.setupReq = False
20 20
21 21 def run(self):
22 22
23 23 if self.dataIn.type == 'AMISR':
24 24 self.__updateObjFromAmisrInput()
25 25
26 26 if self.dataIn.type == 'Voltage':
27 27 self.dataOut.copy(self.dataIn)
28 28
29 29 def __updateObjFromAmisrInput(self):
30 30
31 31 self.dataOut.timeZone = self.dataIn.timeZone
32 32 self.dataOut.dstFlag = self.dataIn.dstFlag
33 33 self.dataOut.errorCount = self.dataIn.errorCount
34 34 self.dataOut.useLocalTime = self.dataIn.useLocalTime
35 35
36 36 self.dataOut.flagNoData = self.dataIn.flagNoData
37 37 self.dataOut.data = self.dataIn.data
38 38 self.dataOut.utctime = self.dataIn.utctime
39 39 self.dataOut.channelList = self.dataIn.channelList
40 40 #self.dataOut.timeInterval = self.dataIn.timeInterval
41 41 self.dataOut.heightList = self.dataIn.heightList
42 42 self.dataOut.nProfiles = self.dataIn.nProfiles
43 43
44 44 self.dataOut.nCohInt = self.dataIn.nCohInt
45 45 self.dataOut.ippSeconds = self.dataIn.ippSeconds
46 46 self.dataOut.frequency = self.dataIn.frequency
47 47
48 48 self.dataOut.azimuth = self.dataIn.azimuth
49 49 self.dataOut.zenith = self.dataIn.zenith
50 50
51 51 self.dataOut.beam.codeList = self.dataIn.beam.codeList
52 52 self.dataOut.beam.azimuthList = self.dataIn.beam.azimuthList
53 53 self.dataOut.beam.zenithList = self.dataIn.beam.zenithList
54 54
55 55
56 56 class selectChannels(Operation):
57 57
58 58 def run(self, dataOut, channelList):
59 59
60 60 channelIndexList = []
61 61 self.dataOut = dataOut
62 62 for channel in channelList:
63 63 if channel not in self.dataOut.channelList:
64 64 raise ValueError("Channel %d is not in %s" %(channel, str(self.dataOut.channelList)))
65 65
66 66 index = self.dataOut.channelList.index(channel)
67 67 channelIndexList.append(index)
68 68 self.selectChannelsByIndex(channelIndexList)
69 69 return self.dataOut
70 70
71 71 def selectChannelsByIndex(self, channelIndexList):
72 72 """
73 73 Selects a block of data by channel according to channelIndexList
74 74
75 75 Input:
76 76 channelIndexList : simple list of channel indexes to select, e.g. [2,3,7]
77 77
78 78 Affected:
79 79 self.dataOut.data
80 80 self.dataOut.channelIndexList
81 81 self.dataOut.nChannels
82 82 self.dataOut.m_ProcessingHeader.totalSpectra
83 83 self.dataOut.systemHeaderObj.numChannels
84 84 self.dataOut.m_ProcessingHeader.blockSize
85 85
86 86 Return:
87 87 None
88 88 """
89 89
90 90 for channelIndex in channelIndexList:
91 91 if channelIndex not in self.dataOut.channelIndexList:
92 92 raise ValueError("The value %d in channelIndexList is not valid" %channelIndex)
93 93
94 94 if self.dataOut.type == 'Voltage':
95 95 if self.dataOut.flagDataAsBlock:
96 96 """
97 97 If the data comes in blocks, dimensions = [nChannels, nProfiles, nHeis]
98 98 """
99 99 data = self.dataOut.data[channelIndexList,:,:]
100 100 else:
101 101 data = self.dataOut.data[channelIndexList,:]
102 102
103 103 self.dataOut.data = data
104 104 # self.dataOut.channelList = [self.dataOut.channelList[i] for i in channelIndexList]
105 105 self.dataOut.channelList = range(len(channelIndexList))
106 106
107 107 elif self.dataOut.type == 'Spectra':
108 108 data_spc = self.dataOut.data_spc[channelIndexList, :]
109 109 data_dc = self.dataOut.data_dc[channelIndexList, :]
110 110
111 111 self.dataOut.data_spc = data_spc
112 112 self.dataOut.data_dc = data_dc
113 113
114 114 # self.dataOut.channelList = [self.dataOut.channelList[i] for i in channelIndexList]
115 115 self.dataOut.channelList = range(len(channelIndexList))
116 116 self.__selectPairsByChannel(channelIndexList)
117 117
118 118 return 1
119 119
120 120 def __selectPairsByChannel(self, channelList=None):
121 121
122 122 if channelList is None:
123 123 return
124 124
125 125 pairsIndexListSelected = []
126 126 for pairIndex in self.dataOut.pairsIndexList:
127 127 # First pair
128 128 if self.dataOut.pairsList[pairIndex][0] not in channelList:
129 129 continue
130 130 # Second pair
131 131 if self.dataOut.pairsList[pairIndex][1] not in channelList:
132 132 continue
133 133
134 134 pairsIndexListSelected.append(pairIndex)
135 135
136 136 if not pairsIndexListSelected:
137 137 self.dataOut.data_cspc = None
138 138 self.dataOut.pairsList = []
139 139 return
140 140
141 141 self.dataOut.data_cspc = self.dataOut.data_cspc[pairsIndexListSelected]
142 142 self.dataOut.pairsList = [self.dataOut.pairsList[i]
143 143 for i in pairsIndexListSelected]
144 144
145 145 return
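`selectChannelsByIndex` keeps only the requested channel rows via numpy fancy indexing on the first axis. A minimal sketch for the block-data case, `[nChannels, nProfiles, nHeights]` (the array contents here are illustrative):

```python
import numpy

# Channel selection via fancy indexing on axis 0, as in
# selectChannelsByIndex for block data.
data = numpy.arange(4 * 2 * 3).reshape(4, 2, 3)   # 4 channels, 2 profiles, 3 heights
channelIndexList = [1, 3]
selected = data[channelIndexList, :, :]

print(selected.shape)  # (2, 2, 3)
```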
146 146
147 147 class selectHeights(Operation):
148 148
149 149 def run(self, dataOut, minHei=None, maxHei=None):
150 150 """
151 151 Selects a block of data based on a range of height values:
152 152 minHei <= height <= maxHei
153 153
154 154 Input:
155 155 minHei : minimum height value to consider
156 156 maxHei : maximum height value to consider
157 157
158 158 Affected:
159 159 Several values are changed indirectly through the selectHeightsByIndex method
160 160
161 161 Return:
162 162 1 if the method ran successfully, otherwise 0
163 163 """
164 164
165 165 self.dataOut = dataOut
166 166
167 167 if minHei is None:
168 168 minHei = self.dataOut.heightList[0]
169 169
170 170 if maxHei is None:
171 171 maxHei = self.dataOut.heightList[-1]
172 172
173 173 if (minHei < self.dataOut.heightList[0]):
174 174 minHei = self.dataOut.heightList[0]
175 175
176 176 if (maxHei > self.dataOut.heightList[-1]):
177 177 maxHei = self.dataOut.heightList[-1]
178 178
179 179 minIndex = 0
180 180 maxIndex = 0
181 181 heights = self.dataOut.heightList
182 182
183 183 inda = numpy.where(heights >= minHei)
184 184 indb = numpy.where(heights <= maxHei)
185 185
186 186 try:
187 187 minIndex = inda[0][0]
188 188 except:
189 189 minIndex = 0
190 190
191 191 try:
192 192 maxIndex = indb[0][-1]
193 193 except:
194 194 maxIndex = len(heights)
195 195
196 196 self.selectHeightsByIndex(minIndex, maxIndex)
197 197
198 198 return self.dataOut
199 199
200 200 def selectHeightsByIndex(self, minIndex, maxIndex):
201 201 """
202 202 Selects a block of data based on a range of height indexes:
203 203 minIndex <= index <= maxIndex
204 204
205 205 Input:
206 206 minIndex : minimum height index to consider
207 207 maxIndex : maximum height index to consider
208 208
209 209 Affected:
210 210 self.dataOut.data
211 211 self.dataOut.heightList
212 212
213 213 Return:
214 214 1 si el metodo se ejecuto con exito caso contrario devuelve 0
215 215 """
216 216
217 217 if self.dataOut.type == 'Voltage':
218 218 if (minIndex < 0) or (minIndex > maxIndex):
219 219 raise ValueError("Height index range (%d,%d) is not valid" % (minIndex, maxIndex))
220 220
221 221 if (maxIndex >= self.dataOut.nHeights):
222 222 maxIndex = self.dataOut.nHeights
223 223
224 224 #voltage
225 225 if self.dataOut.flagDataAsBlock:
                """
                If the data was read as a block, dimension = [nChannels, nProfiles, nHeis]
                """
229 229 data = self.dataOut.data[:,:, minIndex:maxIndex]
230 230 else:
231 231 data = self.dataOut.data[:, minIndex:maxIndex]
232 232
233 233 # firstHeight = self.dataOut.heightList[minIndex]
234 234
235 235 self.dataOut.data = data
236 236 self.dataOut.heightList = self.dataOut.heightList[minIndex:maxIndex]
237 237
238 238 if self.dataOut.nHeights <= 1:
239 239 raise ValueError("selectHeights: Too few heights. Current number of heights is %d" %(self.dataOut.nHeights))
240 240 elif self.dataOut.type == 'Spectra':
241 241 if (minIndex < 0) or (minIndex > maxIndex):
242 242 raise ValueError("Error selecting heights: Index range (%d,%d) is not valid" % (
243 243 minIndex, maxIndex))
244 244
245 245 if (maxIndex >= self.dataOut.nHeights):
246 246 maxIndex = self.dataOut.nHeights - 1
247 247
248 248 # Spectra
249 249 data_spc = self.dataOut.data_spc[:, :, minIndex:maxIndex + 1]
250 250
251 251 data_cspc = None
252 252 if self.dataOut.data_cspc is not None:
253 253 data_cspc = self.dataOut.data_cspc[:, :, minIndex:maxIndex + 1]
254 254
255 255 data_dc = None
256 256 if self.dataOut.data_dc is not None:
257 257 data_dc = self.dataOut.data_dc[:, minIndex:maxIndex + 1]
258 258
259 259 self.dataOut.data_spc = data_spc
260 260 self.dataOut.data_cspc = data_cspc
261 261 self.dataOut.data_dc = data_dc
262 262
263 263 self.dataOut.heightList = self.dataOut.heightList[minIndex:maxIndex + 1]
264 264
265 265 return 1
266 266
267 267
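A minimal standalone sketch (illustrative values, not part of this module) of how `selectHeights` maps a height range in km to array indexes: `numpy.where` finds the first index whose height is >= `minHei` and the last whose height is <= `maxHei`.

```python
import numpy

heights = numpy.arange(100.0, 200.0, 15.0)  # 100, 115, 130, ..., 190 km
minHei, maxHei = 120.0, 180.0

inda = numpy.where(heights >= minHei)
indb = numpy.where(heights <= maxHei)
minIndex = inda[0][0]   # first height >= 120 km -> index 2 (130.0)
maxIndex = indb[0][-1]  # last height <= 180 km  -> index 5 (175.0)
```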
268 268 class filterByHeights(Operation):
269 269
270 270 def run(self, dataOut, window):
271 271
272 272 deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
273 273
        if window is None:
275 275 window = (dataOut.radarControllerHeaderObj.txA/dataOut.radarControllerHeaderObj.nBaud) / deltaHeight
276 276
277 277 newdelta = deltaHeight * window
278 278 r = dataOut.nHeights % window
279 279 newheights = (dataOut.nHeights-r)/window
280 280
281 281 if newheights <= 1:
282 282 raise ValueError("filterByHeights: Too few heights. Current number of heights is %d and window is %d" %(dataOut.nHeights, window))
283 283
284 284 if dataOut.flagDataAsBlock:
            """
            If the data was read as a block, dimension = [nChannels, nProfiles, nHeis]
            """
288 288 buffer = dataOut.data[:, :, 0:int(dataOut.nHeights-r)]
289 289 buffer = buffer.reshape(dataOut.nChannels, dataOut.nProfiles, int(dataOut.nHeights/window), window)
290 290 buffer = numpy.sum(buffer,3)
291 291
292 292 else:
293 293 buffer = dataOut.data[:,0:int(dataOut.nHeights-r)]
294 294 buffer = buffer.reshape(dataOut.nChannels,int(dataOut.nHeights/window),int(window))
295 295 buffer = numpy.sum(buffer,2)
296 296
297 297 dataOut.data = buffer
298 298 dataOut.heightList = dataOut.heightList[0] + numpy.arange( newheights )*newdelta
299 299 dataOut.windowOfFilter = window
300 300
301 301 return dataOut
302 302
303 303
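A standalone sketch of the reduction performed by `filterByHeights`: the `r` leftover heights that do not fill a whole window are dropped, then consecutive groups of `window` heights are summed via a reshape.

```python
import numpy

nChannels, nHeights, window = 2, 10, 3
data = numpy.ones((nChannels, nHeights))

r = nHeights % window                                  # 1 leftover height, dropped
buffer = data[:, 0:nHeights - r]
buffer = buffer.reshape(nChannels, (nHeights - r) // window, window)
buffer = numpy.sum(buffer, 2)                          # each output value sums `window` heights
```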
304 304 class setH0(Operation):
305 305
306 306 def run(self, dataOut, h0, deltaHeight = None):
307 307
308 308 if not deltaHeight:
309 309 deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
310 310
311 311 nHeights = dataOut.nHeights
312 312
313 313 newHeiRange = h0 + numpy.arange(nHeights)*deltaHeight
314 314
315 315 dataOut.heightList = newHeiRange
316 316
317 317 return dataOut
318 318
319 319
320 320 class deFlip(Operation):
321 321
322 322 def run(self, dataOut, channelList = []):
323 323
        data = dataOut.data.copy()

        if not hasattr(self, 'flip'):
            self.flip = 1.0  # first call: initialize the alternating sign factor
326 326 if dataOut.flagDataAsBlock:
327 327 flip = self.flip
328 328 profileList = list(range(dataOut.nProfiles))
329 329
330 330 if not channelList:
331 331 for thisProfile in profileList:
332 332 data[:,thisProfile,:] = data[:,thisProfile,:]*flip
333 333 flip *= -1.0
334 334 else:
335 335 for thisChannel in channelList:
336 336 if thisChannel not in dataOut.channelList:
337 337 continue
338 338
339 339 for thisProfile in profileList:
340 340 data[thisChannel,thisProfile,:] = data[thisChannel,thisProfile,:]*flip
341 341 flip *= -1.0
342 342
343 343 self.flip = flip
344 344
345 345 else:
346 346 if not channelList:
347 347 data[:,:] = data[:,:]*self.flip
348 348 else:
349 349 for thisChannel in channelList:
350 350 if thisChannel not in dataOut.channelList:
351 351 continue
352 352
353 353 data[thisChannel,:] = data[thisChannel,:]*self.flip
354 354
355 355 self.flip *= -1.
356 356
357 357 dataOut.data = data
358 358
359 359 return dataOut
360 360
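A standalone sketch (illustrative array sizes) of the sign pattern that `deFlip` applies: each successive profile is multiplied by an alternating +1/-1 factor to undo a transmitter phase flip.

```python
import numpy

flip = 1.0
profiles = numpy.ones((4, 3))       # 4 profiles x 3 heights, all ones
for i in range(profiles.shape[0]):
    profiles[i] *= flip             # profile 0 -> +1, profile 1 -> -1, ...
    flip *= -1.0
```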
361 361
362 362 class setAttribute(Operation):
363 363 '''
364 364 Set an arbitrary attribute(s) to dataOut
365 365 '''
366 366
367 367 def __init__(self):
368 368
369 369 Operation.__init__(self)
370 370 self._ready = False
371 371
372 372 def run(self, dataOut, **kwargs):
373 373
374 374 for key, value in kwargs.items():
375 375 setattr(dataOut, key, value)
376 376
377 377 return dataOut
378 378
379 379
380 380 @MPDecorator
381 381 class printAttribute(Operation):
382 382 '''
383 383 Print an arbitrary attribute of dataOut
384 384 '''
385 385
386 386 def __init__(self):
387 387
388 388 Operation.__init__(self)
389 389
390 390 def run(self, dataOut, attributes):
391 391
392 392 for attr in attributes:
393 393 if hasattr(dataOut, attr):
394 394 log.log(getattr(dataOut, attr), attr)
395 395
396 396
397 397 class interpolateHeights(Operation):
398 398
399 399 def run(self, dataOut, topLim, botLim):
        # heights 69 to 72 for Julia data
        # heights 82 to 84 for meteors
402 402 if len(numpy.shape(dataOut.data))==2:
403 403 sampInterp = (dataOut.data[:,botLim-1] + dataOut.data[:,topLim+1])/2
404 404 sampInterp = numpy.transpose(numpy.tile(sampInterp,(topLim-botLim + 1,1)))
405 405 #dataOut.data[:,botLim:limSup+1] = sampInterp
406 406 dataOut.data[:,botLim:topLim+1] = sampInterp
407 407 else:
408 408 nHeights = dataOut.data.shape[2]
409 409 x = numpy.hstack((numpy.arange(botLim),numpy.arange(topLim+1,nHeights)))
410 410 y = dataOut.data[:,:,list(range(botLim))+list(range(topLim+1,nHeights))]
411 411 f = interpolate.interp1d(x, y, axis = 2)
412 412 xnew = numpy.arange(botLim,topLim+1)
413 413 ynew = f(xnew)
414 414 dataOut.data[:,:,botLim:topLim+1] = ynew
415 415
416 416 return dataOut
417 417
418 418
419 419 class CohInt(Operation):
420 420
421 421 isConfig = False
422 422 __profIndex = 0
423 423 __byTime = False
424 424 __initime = None
425 425 __lastdatatime = None
426 426 __integrationtime = None
427 427 __buffer = None
428 428 __bufferStride = []
429 429 __dataReady = False
430 430 __profIndexStride = 0
431 431 __dataToPutStride = False
432 432 n = None
433 433
434 434 def __init__(self, **kwargs):
435 435
436 436 Operation.__init__(self, **kwargs)
437 437
440 438 def setup(self, n=None, timeInterval=None, stride=None, overlapping=False, byblock=False):
        """
        Set the parameters of the integration class.

        Inputs:

            n            : number of coherent integrations
            timeInterval : integration time; ignored when "n" is given
            overlapping  : if True, integrate with an overlapping (sliding) buffer
        """
450 448
451 449 self.__initime = None
452 450 self.__lastdatatime = 0
453 451 self.__buffer = None
454 452 self.__dataReady = False
455 453 self.byblock = byblock
456 454 self.stride = stride
457 455
        if n is None and timeInterval is None:
            raise ValueError("n or timeInterval should be specified ...")

        if n is not None:
462 460 self.n = n
463 461 self.__byTime = False
464 462 else:
            self.__integrationtime = timeInterval  # in seconds
466 464 self.n = 9999
467 465 self.__byTime = True
468 466
469 467 if overlapping:
470 468 self.__withOverlapping = True
471 469 self.__buffer = None
472 470 else:
473 471 self.__withOverlapping = False
474 472 self.__buffer = 0
475 473
476 474 self.__profIndex = 0
477 475
478 476 def putData(self, data):
479 477
        """
        Add a profile to the __buffer and increase __profIndex by one.
        """
484 482
485 483 if not self.__withOverlapping:
486 484 self.__buffer += data.copy()
487 485 self.__profIndex += 1
488 486 return
489 487
490 488 #Overlapping data
491 489 nChannels, nHeis = data.shape
492 490 data = numpy.reshape(data, (1, nChannels, nHeis))
493 491
494 492 #If the buffer is empty then it takes the data value
495 493 if self.__buffer is None:
496 494 self.__buffer = data
497 495 self.__profIndex += 1
498 496 return
499 497
        # If the buffer length is lower than n then stack the data value
501 499 if self.__profIndex < self.n:
502 500 self.__buffer = numpy.vstack((self.__buffer, data))
503 501 self.__profIndex += 1
504 502 return
505 503
506 504 #If the buffer length is equal to n then replacing the last buffer value with the data value
507 505 self.__buffer = numpy.roll(self.__buffer, -1, axis=0)
508 506 self.__buffer[self.n-1] = data
509 507 self.__profIndex = self.n
510 508 return
511 509
512 510
513 511 def pushData(self):
514 512 """
515 513 Return the sum of the last profiles and the profiles used in the sum.
516 514
517 515 Affected:
518 516
519 517 self.__profileIndex
520 518
521 519 """
522 520
523 521 if not self.__withOverlapping:
524 522 data = self.__buffer
525 523 n = self.__profIndex
526 524
527 525 self.__buffer = 0
528 526 self.__profIndex = 0
529 527
530 528 return data, n
531 529
532 530 #Integration with Overlapping
533 531 data = numpy.sum(self.__buffer, axis=0)
534 532 # print data
535 533 # raise
536 534 n = self.__profIndex
537 535
538 536 return data, n
539 537
540 538 def byProfiles(self, data):
541 539
542 540 self.__dataReady = False
543 541 avgdata = None
544 542 # n = None
545 543 # print data
546 544 # raise
547 545 self.putData(data)
548 546
549 547 if self.__profIndex == self.n:
550 548 avgdata, n = self.pushData()
551 549 self.__dataReady = True
552 550
553 551 return avgdata
554 552
555 553 def byTime(self, data, datatime):
556 554
557 555 self.__dataReady = False
558 556 avgdata = None
559 557 n = None
560 558
561 559 self.putData(data)
562 560
563 561 if (datatime - self.__initime) >= self.__integrationtime:
564 562 avgdata, n = self.pushData()
565 563 self.n = n
566 564 self.__dataReady = True
567 565
568 566 return avgdata
569 567
570 568 def integrateByStride(self, data, datatime):
571 569 # print data
572 570 if self.__profIndex == 0:
573 571 self.__buffer = [[data.copy(), datatime]]
574 572 else:
575 573 self.__buffer.append([data.copy(),datatime])
576 574 self.__profIndex += 1
577 575 self.__dataReady = False
578 576
579 577 if self.__profIndex == self.n * self.stride :
580 578 self.__dataToPutStride = True
581 579 self.__profIndexStride = 0
582 580 self.__profIndex = 0
583 581 self.__bufferStride = []
584 582 for i in range(self.stride):
585 583 current = self.__buffer[i::self.stride]
586 584 data = numpy.sum([t[0] for t in current], axis=0)
587 585 avgdatatime = numpy.average([t[1] for t in current])
588 586 # print data
589 587 self.__bufferStride.append((data, avgdatatime))
590 588
591 589 if self.__dataToPutStride:
592 590 self.__dataReady = True
593 591 self.__profIndexStride += 1
594 592 if self.__profIndexStride == self.stride:
595 593 self.__dataToPutStride = False
596 594 # print self.__bufferStride[self.__profIndexStride - 1]
597 595 # raise
598 596 return self.__bufferStride[self.__profIndexStride - 1]
599 597
600 598
601 599 return None, None
602 600
603 601 def integrate(self, data, datatime=None):
604 602
        if self.__initime is None:
606 604 self.__initime = datatime
607 605
608 606 if self.__byTime:
609 607 avgdata = self.byTime(data, datatime)
610 608 else:
611 609 avgdata = self.byProfiles(data)
612 610
613 611
614 612 self.__lastdatatime = datatime
615 613
616 614 if avgdata is None:
617 615 return None, None
618 616
619 617 avgdatatime = self.__initime
620 618
621 619 deltatime = datatime - self.__lastdatatime
622 620
623 621 if not self.__withOverlapping:
624 622 self.__initime = datatime
625 623 else:
626 624 self.__initime += deltatime
627 625
628 626 return avgdata, avgdatatime
629 627
630 628 def integrateByBlock(self, dataOut):
631 629
632 630 times = int(dataOut.data.shape[1]/self.n)
        avgdata = numpy.zeros((dataOut.nChannels, times, dataOut.nHeights), dtype=complex)
634 632
635 633 id_min = 0
636 634 id_max = self.n
637 635
638 636 for i in range(times):
639 637 junk = dataOut.data[:,id_min:id_max,:]
640 638 avgdata[:,i,:] = junk.sum(axis=1)
641 639 id_min += self.n
642 640 id_max += self.n
643 641
644 642 timeInterval = dataOut.ippSeconds*self.n
645 643 avgdatatime = (times - 1) * timeInterval + dataOut.utctime
646 644 self.__dataReady = True
647 645 return avgdata, avgdatatime
648 646
649 647 def run(self, dataOut, n=None, timeInterval=None, stride=None, overlapping=False, byblock=False, **kwargs):
650 648
651 649 if not self.isConfig:
652 650 self.setup(n=n, stride=stride, timeInterval=timeInterval, overlapping=overlapping, byblock=byblock, **kwargs)
653 651 self.isConfig = True
654 652
655 653 if dataOut.flagDataAsBlock:
            """
            If the data was read as a block, dimension = [nChannels, nProfiles, nHeis]
            """
659 657 avgdata, avgdatatime = self.integrateByBlock(dataOut)
            dataOut.nProfiles = int(dataOut.nProfiles / self.n)
661 659 else:
662 660 if stride is None:
663 661 avgdata, avgdatatime = self.integrate(dataOut.data, dataOut.utctime)
664 662 else:
665 663 avgdata, avgdatatime = self.integrateByStride(dataOut.data, dataOut.utctime)
666 664
667 665
668 666 # dataOut.timeInterval *= n
669 667 dataOut.flagNoData = True
670 668
671 669 if self.__dataReady:
672 670 dataOut.data = avgdata
671 if not dataOut.flagCohInt:
672 dataOut.nCohInt *= self.n
673 dataOut.flagCohInt = True
674 674 dataOut.utctime = avgdatatime
675 675 # print avgdata, avgdatatime
676 676 # raise
677 677 # dataOut.timeInterval = dataOut.ippSeconds * dataOut.nCohInt
678 678 dataOut.flagNoData = False
679 679 return dataOut
680 680
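A standalone sketch (illustrative sizes, not this class's buffers) of the accumulation that `CohInt.putData`/`pushData` perform in the non-overlapping case: `n` complex profiles are summed in place, which raises the coherent signal by `n` while uncorrelated noise grows only by sqrt(n).

```python
import numpy

n = 4
profiles = [numpy.full((2, 8), 1 + 1j) for _ in range(n)]  # 2 channels x 8 heights

buffer = 0
for p in profiles:
    buffer = buffer + p      # putData accumulates each profile into the buffer
avgdata = buffer             # pushData returns the running sum together with n
```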
681 681 class Decoder(Operation):
682 682
683 683 isConfig = False
684 684 __profIndex = 0
685 685
686 686 code = None
687 687
688 688 nCode = None
689 689 nBaud = None
690 690
691 691 def __init__(self, **kwargs):
692 692
693 693 Operation.__init__(self, **kwargs)
694 694
695 695 self.times = None
696 696 self.osamp = None
697 697 # self.__setValues = False
698 698 self.isConfig = False
        self.setupReq = False

    def setup(self, code, osamp, dataOut):
701 701
702 702 self.__profIndex = 0
703 703
704 704 self.code = code
705 705
706 706 self.nCode = len(code)
707 707 self.nBaud = len(code[0])
708 708
        if (osamp is not None) and (osamp > 1):
710 710 self.osamp = osamp
711 711 self.code = numpy.repeat(code, repeats=self.osamp, axis=1)
712 712 self.nBaud = self.nBaud*self.osamp
713 713
714 714 self.__nChannels = dataOut.nChannels
715 715 self.__nProfiles = dataOut.nProfiles
716 716 self.__nHeis = dataOut.nHeights
717 717
718 718 if self.__nHeis < self.nBaud:
719 719 raise ValueError('Number of heights (%d) should be greater than number of bauds (%d)' %(self.__nHeis, self.nBaud))
720 720
721 721 #Frequency
        __codeBuffer = numpy.zeros((self.nCode, self.__nHeis), dtype=complex)
723 723
724 724 __codeBuffer[:,0:self.nBaud] = self.code
725 725
726 726 self.fft_code = numpy.conj(numpy.fft.fft(__codeBuffer, axis=1))
727 727
728 728 if dataOut.flagDataAsBlock:
729 729
730 730 self.ndatadec = self.__nHeis #- self.nBaud + 1
731 731
            self.datadecTime = numpy.zeros((self.__nChannels, self.__nProfiles, self.ndatadec), dtype=complex)
733 733
734 734 else:
735 735
736 736 #Time
737 737 self.ndatadec = self.__nHeis #- self.nBaud + 1
738 738
            self.datadecTime = numpy.zeros((self.__nChannels, self.ndatadec), dtype=complex)
740 740
741 741 def __convolutionInFreq(self, data):
742 742
743 743 fft_code = self.fft_code[self.__profIndex].reshape(1,-1)
744 744
745 745 fft_data = numpy.fft.fft(data, axis=1)
746 746
747 747 conv = fft_data*fft_code
748 748
749 749 data = numpy.fft.ifft(conv,axis=1)
750 750
751 751 return data
752 752
753 753 def __convolutionInFreqOpt(self, data):
754 754
755 755 raise NotImplementedError
756 756
757 757 def __convolutionInTime(self, data):
758 758
759 759 code = self.code[self.__profIndex]
760 760 for i in range(self.__nChannels):
761 761 self.datadecTime[i,:] = numpy.correlate(data[i,:], code, mode='full')[self.nBaud-1:]
762 762
763 763 return self.datadecTime
764 764
765 765 def __convolutionByBlockInTime(self, data):
766 766
767 767 repetitions = int(self.__nProfiles / self.nCode)
768 768 junk = numpy.lib.stride_tricks.as_strided(self.code, (repetitions, self.code.size), (0, self.code.itemsize))
769 769 junk = junk.flatten()
770 770 code_block = numpy.reshape(junk, (self.nCode*repetitions, self.nBaud))
771 771 profilesList = range(self.__nProfiles)
772 772
773 773 for i in range(self.__nChannels):
774 774 for j in profilesList:
775 775 self.datadecTime[i,j,:] = numpy.correlate(data[i,j,:], code_block[j,:], mode='full')[self.nBaud-1:]
776 776 return self.datadecTime
777 777
778 778 def __convolutionByBlockInFreq(self, data):
779 779
        raise NotImplementedError("Decoder by frequency for blocks is not implemented")
781 781
782 782
783 783 fft_code = self.fft_code[self.__profIndex].reshape(1,-1)
784 784
785 785 fft_data = numpy.fft.fft(data, axis=2)
786 786
787 787 conv = fft_data*fft_code
788 788
789 789 data = numpy.fft.ifft(conv,axis=2)
790 790
791 791 return data
792 792
793 793
794 794 def run(self, dataOut, code=None, nCode=None, nBaud=None, mode = 0, osamp=None, times=None):
795 795
796 796 if dataOut.flagDecodeData:
            print("This data is already decoded, decoding again ...")
798 798
799 799 if not self.isConfig:
800 800
801 801 if code is None:
802 802 if dataOut.code is None:
803 803 raise ValueError("Code could not be read from %s instance. Enter a value in Code parameter" %dataOut.type)
804 804
805 805 code = dataOut.code
806 806 else:
807 807 code = numpy.array(code).reshape(nCode,nBaud)
808 808 self.setup(code, osamp, dataOut)
809 809
810 810 self.isConfig = True
811 811
812 812 if mode == 3:
813 813 sys.stderr.write("Decoder Warning: mode=%d is not valid, using mode=0\n" %mode)
814 814
        if times is not None:
            sys.stderr.write("Decoder Warning: Argument 'times' is not used anymore\n")
817 817
818 818 if self.code is None:
819 819 print("Fail decoding: Code is not defined.")
820 820 return
821 821
822 822 self.__nProfiles = dataOut.nProfiles
823 823 datadec = None
824 824
825 825 if mode == 3:
826 826 mode = 0
827 827
828 828 if dataOut.flagDataAsBlock:
            """
            Decoding when the data has been read as a block
            """
832 832
833 833 if mode == 0:
834 834 datadec = self.__convolutionByBlockInTime(dataOut.data)
835 835 if mode == 1:
836 836 datadec = self.__convolutionByBlockInFreq(dataOut.data)
837 837 else:
            """
            Decoding when the data has been read profile by profile
            """
841 841 if mode == 0:
842 842 datadec = self.__convolutionInTime(dataOut.data)
843 843
844 844 if mode == 1:
845 845 datadec = self.__convolutionInFreq(dataOut.data)
846 846
847 847 if mode == 2:
848 848 datadec = self.__convolutionInFreqOpt(dataOut.data)
849 849
850 850 if datadec is None:
851 851 raise ValueError("Codification mode selected is not valid: mode=%d. Try selecting 0 or 1" %mode)
852 852
853 853 dataOut.code = self.code
854 854 dataOut.nCode = self.nCode
855 855 dataOut.nBaud = self.nBaud
856 856
857 857 dataOut.data = datadec
858 858
859 859 dataOut.heightList = dataOut.heightList[0:datadec.shape[-1]]
860 860
        dataOut.flagDecodeData = True  # assume the data is now decoded
862 862
863 863 if self.__profIndex == self.nCode-1:
864 864 self.__profIndex = 0
865 865 return dataOut
866 866
867 867 self.__profIndex += 1
868 868
869 869 return dataOut
        # dataOut.flagDeflipData = True  # assume the data is not flipped
871 871
872 872
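A standalone sketch (illustrative code and sizes) of the decoding-in-time idea used by `Decoder.__convolutionInTime`: each profile is cross-correlated against its transmitted code, and the `[nBaud-1:]` slice keeps the causal part so the compressed pulse lands at the echo's height index.

```python
import numpy

code = numpy.array([1.0, 1.0, -1.0])            # nBaud = 3
nHeis = 8
data = numpy.zeros(nHeis)
data[2:5] = code                                 # an echo starting at height index 2

# full cross-correlation, then keep the causal part
dec = numpy.correlate(data, code, mode='full')[len(code) - 1:]
peak = int(numpy.argmax(dec))                    # compressed pulse at index 2
```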
873 873 class ProfileConcat(Operation):
874 874
875 875 isConfig = False
876 876 buffer = None
877 877
878 878 def __init__(self, **kwargs):
879 879
880 880 Operation.__init__(self, **kwargs)
881 881 self.profileIndex = 0
882 882
883 883 def reset(self):
884 884 self.buffer = numpy.zeros_like(self.buffer)
885 885 self.start_index = 0
886 886 self.times = 1
887 887
888 888 def setup(self, data, m, n=1):
889 889 self.buffer = numpy.zeros((data.shape[0],data.shape[1]*m),dtype=type(data[0,0]))
890 890 self.nHeights = data.shape[1]#.nHeights
891 891 self.start_index = 0
892 892 self.times = 1
893 893
894 894 def concat(self, data):
895 895
896 896 self.buffer[:,self.start_index:self.nHeights*self.times] = data.copy()
897 897 self.start_index = self.start_index + self.nHeights
898 898
899 899 def run(self, dataOut, m):
900 900 dataOut.flagNoData = True
901 901
902 902 if not self.isConfig:
903 903 self.setup(dataOut.data, m, 1)
904 904 self.isConfig = True
905 905
906 906 if dataOut.flagDataAsBlock:
907 907 raise ValueError("ProfileConcat can only be used when voltage have been read profile by profile, getBlock = False")
908 908
909 909 else:
910 910 self.concat(dataOut.data)
911 911 self.times += 1
912 912 if self.times > m:
913 913 dataOut.data = self.buffer
914 914 self.reset()
915 915 dataOut.flagNoData = False
                # more header and dataOut properties should be updated here, e.g. the heights
917 917 deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
918 918 xf = dataOut.heightList[0] + dataOut.nHeights * deltaHeight * m
919 919 dataOut.heightList = numpy.arange(dataOut.heightList[0], xf, deltaHeight)
920 920 dataOut.ippSeconds *= m
921 921 return dataOut
922 922
923 923 class ProfileSelector(Operation):
924 924
925 925 profileIndex = None
    # Total number of profiles
927 927 nProfiles = None
928 928
929 929 def __init__(self, **kwargs):
930 930
931 931 Operation.__init__(self, **kwargs)
932 932 self.profileIndex = 0
933 933
934 934 def incProfileIndex(self):
935 935
936 936 self.profileIndex += 1
937 937
938 938 if self.profileIndex >= self.nProfiles:
939 939 self.profileIndex = 0
940 940
941 941 def isThisProfileInRange(self, profileIndex, minIndex, maxIndex):
942 942
943 943 if profileIndex < minIndex:
944 944 return False
945 945
946 946 if profileIndex > maxIndex:
947 947 return False
948 948
949 949 return True
950 950
951 951 def isThisProfileInList(self, profileIndex, profileList):
952 952
953 953 if profileIndex not in profileList:
954 954 return False
955 955
956 956 return True
957 957
958 958 def run(self, dataOut, profileList=None, profileRangeList=None, beam=None, byblock=False, rangeList = None, nProfiles=None):
959 959
960 960 """
961 961 ProfileSelector:
962 962
963 963 Inputs:
964 964 profileList : Index of profiles selected. Example: profileList = (0,1,2,7,8)
965 965
966 966 profileRangeList : Minimum and maximum profile indexes. Example: profileRangeList = (4, 30)
967 967
968 968 rangeList : List of profile ranges. Example: rangeList = ((4, 30), (32, 64), (128, 256))
969 969
970 970 """
971 971
972 972 if rangeList is not None:
973 973 if type(rangeList[0]) not in (tuple, list):
974 974 rangeList = [rangeList]
975 975
976 976 dataOut.flagNoData = True
977 977
978 978 if dataOut.flagDataAsBlock:
979 979 """
980 980 data dimension = [nChannels, nProfiles, nHeis]
981 981 """
            if profileList is not None:
983 983 dataOut.data = dataOut.data[:,profileList,:]
984 984
            if profileRangeList is not None:
986 986 minIndex = profileRangeList[0]
987 987 maxIndex = profileRangeList[1]
988 988 profileList = list(range(minIndex, maxIndex+1))
989 989
990 990 dataOut.data = dataOut.data[:,minIndex:maxIndex+1,:]
991 991
            if rangeList is not None:
993 993
994 994 profileList = []
995 995
996 996 for thisRange in rangeList:
997 997 minIndex = thisRange[0]
998 998 maxIndex = thisRange[1]
999 999
1000 1000 profileList.extend(list(range(minIndex, maxIndex+1)))
1001 1001
1002 1002 dataOut.data = dataOut.data[:,profileList,:]
1003 1003
1004 1004 dataOut.nProfiles = len(profileList)
1005 1005 dataOut.profileIndex = dataOut.nProfiles - 1
1006 1006 dataOut.flagNoData = False
1007 1007
1008 1008 return dataOut
1009 1009
1010 1010 """
1011 1011 data dimension = [nChannels, nHeis]
1012 1012 """
1013 1013
        if profileList is not None:
1015 1015
1016 1016 if self.isThisProfileInList(dataOut.profileIndex, profileList):
1017 1017
1018 1018 self.nProfiles = len(profileList)
1019 1019 dataOut.nProfiles = self.nProfiles
1020 1020 dataOut.profileIndex = self.profileIndex
1021 1021 dataOut.flagNoData = False
1022 1022
1023 1023 self.incProfileIndex()
1024 1024 return dataOut
1025 1025
        if profileRangeList is not None:
1027 1027
1028 1028 minIndex = profileRangeList[0]
1029 1029 maxIndex = profileRangeList[1]
1030 1030
1031 1031 if self.isThisProfileInRange(dataOut.profileIndex, minIndex, maxIndex):
1032 1032
1033 1033 self.nProfiles = maxIndex - minIndex + 1
1034 1034 dataOut.nProfiles = self.nProfiles
1035 1035 dataOut.profileIndex = self.profileIndex
1036 1036 dataOut.flagNoData = False
1037 1037
1038 1038 self.incProfileIndex()
1039 1039 return dataOut
1040 1040
        if rangeList is not None:
1042 1042
1043 1043 nProfiles = 0
1044 1044
1045 1045 for thisRange in rangeList:
1046 1046 minIndex = thisRange[0]
1047 1047 maxIndex = thisRange[1]
1048 1048
1049 1049 nProfiles += maxIndex - minIndex + 1
1050 1050
1051 1051 for thisRange in rangeList:
1052 1052
1053 1053 minIndex = thisRange[0]
1054 1054 maxIndex = thisRange[1]
1055 1055
1056 1056 if self.isThisProfileInRange(dataOut.profileIndex, minIndex, maxIndex):
1057 1057
1058 1058 self.nProfiles = nProfiles
1059 1059 dataOut.nProfiles = self.nProfiles
1060 1060 dataOut.profileIndex = self.profileIndex
1061 1061 dataOut.flagNoData = False
1062 1062
1063 1063 self.incProfileIndex()
1064 1064
1065 1065 break
1066 1066
1067 1067 return dataOut
1068 1068
1069 1069
        if beam is not None:  # beam is only for AMISR data
1071 1071 if self.isThisProfileInList(dataOut.profileIndex, dataOut.beamRangeDict[beam]):
1072 1072 dataOut.flagNoData = False
1073 1073 dataOut.profileIndex = self.profileIndex
1074 1074
1075 1075 self.incProfileIndex()
1076 1076
1077 1077 return dataOut
1078 1078
1079 1079 raise ValueError("ProfileSelector needs profileList, profileRangeList or rangeList parameter")
1080 1080
1081 1081
1082 1082 class Reshaper(Operation):
1083 1083
1084 1084 def __init__(self, **kwargs):
1085 1085
1086 1086 Operation.__init__(self, **kwargs)
1087 1087
1088 1088 self.__buffer = None
1089 1089 self.__nitems = 0
1090 1090
1091 1091 def __appendProfile(self, dataOut, nTxs):
1092 1092
1093 1093 if self.__buffer is None:
1094 1094 shape = (dataOut.nChannels, int(dataOut.nHeights/nTxs) )
1095 1095 self.__buffer = numpy.empty(shape, dtype = dataOut.data.dtype)
1096 1096
1097 1097 ini = dataOut.nHeights * self.__nitems
1098 1098 end = ini + dataOut.nHeights
1099 1099
1100 1100 self.__buffer[:, ini:end] = dataOut.data
1101 1101
1102 1102 self.__nitems += 1
1103 1103
1104 1104 return int(self.__nitems*nTxs)
1105 1105
1106 1106 def __getBuffer(self):
1107 1107
1108 1108 if self.__nitems == int(1./self.__nTxs):
1109 1109
1110 1110 self.__nitems = 0
1111 1111
1112 1112 return self.__buffer.copy()
1113 1113
1114 1114 return None
1115 1115
1116 1116 def __checkInputs(self, dataOut, shape, nTxs):
1117 1117
1118 1118 if shape is None and nTxs is None:
            raise ValueError("Reshaper: shape or nTxs factor should be defined")
1120 1120
1121 1121 if nTxs:
1122 1122 if nTxs < 0:
1123 1123 raise ValueError("nTxs should be greater than 0")
1124 1124
1125 1125 if nTxs < 1 and dataOut.nProfiles % (1./nTxs) != 0:
                raise ValueError("nProfiles = %d is not divisible by (1./nTxs) = %f" %(dataOut.nProfiles, (1./nTxs)))
1127 1127
1128 1128 shape = [dataOut.nChannels, dataOut.nProfiles*nTxs, dataOut.nHeights/nTxs]
1129 1129
1130 1130 return shape, nTxs
1131 1131
1132 1132 if len(shape) != 2 and len(shape) != 3:
1133 1133 raise ValueError("shape dimension should be equal to 2 or 3. shape = (nProfiles, nHeis) or (nChannels, nProfiles, nHeis). Actually shape = (%d, %d, %d)" %(dataOut.nChannels, dataOut.nProfiles, dataOut.nHeights))
1134 1134
1135 1135 if len(shape) == 2:
1136 1136 shape_tuple = [dataOut.nChannels]
1137 1137 shape_tuple.extend(shape)
1138 1138 else:
1139 1139 shape_tuple = list(shape)
1140 1140
1141 1141 nTxs = 1.0*shape_tuple[1]/dataOut.nProfiles
1142 1142
1143 1143 return shape_tuple, nTxs
1144 1144
1145 1145 def run(self, dataOut, shape=None, nTxs=None):
1146 1146
1147 1147 shape_tuple, self.__nTxs = self.__checkInputs(dataOut, shape, nTxs)
1148 1148
1149 1149 dataOut.flagNoData = True
1150 1150 profileIndex = None
1151 1151
1152 1152 if dataOut.flagDataAsBlock:
1153 1153
1154 1154 dataOut.data = numpy.reshape(dataOut.data, shape_tuple)
1155 1155 dataOut.flagNoData = False
1156 1156
1157 1157 profileIndex = int(dataOut.nProfiles*self.__nTxs) - 1
1158 1158
1159 1159 else:
1160 1160
1161 1161 if self.__nTxs < 1:
1162 1162
1163 1163 self.__appendProfile(dataOut, self.__nTxs)
1164 1164 new_data = self.__getBuffer()
1165 1165
1166 1166 if new_data is not None:
1167 1167 dataOut.data = new_data
1168 1168 dataOut.flagNoData = False
1169 1169
1170 1170 profileIndex = dataOut.profileIndex*nTxs
1171 1171
1172 1172 else:
1173 1173 raise ValueError("nTxs should be greater than 0 and lower than 1, or use VoltageReader(..., getblock=True)")
1174 1174
1175 1175 deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
1176 1176
1177 1177 dataOut.heightList = numpy.arange(dataOut.nHeights/self.__nTxs) * deltaHeight + dataOut.heightList[0]
1178 1178
1179 1179 dataOut.nProfiles = int(dataOut.nProfiles*self.__nTxs)
1180 1180
1181 1181 dataOut.profileIndex = profileIndex
1182 1182
1183 1183 dataOut.ippSeconds /= self.__nTxs
1184 1184
1185 1185 return dataOut
1186 1186
1187 1187 class SplitProfiles(Operation):
1188 1188
1189 1189 def __init__(self, **kwargs):
1190 1190
1191 1191 Operation.__init__(self, **kwargs)
1192 1192
1193 1193 def run(self, dataOut, n):
1194 1194
1195 1195 dataOut.flagNoData = True
1196 1196 profileIndex = None
1197 1197
1198 1198 if dataOut.flagDataAsBlock:
1199 1199
1200 1200 #nchannels, nprofiles, nsamples
1201 1201 shape = dataOut.data.shape
1202 1202
1203 1203 if shape[2] % n != 0:
                raise ValueError("Could not split the data: the number of samples %d must be a multiple of n=%d" %(shape[2], n))
1205 1205
1206 1206 new_shape = shape[0], shape[1]*n, int(shape[2]/n)
1207 1207
1208 1208 dataOut.data = numpy.reshape(dataOut.data, new_shape)
1209 1209 dataOut.flagNoData = False
1210 1210
1211 1211 profileIndex = int(dataOut.nProfiles/n) - 1
1212 1212
1213 1213 else:
1214 1214
            raise ValueError("Could not split the data when it is read profile by profile. Use VoltageReader(..., getblock=True)")
1216 1216
1217 1217 deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
1218 1218
1219 1219 dataOut.heightList = numpy.arange(dataOut.nHeights/n) * deltaHeight + dataOut.heightList[0]
1220 1220
1221 1221 dataOut.nProfiles = int(dataOut.nProfiles*n)
1222 1222
1223 1223 dataOut.profileIndex = profileIndex
1224 1224
1225 1225 dataOut.ippSeconds /= n
1226 1226
1227 1227 return dataOut
1228 1228
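A standalone sketch (illustrative sizes) of the block reshape behind `SplitProfiles`: each profile of `nSamples` heights becomes `n` profiles of `nSamples/n` heights, and `CombineProfiles` is the inverse reshape.

```python
import numpy

nChannels, nProfiles, nSamples, n = 1, 2, 6, 3
data = numpy.arange(nChannels * nProfiles * nSamples).reshape(nChannels, nProfiles, nSamples)

# SplitProfiles: (1, 2, 6) -> (1, 6, 2)
split = data.reshape(nChannels, nProfiles * n, nSamples // n)

# CombineProfiles: (1, 6, 2) -> (1, 2, 6), recovering the original block
restored = split.reshape(nChannels, nProfiles, nSamples)
```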
1229 1229 class CombineProfiles(Operation):
1230 1230 def __init__(self, **kwargs):
1231 1231
1232 1232 Operation.__init__(self, **kwargs)
1233 1233
1234 1234 self.__remData = None
1235 1235 self.__profileIndex = 0
1236 1236
1237 1237 def run(self, dataOut, n):
1238 1238
1239 1239 dataOut.flagNoData = True
1240 1240 profileIndex = None
1241 1241
1242 1242 if dataOut.flagDataAsBlock:
1243 1243
1244 1244 #nchannels, nprofiles, nsamples
1245 1245 shape = dataOut.data.shape
            new_shape = shape[0], shape[1]//n, shape[2]*n
1247 1247
1248 1248 if shape[1] % n != 0:
1249 1249 raise ValueError("Could not split the data, n=%d has to be multiple of %d" %(n, shape[1]))
1250 1250
1251 1251 dataOut.data = numpy.reshape(dataOut.data, new_shape)
1252 1252 dataOut.flagNoData = False
1253 1253
1254 1254 profileIndex = int(dataOut.nProfiles*n) - 1
1255 1255
1256 1256 else:
1257 1257
1258 1258 #nchannels, nsamples
1259 1259 if self.__remData is None:
1260 1260 newData = dataOut.data
1261 1261 else:
1262 1262 newData = numpy.concatenate((self.__remData, dataOut.data), axis=1)
1263 1263
1264 1264 self.__profileIndex += 1
1265 1265
1266 1266 if self.__profileIndex < n:
1267 1267 self.__remData = newData
1268 1268 #continue
1269 1269 return
1270 1270
1271 1271 self.__profileIndex = 0
1272 1272 self.__remData = None
1273 1273
1274 1274 dataOut.data = newData
1275 1275 dataOut.flagNoData = False
1276 1276
1277 1277 profileIndex = dataOut.profileIndex/n
1278 1278
1279 1279
1280 1280 deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
1281 1281
1282 1282 dataOut.heightList = numpy.arange(dataOut.nHeights*n) * deltaHeight + dataOut.heightList[0]
1283 1283
1284 1284 dataOut.nProfiles = int(dataOut.nProfiles/n)
1285 1285
1286 1286 dataOut.profileIndex = profileIndex
1287 1287
1288 1288 dataOut.ippSeconds *= n
1289 1289
1290 1290 return dataOut
1291 1291
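`CombineProfiles` performs the inverse reshape in block mode: `n` consecutive profiles are merged along the sample axis. A minimal numpy sketch (with illustrative shapes) of that operation:

```python
import numpy as np

# 2 channels x 6 profiles x 4 samples, block mode
data = np.arange(2 * 6 * 4).reshape(2, 6, 4)

# Combining with n=2 halves the profile count and concatenates the
# samples of n consecutive profiles, as in CombineProfiles.run
n = 2
combined = data.reshape(2, 6 // n, 4 * n)

assert combined.shape == (2, 3, 8)
# Each combined profile is the concatenation of n original profiles
assert (combined[0, 0] == np.concatenate((data[0, 0], data[0, 1]))).all()
```

Note that in Python 3 `shape[1]/n` is a float, so the shape passed to `numpy.reshape` must be cast to `int` (as done above with `//`); this is the bug the block-mode branch has to avoid.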
class PulsePairVoltage(Operation):
    '''
    Function PulsePair(Signal Power, Velocity)
    The real component of Lag[0] provides intensity information
    The phase of Lag[1] provides velocity information

    Configuration Parameters:
        nPRF = Number of Several PRF
        theta = Azimuth angle boundaries (degrees)

    Input:
        self.dataOut
        lag[N]
    Affected:
        self.dataOut.spc
    '''
    isConfig = False
    __profIndex = 0
    __initime = None
    __lastdatatime = None
    __buffer = None
    noise = None
    __dataReady = False
    n = None
    __nch = 0
    __nHeis = 0
    removeDC = False
    ipp = None
    lambda_ = 0

    def __init__(self, **kwargs):
        Operation.__init__(self, **kwargs)

    def setup(self, dataOut, n=None, removeDC=False):
        '''
        n = number of input PRFs
        '''
        self.__initime = None
        self.__lastdatatime = 0
        self.__dataReady = False
        self.__buffer = 0
        self.__profIndex = 0
        self.noise = None
        self.__nch = dataOut.nChannels
        self.__nHeis = dataOut.nHeights
        self.removeDC = removeDC
        self.lambda_ = 3.0e8/(9345.0e6)
        self.ippSec = dataOut.ippSeconds
        self.nCohInt = dataOut.nCohInt

        if n is None:
            raise ValueError("n should be specified.")

        if n < 2:
            raise ValueError("n should be at least 2")

        self.n = n
        self.__nProf = n

        self.__buffer = numpy.zeros((dataOut.nChannels,
                                     n,
                                     dataOut.nHeights),
                                    dtype='complex')

    def putData(self, data):
        '''
        Add a profile to the __buffer and increment __profIndex by one
        '''
        self.__buffer[:, self.__profIndex, :] = data
        self.__profIndex += 1
        return

    def pushData(self, dataOut):
        '''
        Return the PULSEPAIR and the profiles used in the operation
        Affected : self.__profileIndex
        '''
        if self.removeDC:
            mean = numpy.mean(self.__buffer, 1)
            tmp = mean.reshape(self.__nch, 1, self.__nHeis)
            dc = numpy.tile(tmp, [1, self.__nProf, 1])
            self.__buffer = self.__buffer - dc

        lag_0 = numpy.sum(self.__buffer*numpy.conj(self.__buffer), 1)
        data_intensity = lag_0/(self.n*self.nCohInt)

        pair1 = self.__buffer[:, :-1, :]*numpy.conjugate(self.__buffer[:, 1:, :])
        lag_1 = numpy.sum(pair1, 1)
        data_velocity = (-1.0*self.lambda_/(4*math.pi*self.ippSec))*numpy.angle(lag_1)

        self.noise = numpy.zeros([self.__nch, self.__nHeis])
        for i in range(self.__nch):
            self.noise[i] = dataOut.getNoise(channel=i)

        lag_0 = lag_0.real/self.n
        lag_1 = lag_1/(self.n-1)
        R1 = numpy.abs(lag_1)
        S = (lag_0-self.noise)

        data_snrPP = S/self.noise
        data_snrPP = numpy.where(data_snrPP < 0, 1, data_snrPP)

        L = S/R1
        L = numpy.where(L < 0, 1, L)
        L = numpy.log(L)

        tmp = numpy.sqrt(numpy.absolute(L))

        data_specwidth = (self.lambda_/(2*math.sqrt(2)*math.pi*self.ippSec))*tmp*numpy.sign(L)
        n = self.__profIndex

        self.__buffer = numpy.zeros((self.__nch, self.__nProf, self.__nHeis), dtype='complex')
        self.__profIndex = 0
        return data_intensity, data_velocity, data_snrPP, data_specwidth, n

    def pulsePairbyProfiles(self, dataOut):

        self.__dataReady = False
        data_intensity = None
        data_velocity = None
        data_specwidth = None
        data_snrPP = None
        self.putData(data=dataOut.data)
        if self.__profIndex == self.n:
            data_intensity, data_velocity, data_snrPP, data_specwidth, n = self.pushData(dataOut=dataOut)
            self.__dataReady = True

        return data_intensity, data_velocity, data_snrPP, data_specwidth

    def pulsePairOp(self, dataOut, datatime=None):

        if self.__initime is None:
            self.__initime = datatime
        data_intensity, data_velocity, data_snrPP, data_specwidth = self.pulsePairbyProfiles(dataOut)
        self.__lastdatatime = datatime

        if data_intensity is None:
            return None, None, None, None, None

        avgdatatime = self.__initime
        deltatime = datatime - self.__lastdatatime
        self.__initime = datatime

        return data_intensity, data_velocity, data_snrPP, data_specwidth, avgdatatime

    def run(self, dataOut, n=None, removeDC=False, overlapping=False, **kwargs):

        if not self.isConfig:
            self.setup(dataOut=dataOut, n=n, removeDC=removeDC, **kwargs)
            self.isConfig = True
        data_intensity, data_velocity, data_snrPP, data_specwidth, avgdatatime = self.pulsePairOp(dataOut, dataOut.utctime)
        dataOut.flagNoData = True

        if self.__dataReady:
            dataOut.nCohInt *= self.n
            dataOut.data_intensity = data_intensity     # intensity
            dataOut.data_velocity = data_velocity       # velocity
            dataOut.data_snrPP = data_snrPP             # signal-to-noise ratio
            dataOut.data_specwidth = data_specwidth     # spectral width
            dataOut.PRFbyAngle = self.n                 # number of PRFs per rotated angle, equivalent to one time step
            dataOut.utctime = avgdatatime
            dataOut.flagNoData = False
        return dataOut

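The estimators in `pushData` follow the classic pulse-pair technique: total power from lag 0 and radial velocity from the phase of the lag-1 autocorrelation. A minimal sketch on a synthetic Doppler phase ramp, using the same 9345 MHz wavelength as `setup`; the IPP, train length, velocity, and phase sign convention are illustrative assumptions:

```python
import numpy as np

# Illustrative parameters (only the carrier frequency comes from setup)
ipp = 1e-3                        # inter-pulse period [s]
lambda_ = 3.0e8 / 9345.0e6        # wavelength of a 9345 MHz radar [m]
v_true = 5.0                      # radial velocity [m/s]

# Synthetic single-height voltage series with a pure Doppler phase ramp
t = np.arange(64) * ipp
x = np.exp(1j * 4 * np.pi * v_true * t / lambda_)

lag_0 = np.sum(x * np.conj(x)).real          # total power over the train
lag_1 = np.sum(x[:-1] * np.conj(x[1:]))      # pulse-pair (lag-1) correlation
v_est = (-lambda_ / (4 * np.pi * ipp)) * np.angle(lag_1)

assert abs(v_est - v_true) < 1e-6
```

Because `numpy.angle` wraps to (-pi, pi], the estimate is only unambiguous up to the Nyquist velocity lambda_/(4*ipp) (about 8 m/s with these numbers), which is why the IPP bounds the measurable velocity range.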

# import collections
# from scipy.stats import mode
#
# class Synchronize(Operation):
#
#     isConfig = False
#     __profIndex = 0
#
#     def __init__(self, **kwargs):
#
#         Operation.__init__(self, **kwargs)
#         # self.isConfig = False
#         self.__powBuffer = None
#         self.__startIndex = 0
#         self.__pulseFound = False
#
#     def __findTxPulse(self, dataOut, channel=0, pulse_with=None):
#
#         # Read data
#
#         powerdB = dataOut.getPower(channel=channel)
#         noisedB = dataOut.getNoise(channel=channel)[0]
#
#         self.__powBuffer.extend(powerdB.flatten())
#
#         dataArray = numpy.array(self.__powBuffer)
#
#         filteredPower = numpy.correlate(dataArray, dataArray[0:self.__nSamples], "same")
#
#         maxValue = numpy.nanmax(filteredPower)
#
#         if maxValue < noisedB + 10:
#             # No transmission pulse was found
#             return None
#
#         maxValuesIndex = numpy.where(filteredPower > maxValue - 0.1*abs(maxValue))[0]
#
#         if len(maxValuesIndex) < 2:
#             # Only a single one-baud transmission pulse was found, waiting for the next TX
#             return None
#
#         phasedMaxValuesIndex = maxValuesIndex - self.__nSamples
#
#         # Select only values spaced nSamples apart
#         pulseIndex = numpy.intersect1d(maxValuesIndex, phasedMaxValuesIndex)
#
#         if len(pulseIndex) < 2:
#             # Only one transmission pulse wider than 1 was found
#             return None
#
#         spacing = pulseIndex[1:] - pulseIndex[:-1]
#
#         # Remove signals spaced fewer than 10 units or samples apart
#         # (IPPs shorter than 10 units should not exist)
#
#         realIndex = numpy.where(spacing > 10)[0]
#
#         if len(realIndex) < 2:
#             # Only one transmission pulse wider than 1 was found
#             return None
#
#         # Remove wide pulses (keeps only the difference between IPPs)
#         realPulseIndex = pulseIndex[realIndex]
#
#         period = mode(realPulseIndex[1:] - realPulseIndex[:-1])[0][0]
#
#         print("IPP = %d samples" % period)
#
#         self.__newNSamples = dataOut.nHeights  # int(period)
#         self.__startIndex = int(realPulseIndex[0])
#
#         return 1
#
#
#     def setup(self, nSamples, nChannels, buffer_size=4):
#
#         self.__powBuffer = collections.deque(numpy.zeros(buffer_size*nSamples, dtype=numpy.float),
#                                              maxlen=buffer_size*nSamples)
#
#         bufferList = []
#
#         for i in range(nChannels):
#             bufferByChannel = collections.deque(numpy.zeros(buffer_size*nSamples, dtype=numpy.complex) + numpy.NAN,
#                                                 maxlen=buffer_size*nSamples)
#
#             bufferList.append(bufferByChannel)
#
#         self.__nSamples = nSamples
#         self.__nChannels = nChannels
#         self.__bufferList = bufferList
#
#     def run(self, dataOut, channel=0):
#
#         if not self.isConfig:
#             nSamples = dataOut.nHeights
#             nChannels = dataOut.nChannels
#             self.setup(nSamples, nChannels)
#             self.isConfig = True
#
#         # Append new data to internal buffer
#         for thisChannel in range(self.__nChannels):
#             bufferByChannel = self.__bufferList[thisChannel]
#             bufferByChannel.extend(dataOut.data[thisChannel])
#
#         if self.__pulseFound:
#             self.__startIndex -= self.__nSamples
#
#         # Finding Tx Pulse
#         if not self.__pulseFound:
#             indexFound = self.__findTxPulse(dataOut, channel)
#
#             if indexFound is None:
#                 dataOut.flagNoData = True
#                 return
#
#             self.__arrayBuffer = numpy.zeros((self.__nChannels, self.__newNSamples), dtype=numpy.complex)
#             self.__pulseFound = True
#             self.__startIndex = indexFound
#
#         # If pulse was found ...
#         for thisChannel in range(self.__nChannels):
#             bufferByChannel = self.__bufferList[thisChannel]
#             # print(self.__startIndex)
#             x = numpy.array(bufferByChannel)
#             self.__arrayBuffer[thisChannel] = x[self.__startIndex:self.__startIndex+self.__newNSamples]
#
#         deltaHeight = dataOut.heightList[1] - dataOut.heightList[0]
#         dataOut.heightList = numpy.arange(self.__newNSamples)*deltaHeight
#         # dataOut.ippSeconds = (self.__newNSamples / deltaHeight)/1e6
#
#         dataOut.data = self.__arrayBuffer
#
#         self.__startIndex += self.__newNSamples
#
#         return