commit 8da857771f881b86aef7b83c8c9d3b83ea709c06
author    Paul Brossier <piem@debian.org>  2020-01-08 13:39:53 +0100
committer Paul Brossier <piem@debian.org>  2020-01-08 13:39:53 +0100
tree      31dfc0e7d1de038aa138951604be023de3d13f33
parent    8437981c29ba966827fcb77ca1cfade799d7295a
parent    790568351a84844cfcb4a831e755282539fbb3c1

Record aubio (0.4.9-4) in archive suite sid

363 files changed, 20692 insertions, 10679 deletions
@@ -1,4 +1,592 @@
-2015-08-16 Paul Brossier <piem@aubio.org>
+2018-12-19 Paul Brossier <piem@aubio.org>
+
+  [ Overview ]
+
+  * VERSION: bump to 0.4.9
+  * library: improve stability, fixing potential crashes and memory leaks on invalid arguments; improve library messages and reporting of system errors
+  * tests/: major clean-up, check return codes, increase code coverage
+  * python/tests/: switch to pytest (closes gh-163), check emitted warnings
+  * python/: add pages to manual with brief descriptions of classes
+
+  [ Fixes ]
+
+  * security: improve arguments validation in new_aubio_filterbank (prevent possible null-pointer dereference on invalid n_filters, CVE-2018-19801), new_aubio_tempo (prevent possible buffer overflow, CVE-2018-19800), and new_aubio_onset (prevent null-pointer dereference, CVE-2018-19802). Thanks to Guoxiang Niu (@niugx), from the EaglEye Team, for reporting these issues.
+  * tempo: fix delay_ms methods
+  * filterbank: fix aubio_filterbank_get_power (thanks to @romanbsd who also noticed this issue)
+  * dct: creation fails on negative sizes or invalid accelerate radix, fix typo in error and warning messages, prevent possible memory leak
+  * pitch: prevent null pointer dereference in yinfast, comment out unused functions in mcomb and yin, prevent possible leak in specacf
+  * mfcc: always use dct module, strengthen input validation, change get_{scale,power} to return smpl_t
+  * specdesc: improve error message
+  * notes: prevent null pointer dereference
+  * hist: add validation for size argument, prevent possible leak
+  * awhitening: use shortest length available (closes gh-216)
+  * io: add macros to display system errors, add helpers to validate input arguments of source and sink methods, always clean up after failure
+  * source: validate input sizes to prevent invalid reads
+  * apple_audio: use native format conversions in source and sink, prevent possible apple_audio crash on empty string, get_duration returns 0 on failure
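The CVE fixes above all follow the same defensive pattern: validate constructor arguments before allocating, and fail cleanly instead of dereferencing a null pointer or overflowing a buffer. A minimal Python sketch of that pattern (the function and field names are hypothetical, not the actual aubio C API):

```python
def new_filterbank(n_filters, win_s):
    """Toy constructor: return a filterbank-like object, or None on invalid input.

    Mirrors the C-style convention where new_* returns NULL on failure,
    so callers can check the result before using it.
    """
    # reject sizes that would lead to a zero-size allocation or out-of-bound writes
    if n_filters <= 0:
        print("filterbank: n_filters should be > 0, got %d" % n_filters)
        return None
    if win_s <= 0:
        print("filterbank: win_s should be > 0, got %d" % win_s)
        return None
    # one row of spectral coefficients per filter
    return {
        "n_filters": n_filters,
        "coeffs": [[0.0] * (win_s // 2 + 1) for _ in range(n_filters)],
    }
```

The point is that the invalid-argument path returns early, so no partially initialised object ever reaches the caller.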
+  * ffmpeg/avcodec: prevent deprecation warnings, read after close, and skipped samples warnings, improve warning messages, only show a warning when swr_convert failed, prevent possible memory leak when closing swr context
+  * wavwrite: copy to all channels if needed, check fseek and fwrite return values, call fflush in open to return failure on a full file-system
+  * source_sndfile: fix reading sizes when resampling, set error message when reading after close
+  * aubio_priv.h: include blas first (see gh-225), add STRERROR macros
+
+  [ Python ]
+
+  * documentation: add pages to manual, add minimal docstrings for fft, digital_filter, and generated objects, improve specdesc documentation
+  * filterbank: add get_norm/power documentation
+  * source: take a copy of the last frame before resizing it, raise an exception when read failed, fix compilation warning
+  * fixes: remove unneeded check in convert with PyFloat_FromDouble, check if sink and digital_filter were created before deleting
+
+  [ Tests ]
+
+  * python/tests/: switch to pytest (slightly slower than nose2 but better at capturing warnings and parametrization), improve coding style and coverage. Tests should now be run with `pytest`.
+  * tests/: Each test program in C must now return 0, otherwise the test will fail. Examples have been modified to run themselves on a test audio file, but can still be run with arguments. Tests for `source` and `sink` have been factorised, and some code cleaning. A python script is used to create a test sound file. Tested on linux, macos, and windows, improvements to test-mfcc (closes gh-219).
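The two pytest features named above, warning capture and parametrization, look like this in practice. A hypothetical test in that style (the toy `new_window` constructor is invented for illustration, it is not an aubio test):

```python
import warnings
import pytest

def new_window(size):
    """Toy stand-in for a constructor that warns and clamps invalid sizes."""
    if size < 2:
        warnings.warn("window size %d too small, using 2" % size)
        size = 2
    return [0.0] * size

# one test function expands into one test case per listed size
@pytest.mark.parametrize("size", [0, 1])
def test_small_sizes_warn(size):
    # pytest.warns fails the test if no UserWarning is emitted
    with pytest.warns(UserWarning):
        assert len(new_window(size)) == 2

def test_valid_size_is_kept():
    assert len(new_window(512)) == 512
```

With nose2 the same coverage needed generator-based tests and manual `warnings.catch_warnings` bookkeeping, which is the trade-off the entry describes.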
+
+  [ Build system ]
+
+  * waf: upgrade to 2.0.14, check the return code of each test program, update rules to build manual and api documentation into build/, check for errno.h
+  * osx: use -Os in scripts/build_apple_frameworks
+  * Makefile: improve coverage reports
+  * appveyor, travis, circleci: switch to pytest, set one travis config to use sndfile only
+  * travis: add py3.6, drop py3.4, use py3.5 to test debug mode
+  * azure: add basic configuration
+
+2018-11-21 Paul Brossier <piem@aubio.org>
+
+  [ Overview ]
+
+  * VERSION: bump to 0.4.8
+  * notes: new option release_drop (gh-203)
+  * spectral: new parameters added to filterbank and mfcc (gh-206)
+  * python: start documenting module (gh-73, debian #480018), improve build for win-amd64 (gh-154, gh-199, gh-208)
+  * fixes: prevent crash when using fft sizes unsupported by vDSP (gh-207), prevent saturation when down-mixing a multi-channel source (avcodec/ffmpeg)
+
+  [ Fixes ]
+
+  * avcodec: prevent saturation when down-mixing a multi-channel source, emit a warning if compiling against avutil < 53 (gh-137), wrap long lines
+  * examples/: avoid hiding global and unreachable code
+  * fft: limit to r*2*n sizes, with r in [1, 3, 5, 15] (vDSP only) (gh-207)
+  * fft: fix reconstruction for odd sizes (fftw only)
+  * pvoc: add missing implementations for aubio_pvoc_get_hop/win
+  * mathutils: increase ln(2) precision in freqtomidi/miditofreq
+  * wavetable: stop sets playing to 0, add dummy implementation for _load
+
+  [ New features ]
+
+  * src/musicutils.h: new aubio_meltohz, aubio_hztomel, with _htk versions
+  * src/spectral/filterbank.h: new set_mel_coeffs, set_mel_coeffs_htk, set_power, and set_norm methods, improved set_triangle_bands
+  * src/spectral/mfcc.h: new set_scale, set_power, set_norm, set_mel_coeffs, set_mel_coeffs_htk, set_mel_coeffs_slaney
+  * src/mathutils.h: new fvec_mul
+  * src/notes: new option release_drop to prevent missing note-offs (gh-203)
+
+  [ Python module ]
+
+  * fix: rounding to nearest integer in midi2note and freq2note
+  * general: supports code generation of setters with none or multiple parameters
+  * documentation: add docstrings to fvec, cvec, source, sink, pvoc, frequency conversion and level detection routines (gh-73, debian #480018)
+  * slicing: improve and document slice_source_at_stamps
+  * module: new note2freq function, recover error log when raising exceptions on failed set_ methods, prevent cyclic import, coding style improvements
+  * demos: improve coding style, fix bpm_extract arguments
+  * MANIFEST.in: exclude *.pyc, improve patterns
+
+  [ Documentation ]
+
+  * doc/: use sphinx autodoc to load docstrings from aubio module, reorganize python module documentation, add a note about double precision, use https when possible
+  * src/spectral/: update Auditory Toolbox url, update copyright year
+
+  [ Tools ]
+
+  * aubionotes: add --release-drop option
+  * aubio: add --release-drop and --silence options to `aubio notes`, workaround for -V to really show version (py2)
+  * aubiocut: add option --create-first to always create first slice
+
+  [ Tests ]
+
+  * tests/, python/tests: add tests for new methods, check source channel down-mix, improve coverage
+
+  [ Build system ]
+
+  * Makefile: disable docs when measuring coverage, add branch coverage option, add coverage_zero_counters target, improve html report
+  * waf: update to 2.0.12, improve wscript style, prevent shipping some generated files
+  * python: always show compiler warnings when pre-processing headers, workaround to fix code generation for win-amd64 (gh-154, gh-199, gh-208).
+  * continuous integration: add azure pipelines, update and improve configurations for appveyor, circleci, and travis.
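The `_htk` variants of the new hertz/mel conversions refer to the widely used HTK mel formula; the default (Slaney-style) scale differs, being linear below 1 kHz. A sketch of the HTK-style conversion, under the assumption that the `_htk` functions use this standard formula (function names here are illustrative, not the C prototypes):

```python
import math

def hztomel_htk(freq):
    """HTK-style hertz-to-mel conversion: mel = 2595 * log10(1 + f/700)."""
    return 2595.0 * math.log10(1.0 + freq / 700.0)

def meltohz_htk(mel):
    """Inverse of hztomel_htk: f = 700 * (10^(mel/2595) - 1)."""
    return 700.0 * (10.0 ** (mel / 2595.0) - 1.0)
```

A filterbank's `set_mel_coeffs`-style methods then only need these conversions to place triangular filters at equal spacing on the mel axis.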
+
+2018-09-22 Paul Brossier <piem@aubio.org>
+
+  [ Overview ]
+
+  * VERSION: bump to 0.4.7
+  * src/spectral/dct.h: add dct type II object with optimised versions
+  * src/io/, src/notes/, src/pitch: prevent crashes on corrupted files
+  * examples/: fix jack midi output, improve messages when jack disabled
+  * python/: add dct support, minor bug fixes in tests and demos
+  * wscript: improve support for BLAS/ATLAS
+
+  [ Library fixes ]
+
+  * src/pitch/pitchyinfft.c: fix out of bound read when samplerate > 50kHz, thanks to @fCorleone (closes #189, CVE-2018-14523, debian #904906)
+  * src/notes/notes.c: bail out if pitch creation failed (see #188)
+  * src/io/source_wavread.c:
+    - also exit if samplerate is negative (closes #188, CVE-2018-14522, debian #904907)
+    - add some input validation (closes #148 and #158, CVE-2017-17054, debian #883355)
+  * src/io/source_avcodec.c:
+    - give up if resampling context failed opening (see #137, closes #187, CVE-2018-14521, debian #904908)
+    - give up reading file if the number of channels changes during stream (closes #137, CVE-2017-17554, debian #884237)
+    - make sure libavutil > 52 before checking avFrame->channels (see #137)
+    - fix build with ffmpeg 4.0, thanks to @jcowgill (closes #168, #173)
+    - avoid deprecated call for ffmpeg >= 4.0
+  * src/onset/onset.c: add dummy default parameters for wphase (closes #150)
+
+  [ Tools ]
+
+  * examples/parse_args.h: hide jack options if not available, improve error message (closes #182)
+  * examples/utils.h: process_block returns void
+  * examples/utils.c: fix examples failing to send more than one JACK midi event per frame, thanks to @cyclopsian (closes #201)
+
+  [ New features ]
+
+  * src/spectral/dct.h: add dct type II object with implementation factory
+  * src/spectral/dct_plain.c: add plain dct implementation
+  * src/spectral/dct_ooura.c: add ooura implementation
+  * src/spectral/dct_fftw.c: add fftw implementation
+  * src/spectral/dct_ipp.c: add ipp version
+  * src/spectral/dct_accelerate.c: add vdsp/accelerate dct
+  * tests/src/spectral/test-dct.c: check reconstruction works
+  * src/spectral/mfcc.c: use new dct to compute mfcc
+
+  [ Library internals ]
+
+  * src/aubio_priv.h: avoid hard-coded undefs, split BLAS and ATLAS support, add vdsp scalar add and multiply
+
+  [ Build system ]
+
+  * wscript:
+    - add options to disable examples and tests
+    - detect includes for openblas/libblas/atlas
+  * scripts/get_waf.sh: bump to 2.0.11, verify signature if gpg available
+  * python/lib/gen_external.py: pass '-x c' to emcc only
+
+  [ Python ]
+
+  * python/lib/gen_code.py: add support for rdo methods
+  * python/tests/test_dct.py: add tests for new dct
+  * python/demos/demo_pitch_sinusoid.py: use // to yield an integer, fixing demo on py3, thanks to @ancorcruz (closes #176)
+  * python/ext/py-musicutils.*: add shift(fvec) and ishift(fvec)
+  * python/tests/test_fvec_shift.py: add tests for shift() and ishift()
+  * python/lib/aubio/cmd.py: fix typo in comment
+
+  [ Documentation ]
+
+  * README.md, doc/statuslinks.rst: use latest for commits-since
+  * examples/parse_args.h: add yinfast to pitch algorithms
+  * doc/requirements.rst: add some blas documentation
+  * doc/requirements.rst: split media/optimisation libraries
+  * doc/develop.rst: fixed spelling error, thanks to Jon Williams (closes #161)
+  * doc/aubio{pitch,notes}.txt: add yinfast to list of pitch methods
+
+  [ Continuous integration ]
+
+  * .travis.yml: remove xcode8.2 builds, group osx, add alias pip=pip2
+  * .appveyor.yml: upgrade pip first, always use python -m pip
+
+2017-10-02 Paul Brossier <piem@aubio.org>
+
+  [ Overview ]
+
+  * VERSION: bump to 0.4.6
+  * src/spectral/fft.c, src/*.c: add support for Intel IPP (many thanks to Eduard Mueller)
+  * wscript: add support for emscripten (thanks to Martin Hermant)
+  * src/pitch/pitchyinfast.h: new fast method to compute YIN algorithm
+  * src/pitch/pitchyin*.c: improve confidence measure, making sure its value corresponds
+    to the selected period (thanks to Eduard Mueller)
+  * python/lib/aubio/cmd.py: add `quiet`, `cut`, and `help` subcommands
+
+  [ Library ]
+
+  * src/aubio_priv.h: add missing aubio_vDSP_vclr (Eduard Mueller)
+  * src/io/source_avcodec.c: improve error message, prevent un-opened bracket, no declaration after statements for older compilers, avoid unused variable
+  * src/mathutils.c: prevent segfault with Accelerate.framework (closes #58, closes #102)
+  * src/spectral/phasevoc.h: add aubio_pvoc_set_window to change the windowing function
+  * src/mathutils.c: add window type `ones` (no windowing)
+
+  [ Python ]
+
+  * python/demos/demo_tapthebeat.py: add a real-time example to play beats using pyaudio
+  * python/lib/gen_external.py: improve parsing and syntax, use results in emscripten build (Martin Hermant)
+  * python/lib/aubio/cmd.py: add option `-u` to `aubio pitch`, improve error messages, add `quiet` subcommand (closes #124), improve syntax, add some documentation, add `cut` and `help` subcommand, add silence and time format options
+  * python/lib/aubio/cut.py: upgrade to argparse, set samplerate as needed
+  * python/demos/demo_yin_compare.py: add comparison of yin implementations
+  * python/demos/demo_wav2midi.py: add an example to create a midi from a sound file using mido (closes: #134)
+  * python/demos/demo_bpm_extract.py: use argparse, use beats_to_bpm function
+  * python/ext/py-cvec.c: fix support for pypy by changing setters to return a negative value on error (closes #17)
+
+  [ Documentation ]
+
+  * src/tempo/beattracking.h: fix typo (thanks to Hannes Fritz)
+  * doc/requirements.rst: fix broken link (thanks to @ssj71, closes #99)
+  * doc/aubiomfcc.txt: fix typo in 'coefficients'
+
+  [ Tests ]
+
+  * python/tests/tests_aubio_{cmd,cut}.py: add basic tests
+  * python/tests/test_filterbank*.py: ignore UserWarnings, clean-up, improve get_coeff tests
+
+  [ Build system ]
+
+  * wscript: add support for emscripten, see scripts/build_emscripten
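The dct type II object added in 0.4.7 computes the standard DCT-II, and test-dct.c checks that the transform can be inverted. A naive O(N²) sketch of that pair, a reference definition rather than any of the optimised implementations (plain, ooura, fftw, ipp, accelerate):

```python
import math

def dct_ii(x):
    """Naive DCT type II: X[k] = sum_n x[n] * cos(pi/N * (n + 1/2) * k)."""
    n = len(x)
    return [sum(x[i] * math.cos(math.pi / n * (i + 0.5) * k) for i in range(n))
            for k in range(n)]

def dct_iii(coeffs):
    """Normalised DCT type III, the inverse of dct_ii above."""
    n = len(coeffs)
    return [coeffs[0] / n
            + 2.0 / n * sum(coeffs[k] * math.cos(math.pi / n * k * (i + 0.5))
                            for k in range(1, n))
            for i in range(n)]
```

Round-tripping a signal through `dct_iii(dct_ii(x))` should reproduce `x` up to floating-point error, which is exactly the reconstruction property the test exercises.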
+  * scripts/get_waf.sh: update waf to 2.0.1, build waf from source tarball
+  * scripts/build_emscripten: update to build aubio.js
+  * Makefile: add coverage and coverage_report targets, run tests once
+
+  [ Continuous integration ]
+
+  * .travis.yml: add coverage report on osx
+  * .appveyor.yml: use msvc 14.0 (VS 2015) and scripts/get_waf.sh
+  * .coveragerc: add minimal python coverage configuration
+
+2017-04-10 Paul Brossier <piem@aubio.org>
+
+  [Overview]
+
+  * VERSION: bump to 0.4.5
+  * src/io/source_avcodec.c: add support for libswresample
+  * aubio: new python command line tool to extract information
+  * src/onset/onset.c: add spectral whitening and compression, improve default parameters
+  * this_version.py: use centralized script to get current version, adding git sha when building from git repo (thanks to MartinHN)
+
+  [Interface]
+
+  * src/spectral/awhitening.h: add adaptive whitening
+  * src/{cvec,mathutils,musicutils}.h: add cvec_logmag, fvec_logmag, and fvec_push
+  * src/onset/onset.h: add aubio_onset_set_default_parameters to load optimal parameters of each novelty function, _{set,get}_compression and _{set,get}_awhitening to turn on/off compression and adaptive whitening
+  * src/spectral/specdesc.h: add weighted phase
+
+  [Library]
+
+  * src/onset/onset.c: improve default onset parameters (thanks to @superbock for access to his evaluation database), see commit dccfad2 for more details
+  * src/pitch/pitch.c: avoid segfault when using invalid parameters
+  * src/temporal/biquad.c: fix biquad parameters initialization (thanks to @jurlhardt)
+
+  [Tools]
+
+  * examples/aubio{onset,track}.c: add options --miditap-note and --miditap-velo to set which midi note is triggered at onset/beat (thanks to @tseaver)
+  * examples/aubioonset.c: show actual parameters in verbose mode
+  * examples/utils.c: improve memory usage to emit midi notes
+
+  [Python]
+
+  * python/ext/py-source.c: add with (PEP 343) and iter (PEP 234) interface
+  * python/ext/py-sink.c: add with interface (PEP 343)
+  * python/lib/aubio/cmd.py: new `aubio` command line tool
+  * python/lib/aubio/cut.py: moved from python/scripts/aubiocut
+
+  [Documentation]
+
+  * doc/*.rst: reorganize and improve sphinx manual
+  * doc/*.txt: update manpages, add simple manpage for aubio command line
+  * doc/full.cfg: derive from doc/web.cfg
+  * README.md: simplify and add contribute information
+
+  [Build system]
+
+  * wscript: prefer libswresample over libavsamplerate when available, use current version in manpages, doxygen, and sphinx, update to newest waf
+  * setup.py: use entry_points console_scripts to generate scripts, use centralized version from this_version.py, clean up
+  * python/lib/moresetuptools.py: detect if libswresample is available
+
+2017-01-08 Paul Brossier <piem@aubio.org>
+
+  [ Overview ]
+
+  * VERSION: bump to 0.4.4
+  * src/utils/log.h: new function to redirect log, error, and warnings
+  * python/: AUBIO_ERR raises python exception, AUBIO_WRN to emit py warning
+  * doc/: add some documentation, fix errors in manpages
+  * wscript: new rules to build 'manpages', 'doxygen', and 'sphinx', new --build-type=<release|debug> option (thanks to Eduard Mueller)
+  * src/notes/notes.h: add minioi and silence methods
+  * examples/: add --minioi (minimum inter-onset interval) option
+  * src/pitch/pitchyin.c: improve msvc compiler optimisations (thanks to Eduard Mueller)
+  * python/, src/: improve error messages, fix minor memory leaks
+  * src/io/source_avcodec.c: improve compatibility with latest ffmpeg and with older libav/ffmpeg versions
+  * python/demos/: new demos to capture microphone in real time
+
+  [ Interface ]
+
+  * src/aubio.h: include utils/log.h
+  * src/utils/log.h: add new aubio_log_set_function to redirect log messages
+  * src/notes/notes.h: add aubio_notes_{get,set}_minioi_ms, add _{get,set}_silence methods
+
+  [ Library ]
+
+  * src/aubio_priv.h: add AUBIO_INF to print to stdout with header, use new logging
+    function, add ATAN alias, add stdarg.h, move #include "config.h"
+  * src/{fmat,fvec}.c: avoid integer division
+  * src/pitch/pitchyin.c: [msvc] help compiler to optimize aubio_pitchyin_do by giving it addresses for all arrays which are referenced in inner loops, thanks to Eduard Mueller.
+  * src/pitch/pitch.c: declare internal functions as static, fail on wrong method, warn on wrong unit, improve error messages, fix error string
+  * src/spectral/specdesc.c: return NULL if wrong mode asked, remove trailing spaces
+  * src/onset/onset.c: return null and clean-up if new_aubio_specdesc failed, fix error message
+  * src/notes/notes.c: use midi note to store pitch candidate, round to nearest note, add a variable to define precision, fix out-of-bound write, fix unset silence_threshold, fix error message
+  * src/spectral/ooura_fft8g.c: add cast to avoid conversion warnings, prefix public function with aubio_ooura_ to avoid clashes with other apps using ooura (e.g. puredata), make internal functions static
+  * src/spectral/fft.c: add message about fftw3 being able to do non-power of two sizes, make calls to fftw_destroy_plan thread-safe, use prefixed aubio_ooura_rdft
+  * src/spectral/phasevoc.c: fix error string
+  * src/temporal/resampler.c: throw an error when using libsamplerate with doubles
+  * src/io/ioutils.h: add functions to check samplerate and channels, use in sink_*.c
+  * src/io/source.c: add error message when aubio was compiled with no source, only show error message from last child source_
+  * src/io/source_avcodec.c: call avformat_free_context after avformat_close_input, keep a reference to packet to remove it when closing file, avoid deprecation warnings with ffmpeg 3.2, add backward compatibility for libavcodec55, fix for old libavcodec54, use AV_SAMPLE_FMT_DBL when compiling with HAVE_AUBIO_DOUBLE, fix missing samples in eof block, avoid function calls before declarations, improve error messages, replace with new context before
+    closing old one, make sure s->path is set to null
+  * src/io/{source_wavread,sink_wavwrite}.c: declare internal functions as static
+  * src/io/source_wavread.c: fix bytes_read for JUNK headers, improve error messages, initialize buffer, skip chunks until data is found, or abort, skip junk chunk
+  * src/io/source_sndfile.c: add support for multi-channel resampling, set handle to null after successful close, add missing floor in ratio comparison, improve formatting
+  * src/io/sink.c: only show error message from last child sink_
+  * src/io/sink_apple_audio.c: avoid crash on empty file name
+  * src/io/sink_sndfile.c: improve error message
+  * src/io/sink_{sndfile,wavwrite}.c: use AUBIO_MAX_CHANNELS, fix error message
+
+  [ Documentation ]
+
+  * README.md: update copyright dates, use https
+  * src/aubio.h: add some links to examples, use https
+  * src/pitch/pitch.h: add aubio_pitch_get_tolerance, add basic description of unit modes
+  * src/notes/notes.h: add doxygen header
+  * src/spectral/fft.h: strip example path
+  * doc/*.rst: improve sphinx documentation
+  * doc/android.rst: add reference to scripts/build_android
+  * doc/debian_packages.rst: added page on debian packages
+  * doc/python_module.rst: add demo_source_simple.py, add note on pip, add `print(aubio.version)`
+  * doc/cli.rst: include command line manpages
+  * doc/cli_features.rst: add matrix of command line features
+  * doc/requirements.rst: add a note about --notests (closes #77), document --msvc options, improve description of options
+  * doc/download.rst: added page on download
+  * doc/installing.rst: update
+  * doc/xcode_frameworks.rst: added page on xcode frameworks
+  * doc/**: use https://aubio.org
+  * doc/conf.py: use pyramid theme, update copyright, remove hardcoded path
+  * doc/web.cfg: exclude ioutils from doc
+  * doc/aubionotes.txt: document -M option (see #18)
+  * doc/aubioonset.txt: add documentation for -M, --minioi, improve threshold description (thanks to Peter Parker),
+    fix typo (onset, not pitch)
+  * doc/aubio*.txt: document -T/--timeformat option
+
+  [ Build ]
+
+  * Makefile: add a brief intro, avoid offline operations, add html and dist targets, add rules for documentation, simplify listing, bump waf to 1.9.6, check for waf before clean, chmod go-w waflib, improve clean, use pip to install, factorise pip options, generate more test sounds, improve test_python and test_pure_python, pass build_ext in test_pure_python{,_wheel}, quieten uninstall_python if already uninstalled, improve test targets, use bdist_wheel in test_pure_python, build_ext only for --enable-double, verbose waf rules, add cleanwaf
+  * wscript: added debug/release build type configurations, release (default) enables optimizations, debug symbols are enabled in both configurations, thanks to Eduard Mueller.
+  * wscript: add options to disable source_wavread/sink_wavwrite, add check for stdarg.h, new rules 'manpages', 'sphinx', and 'doxygen' to build documentation, add version to sphinx and manpages, disable libsamplerate if double precision enabled (libsamplerate only supports float), fix typos, remove trailing spaces, improve tarball creation (./waf dist), remove full.cfg from tarball, prepend to CFLAGS to honor user cflags
+  * wscript, src/wscript_build: improve install locations using DATAROOTDIR, MANDIR, INCLUDEDIR
+  * wscript: default to no atlas for now
+  * src/wscript_build: always build static library
+  * scripts/build_android: add an example script to build aubio on android
+
+  [ Tools ]
+
+  * examples/aubionotes.c: use new notes, set minioi, send last note off when needed, add warning for missing options
+  * examples/aubioonset.c: add minioi option, in seconds
+  * examples/: only send a last note off when using jack
+  * examples/: return 1 if object creation failed
+  * examples/: use PROG_HAS_OUTPUT, add PROG_HAS_SILENCE
+
+  [ Tests ]
+
+  * tests/src/spectral/test-fft.c: fix default size
+  * tests/src/spectral/test-phasevoc.c: fix typos
+  * tests/src/utils/test-log.c: add AUBIO_INF, add example for aubio_log_set_function, improve messages
+
+  [ Python ]
+
+  * python/ext/aubiomodule.c: add aubio._aubio.__version__ and import it as aubio.version, use custom logging function for errors and warnings, remove duplicated add_generated_objects, use <> for non local aubio
+  * python/ext/py-cvec.c: use NPY_INTP_FMT
+  * python/ext/py-fft.c: use error string set in src/spectral/fft.c
+  * python/ext/py-phasevoc.c: use error string set in src/spectral/phasevoc.c
+  * python/ext/py-sink.c: always set samplerate and channels in init
+  * python/ext/py-source.c: use error string set in src/io/source.c
+  * python/lib/aubio/midiconv.py: add unicode double sharp and double flat, improve unicode handling, skip UnicodeEncodeError on python 2.x
+
+  [ Python build ]
+
+  * MANIFEST.in: add src/**.c, exclude full.cfg, include waflib, remove python/ext/config.h
+  * setup.py: define AUBIO_VERSION, use sorted glob.glob to improve reproducibility, remove extra quotes, remove status from version string, update description, use custom build_ext instead of 'generate' command, define HAVE_AUBIO_DOUBLE to 1 if needed
+  * python/lib/gen_code.py: add support for multiple _do outputs, fix number of outputs, improve del_ function, safer DECREF, fix indentation, emit RuntimeError
+  * python/lib/gen_external.py: clean-up, enable tss, remove duplicate, sort generated files
+  * python/lib/moresetuptools.py: add HAVE_STDARG_H, also check for HAVE_AUBIO_DOUBLE, cleaner clean, look first for system library, then for local build, then local sources, no more fake config.h here, use samplerate in single precision only
+  * python/README.md: add a note about nose2 for python tests (closes #74)
+  * scripts/setenv_local.sh: python3 compat
+
+  [ Python demos ]
+
+  * python/demos/demo_alsa.py: add example using alsaaudio (closes #72)
+  * python/demos/demo_mfcc.py: add options to plot
+    first and second derivatives, and set samplerate/win_s/hop_s, thanks to @jhoelzl (closes #68)
+  * python/demos/demo_notes.py: add simple notes demos
+  * python/demos/demo_pyaudio.py: added simple demo for pyaudio, see #6, closes #78, thanks to @jhoelzl and @notalentgeek, add some comments, avoid overwriting aubio.pitch
+  * python/demos/demo_source_simple.py: fix indentation, make executable
+  * python/demos/demo_timestretch{,_online}.py: fix usage string, remove unused import, use // to yield an integer (closes #71)
+  * python/demos/demo_timestretch_online.py: use 512, fix block counter
+  * python/demos/demo_tss.py: improve default parameters, exit before plotting
+
+  [ Python tests ]
+
+  * python/tests/: use local import, add __init__.py
+  * python/tests/test_cvec.py: simplify
+  * python/tests/test_fft.py: skip test fft(zeros).phas == 0 if needed, expected on powerpc
+  * python/tests/test_fvec.py: reduce alpha norm precision to 10.-4
+  * python/tests/test_{midi2note,note2midi}.py: use nose2.params, add unicode tests
+  * python/tests/test_notes.py: add basic tests
+  * python/tests/test_notes.py: test results are correct for 44100Hz_44100f_sine441.wav
+  * python/tests/test_sink.py: add more tests, quiet warnings
+  * python/tests/test_source.py: break long line, check the tail of the file is non-zero on non silent test files, filter user warnings to avoid spamming the console, only check if last frames are non silent on brownnoise (weak), remove fragile brownnoise test, check duration on short files, use nose2 params to process one sound file per test
+  * python/tests/test_specdesc.py: RuntimeError is now raised on wrong mode
+  * python/tests/utils.py: by default, use 5 seconds of brownnoise
+
+  [ Only in git ]
+
+  * .travis.yml: add debian dpkg-buildflags config, switch from precise to trusty, sudo required, add ffmpeg on osx, add targets ios, iosimulator, and osx noopt configs, bump to xcode8, add xcode8.2 config, mimic build_apple_frameworks
+    options, always upgrade pip, add pip --version and which pip after upgrading, remove --user, use expandwaf in install, remove unused ARCH, shuffle order, remove duplicate, add missing opening quote, use AUBIO_NOTESTS to build only lib on ios, add gitter webhook
+  * .appveyor.yml: fix path for windows+python 3.5, fix typo in path, make nose2 tests verbose
+
 2016-08-16 Paul Brossier <piem@aubio.org>
 
   [ Interface ]
@@ -117,6 +705,7 @@
   * python/scripts/aubiocut: fix usage string output
 
   [ Python tests ]
+  * python/tests/run_all_tests,*.py: switch to nose2, fix most prospect warnings
   * python/tests/test_fvec.py: add test_pass_to_numpy, cope with accumulated errors
diff --git a/MANIFEST.in b/MANIFEST.in
index 15fd25a..85edbdf 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -1,25 +1,22 @@
 include AUTHORS COPYING README.md VERSION ChangeLog
 include python/README.md
-include Makefile wscript */wscript_build
+include this_version.py
+include waf_gensyms.py
 include waf
+recursive-include waflib *.py
+include Makefile wscript */wscript_build
 include aubio.pc.in
-include nose2.cfg
 include requirements.txt
-include src/*.h
-include src/*/*.h
+include src/*.c src/*.h
+include src/*/*.c src/*/*.h
 include examples/*.c examples/*.h
-include tests/*.h tests/*/*.c tests/*/*/*.c
+recursive-include tests *.h *.c *.py
 include python/ext/*.h
-include python/__init__.py
-include python/lib/__init__.py
-include python/lib/moresetuptools.py
-include python/lib/gen_external.py
-include python/lib/gen_code.py
-include python/tests/run_all_tests
-include python/tests/*.py
-include python/demos/*.py
+recursive-include python *.py
+include python/README.md
+include python/tests/eval_pitch
 include python/tests/*.expected
 include doc/*.txt doc/*.rst doc/*.cfg doc/Makefile doc/make.bat doc/conf.py
+exclude doc/full.cfg
 include scripts/* scripts/apple/Info.plist scripts/apple/Modules/module.modulemap
 exclude python/gen/*
-exclude python/ext/config.h
@@ -1,87 +1,177 @@
+#!/usr/bin/make -f
+# -*- makefile -*-
+#
+# This makefile contains simple rules to prepare, compile, test, and install
+# aubio. Try one of the following rules:
+#
+# $ make configure
+# $ make build
+# $ make install
+# $ make test_python
+
 WAFCMD=python waf
-WAFURL=https://waf.io/waf-1.8.22
+
+#WAFOPTS:=
+# turn on verbose mode
+WAFOPTS += --verbose
+# build wafopts
+WAFOPTS += --destdir $(DESTDIR)
+# multiple jobs
+WAFOPTS += --jobs 4
+# if HAVE_AUBIO_DOUBLE is defined, pass --enable-double to waf
+# python/lib/moresetuptools.py also checks for HAVE_AUBIO_DOUBLE
+WAFOPTS += $(shell [ -z $(HAVE_AUBIO_DOUBLE) ] || echo --enable-double )
+
+PIPOPTS += --verbose
+
+DESTDIR:=$(PWD)/build/dist
+PYDESTDIR:=$(PWD)/build/pydist
+
+# default install locations
+PREFIX?=/usr/local
+EXEC_PREFIX?=$(PREFIX)
+LIBDIR?=$(PREFIX)/lib
+INCLUDEDIR?=$(PREFIX)/include
+DATAROOTDIR?=$(PREFIX)/share
+MANDIR?=$(DATAROOTDIR)/man
+
+# default python test command
+PYTEST?=pytest --verbose
 
 SOX=sox
-ENABLE_DOUBLE := $(shell [ -z $(HAVE_DOUBLE) ] || echo --enable-double )
 
 TESTSOUNDS := python/tests/sounds
 
+LCOVOPTS += --rc lcov_branch_coverage=1
+
 all: build
 
 checkwaf:
 	@[ -f waf ] || make getwaf
 
 getwaf:
-	@./scripts/get_waf.sh
+	./scripts/get_waf.sh
+
+expandwaf: getwaf
+	[ -d wafilb ] || rm -fr waflib
+	$(WAFCMD) --help > /dev/null
+	mv .waf*/waflib . && rm -fr .waf*
+	sed '/^#==>$$/,$$d' waf > waf2 && mv waf2 waf
+	chmod +x waf && chmod -R go-w waflib
 
-expandwaf:
-	@[ -d wafilb ] || rm -fr waflib
-	@$(WAFCMD) --help > /dev/null
-	@mv .waf*/waflib . && rm -fr .waf*
-	@sed '/^#==>$$/,$$d' waf > waf2 && mv waf2 waf
-	@chmod +x waf
+cleanwaf:
+	rm -rf waf waflib .waf*
 
 configure: checkwaf
-	$(WAFCMD) configure $(WAFOPTS) $(ENABLE_DOUBLE)
+	$(WAFCMD) configure $(WAFOPTS)
 
 build: configure
 	$(WAFCMD) build $(WAFOPTS)
 
+install:
+	# install
+	$(WAFCMD) install $(WAFOPTS)
+
+list_installed:
+	find $(DESTDIR) -ls | sed 's|$(DESTDIR)|/«destdir»|'
+
+list_installed_python:
+	pip show -f aubio
+
+list_all_installed: list_installed list_installed_python
+
+uninstall:
+	# uninstall
+	$(WAFCMD) uninstall $(WAFOPTS)
+
+delete_install:
+	rm -rf $(PWD)/dist/test
+
 build_python:
-	python ./setup.py generate $(ENABLE_DOUBLE) build
+	# build python-aubio, using locally built libaubio if found
+	python ./setup.py build
+
+build_python_extlib:
+	# build python-aubio using (locally) installed libaubio
+	[ -f $(DESTDIR)/$(INCLUDEDIR)/aubio/aubio.h ]
+	[ -d $(DESTDIR)/$(LIBDIR) ]
+	[ -f $(DESTDIR)/$(LIBDIR)/pkgconfig/aubio.pc ]
+	PKG_CONFIG_PATH=$(DESTDIR)/$(LIBDIR)/pkgconfig \
+	CFLAGS="-I$(DESTDIR)/$(INCLUDEDIR)" \
+	LDFLAGS="-L$(DESTDIR)/$(LIBDIR)" \
+	make build_python
+
+deps_python:
+	# install or upgrade python requirements
+	pip install $(PIPOPTS) --requirement requirements.txt
+
+# use pip or distutils?
+install_python: install_python_with_pip
+uninstall_python: uninstall_python_with_pip
+#install_python: install_python_with_distutils
+#uninstall_python: uninstall_python_with_distutils
 
-test_python: export LD_LIBRARY_PATH=$(PWD)/build/src
-test_python:
-	pip install -v -r requirements.txt
-	pip install -v .
-	nose2 --verbose
-	pip uninstall -y -v aubio
+install_python_with_pip:
+	# install package
+	pip install $(PIPOPTS) .
-test_python_osx: +uninstall_python_with_pip: + # uninstall package + ( pip show aubio | grep -l aubio > /dev/null ) && \ + pip uninstall -y -v aubio || echo "info: aubio package is not installed" + +install_python_with_distutils: + ./setup.py install $(PIPOPTS) $(DISTUTILSOPTS) + +uninstall_python_with_distutils: + #./setup.py uninstall + [ -d $(PYDESTDIR)/$(LIBDIR) ] && echo Warning: did not clean $(PYDESTDIR)/$(LIBDIR) || true + +force_uninstall_python: + # ignore failure if not installed + -make uninstall_python + +local_dylib: + # DYLD_LIBRARY_PATH is no more on mac os # create links from ~/lib/lib* to build/src/lib* - [ -f build/src/libaubio.[0-9].dylib ] && ( mkdir -p ~/lib && cp -prv build/src/libaubio.4.dylib ~/lib ) || true - # then run the tests - pip install --user -v -r requirements.txt - pip install --user -v . - nose2 --verbose - pip uninstall -y -v aubio + [ -f $(PWD)/build/src/libaubio.[0-9].dylib ] && ( mkdir -p ~/lib && cp -prv build/src/libaubio.[0-9].dylib ~/lib ) || true + +test_python: export LD_LIBRARY_PATH=$(DESTDIR)/$(LIBDIR) +test_python: export PYTHONPATH=$(PYDESTDIR)/$(LIBDIR) +test_python: local_dylib + # run test with installed package + $(PYTEST) clean_python: ./setup.py clean -test_pure_python: - -pip uninstall -v -y aubio - -rm -rf build/ python/gen/ - -rm -f dist/*.egg - -pip install -v -r requirements.txt - CFLAGS=-Os python setup.py bdist_egg - [ "$(TRAVIS_OS_NAME)" == "osx" ] && easy_install --user dist/*.egg || \ - easy_install dist/*.egg - nose2 -N 4 - pip uninstall -v -y aubio - -test_pure_python_wheel: - -pip uninstall -v -y aubio - -rm -rf build/ python/gen/ - -rm -f dist/*.whl - -pip install -v -r requirements.txt - -pip install -v wheel - CFLAGS=-Os python setup.py bdist_wheel --universal - wheel install dist/*.whl - nose2 -N 4 - pip uninstall -v -y aubio - -build_python3: - python3 ./setup.py generate $(ENABLE_DOUBLE) build - -clean_python3: - python3 ./setup.py clean - -clean: +check_clean_python: + # check cleaning a 
second time works + make clean_python + make clean_python + +clean: checkwaf + # optionally clean before build + -$(WAFCMD) clean + # remove possible leftovers + -rm -rf doc/_build + +check_clean: + # check cleaning after build works + $(WAFCMD) clean + # check cleaning a second time works $(WAFCMD) clean +distclean: + $(WAFCMD) distclean + -rm -rf doc/_build/ + -rm -rf doc/web/ + +check_distclean: + make distclean + distcheck: checkwaf - $(WAFCMD) distcheck $(WAFOPTS) $(ENABLE_DOUBLE) + $(WAFCMD) distcheck $(WAFOPTS) help: $(WAFCMD) --help @@ -89,8 +179,113 @@ help: create_test_sounds: -[ -z `which $(SOX)` ] && ( echo $(SOX) could not be found) || true -mkdir -p $(TESTSOUNDS) - -$(SOX) -r 44100 -b 16 -n "$(TESTSOUNDS)/44100Hz_1f_silence.wav" synth 1s silence 0 + -$(SOX) -r 44100 -b 16 -n "$(TESTSOUNDS)/44100Hz_1f_silence.wav" trim 0 1s -$(SOX) -r 22050 -b 16 -n "$(TESTSOUNDS)/22050Hz_5s_brownnoise.wav" synth 5 brownnoise vol 0.9 -$(SOX) -r 32000 -b 16 -n "$(TESTSOUNDS)/32000Hz_127f_sine440.wav" synth 127s sine 440 vol 0.9 - -$(SOX) -r 8000 -b 16 -n "$(TESTSOUNDS)/8000Hz_30s_silence.wav" synth 30 silence 0 vol 0.9 + -$(SOX) -r 8000 -b 16 -n "$(TESTSOUNDS)/8000Hz_30s_silence.wav" trim 0 30 -$(SOX) -r 48000 -b 32 -n "$(TESTSOUNDS)/48000Hz_60s_sweep.wav" synth 60 sine 100-20000 vol 0.9 + -$(SOX) -r 44100 -b 16 -n "$(TESTSOUNDS)/44100Hz_44100f_sine441.wav" synth 44100s sine 441 vol 0.9 + -$(SOX) -r 44100 -b 16 -n "$(TESTSOUNDS)/44100Hz_100f_sine441.wav" synth 100s sine 441 vol 0.9 + +# build only libaubio, no python-aubio +test_lib_only: clean distclean configure build install list_installed +# additionally, clean after a fresh build +test_lib_only_clean: test_lib_only uninstall check_clean check_distclean + +# build libaubio, build and test python-aubio against it +test_lib_python: force_uninstall_python deps_python \ + clean_python clean distclean \ + configure build build_python \ + install install_python \ + test_python \ + list_all_installed + 
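Several recipes above (`uninstall_python_with_pip`, `local_dylib`, the `sox` check in `create_test_sounds`) use the `check && on_success || fallback` idiom so a missing package, file, or tool degrades to an informational message instead of failing the whole target. A sketch of the pattern, with a hypothetical stand-in for `pip show aubio`:

```shell
# Hypothetical check standing in for "pip show aubio | grep ...":
# succeeds only when the "package" name matches.
have_pkg() { [ "$1" = "aubio" ]; }

report() {
    # && runs the success branch; || catches a failed check (and would
    # also catch a failed success branch, so keep the fallback harmless).
    have_pkg "$1" && echo "uninstalling $1" || echo "info: $1 is not installed"
}

report aubio    # prints: uninstalling aubio
report numpy    # prints: info: numpy is not installed
```

Note that `A && B || C` is not a strict if/else: `C` also runs if `B` itself fails. That is why these recipes keep the fallback to a side-effect-free `echo` or `true`.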
+test_lib_python_clean: test_lib_python \ + uninstall_python uninstall \ + check_clean_python \ + check_clean \ + check_distclean + +# build libaubio, install it, build python-aubio against it +test_lib_install_python: force_uninstall_python deps_python \ + clean_python distclean \ + configure build \ + install \ + build_python_extlib \ + install_python \ + test_python \ + list_all_installed + +test_lib_install_python_clean: test_lib_install_python \ + uninstall_python \ + delete_install \ + check_clean_python \ + check_distclean + +# build a python-aubio that includes libaubio +test_python_only: force_uninstall_python deps_python \ + clean_python clean distclean \ + build_python \ + install_python \ + test_python \ + list_installed_python + +test_python_only_clean: test_python_only \ + uninstall_python \ + check_clean_python + +coverage_cycle: coverage_zero_counters coverage_report + +coverage_zero_counters: + lcov --zerocounters --directory . + +coverage: export CFLAGS=--coverage +coverage: export LDFLAGS=--coverage +coverage: export PYTHONPATH=$(PWD)/python/lib +coverage: export LD_LIBRARY_PATH=$(PWD)/build/src +coverage: force_uninstall_python deps_python \ + clean_python clean distclean build local_dylib + # capture coverage after running c tests + lcov $(LCOVOPTS) --capture --no-external --directory . \ + --output-file build/coverage_lib.info + # build and test python + pip install -v -e . + # run tests, with python coverage + coverage run `which pytest` + # capture coverage again + lcov $(LCOVOPTS) --capture --no-external --directory . 
\ + --output-file build/coverage_python.info + # merge both coverage info files + lcov $(LCOVOPTS) -a build/coverage_python.info -a build/coverage_lib.info \ + --output-file build/coverage.info + # remove tests + lcov $(LCOVOPTS) --remove build/coverage.info '*/ooura_fft8g*' \ + --output-file build/coverage_lib.info + +# make sure we don't build the doc, which builds a temporary python module +coverage_report: export WAFOPTS += --disable-docs +coverage_report: coverage + # generate report with lcov's genhtml + genhtml build/coverage_lib.info --output-directory build/coverage_c \ + --branch-coverage --highlight --legend + # generate python report with coverage python package + coverage report + coverage html -d build/coverage_python + # show links to generated reports + for i in $$(ls build/coverage_*/index.html); do echo file://$(PWD)/$$i; done + +sphinx: configure + $(WAFCMD) sphinx $(WAFOPTS) + +doxygen: configure + $(WAFCMD) doxygen $(WAFOPTS) + +manpages: configure + $(WAFCMD) manpages $(WAFOPTS) + +html: doxygen sphinx + +docs: html manpages + +dist: distclean expandwaf + $(WAFCMD) dist @@ -1,5 +1,13 @@ -aubio library -============= +aubio +===== + +[![Travis build status](https://travis-ci.org/aubio/aubio.svg?branch=master)](https://travis-ci.org/aubio/aubio "Travis build status") +[![Appveyor build status](https://img.shields.io/appveyor/ci/piem/aubio/master.svg)](https://ci.appveyor.com/project/piem/aubio "Appveyor build status") +[![Landscape code health](https://landscape.io/github/aubio/aubio/master/landscape.svg?style=flat)](https://landscape.io/github/aubio/aubio/master "Landscape code health") +[![Commits since last release](https://img.shields.io/github/commits-since/aubio/aubio/latest.svg)](https://github.com/aubio/aubio "Commits since last release") + +[![Documentation](https://readthedocs.org/projects/aubio/badge/?version=latest)](http://aubio.readthedocs.io/en/latest/?badge=latest "Latest documentation") 
+[![DOI](https://zenodo.org/badge/396389.svg)](https://zenodo.org/badge/latestdoi/396389) aubio is a library to label music and sounds. It listens to audio signals and attempts to detect events. For instance, when a drum is hit, at which frequency @@ -20,7 +28,7 @@ aubio provide several algorithms and routines, including: - digital filters (low pass, high pass, and more) - spectral filtering - transient/steady-state separation - - sound file and audio devices read and write access + - sound file read and write access - various mathematics utilities for music applications The name aubio comes from _audio_ with a typo: some errors are likely to be @@ -29,14 +37,19 @@ found in the results. Python module ------------- -A python module to access the library functions is also provided. Please see -the file [`python/README.md`](python/README.md) for more information on how to -use it. +A python module for aubio is provided. For more information on how to use it, +please see the file [`python/README.md`](python/README.md) and the +[manual](https://aubio.org/manual/latest/) . 
+ + Tools +----- -Examples tools --------------- +The python module comes with the following command line tools: + + - `aubio` extracts information from sound files + - `aubiocut` slices sound files at onset or beat timestamps -A few simple command line tools are included along with the library: +Additional command line tools are included along with the library: - `aubioonset` outputs the time stamp of detected note onsets - `aubiopitch` attempts to identify a fundamental frequency, or pitch, for @@ -46,26 +59,11 @@ A few simple command line tools are included along with the library: - `aubionotes` emits midi-like notes, with an onset, a pitch, and a duration - `aubioquiet` extracts quiet and loud regions -Additionally, the python module comes with the following script: - - - `aubiocut` slices sound files at onset or beat timestamps - -Implementation and Design Basics -------------------------------- - -The library is written in C and is optimised for speed and portability. - -The C API is designed in the following way: - - aubio_something_t * new_aubio_something (void * args); - audio_something_do (aubio_something_t * t, void * args); - smpl_t aubio_something_get_a_parameter (aubio_something_t *t); - uint_t aubio_something_set_a_parameter (aubio_something_t *t, smpl_t a_parameter); - void del_aubio_something (aubio_something_t * t); +Documentation +------------- 
+ - [manual](https://aubio.org/manual/latest/), generated with sphinx + - [developer documentation](https://aubio.org/doc/latest/), generated with Doxygen The latest version of the documentation can be found at: @@ -74,105 +72,42 @@ The latest version of the documentation can be found at: Build Instructions ------------------ -A number of distributions already include aubio. Check your favorite package -management system, or have a look at the [download -page](https://aubio.org/download). - -aubio uses [waf](https://waf.io/) to configure, compile, and test the source: - - ./waf configure - ./waf build - -If waf is not found in the directory, you can download and install it with: - - make getwaf - -aubio compiles on Linux, Mac OS X, Cygwin, and iOS. - -Installation ------------- - -To install aubio library and headers on your system, use: +aubio compiles on Linux, Mac OS X, Windows, Cygwin, and iOS. - sudo ./waf install +To compile aubio, you should be able to simply run: -To uninstall: + make - sudo ./waf uninstall +To compile the python module: -If you don't have root access to install libaubio on your system, you can use -libaubio without installing libaubio either by setting `LD_LIBRARY_PATH`, or by -copying it to `~/lib`. + ./setup.py build -On Linux, you should be able to set `LD_LIBRARY_PATH` with: +See the [manual](https://aubio.org/manual/latest/) for more information about +[installing aubio](https://aubio.org/manual/latest/installing.html). - $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PWD/build/src +Citation +-------- -On Mac OS X, a copy or a symlink can be made in `~/lib`: +Please use the DOI link above to cite this release in your publications. For +more information, see also the [about +page](https://aubio.org/manual/latest/about.html) in [aubio +manual](https://aubio.org/manual/latest/). 
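The README passage removed here and the `test_python: export LD_LIBRARY_PATH=...` lines in the Makefile make the same point: a locally built libaubio can be used without installing it system-wide by pointing the loader at the build directory (on Linux via `LD_LIBRARY_PATH`; on older Mac OS X via a copy in `~/lib`). A sketch of the Linux side, with an illustrative path and helper name:

```shell
# Prepend a local build directory to LD_LIBRARY_PATH without
# clobbering any existing value (the path below is illustrative).
prepend_lib_path() {
    if [ -z "$LD_LIBRARY_PATH" ]; then
        LD_LIBRARY_PATH="$1"
    else
        LD_LIBRARY_PATH="$1:$LD_LIBRARY_PATH"
    fi
    export LD_LIBRARY_PATH
}

prepend_lib_path "$PWD/build/src"
echo "$LD_LIBRARY_PATH"
```

Prepending (rather than overwriting) keeps system library paths reachable while the freshly built `libaubio.so` takes precedence.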
- $ mkdir -p ~/lib - $ ln -sf $PWD/build/src/libaubio*.dylib ~/lib/ - -Note on Mac OS X systems older than El Capitan (10.11), the `DYLD_LIBRARY_PATH` -variable can be set as follows: - - $ export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:$PWD/build/src - -Credits and Publications ------------------------- - -This library gathers music signal processing algorithms designed at the Centre -for Digital Music and elsewhere. This software project was developed along the -research I did at the Centre for Digital Music, Queen Mary, University of -London. Most of this C code was written by myself, starting from published -papers and existing code. The header files of each algorithm contains brief -descriptions and references to the corresponding papers. - -Special thanks go Juan Pablo Bello, Chris Duxbury, Samer Abdallah, Alain de -Cheveigne for their help and publications. Also many thanks to Miguel Ramirez -and Nicolas Wack for their bug fixing. - -Substantial informations about the algorithms and their evaluation are gathered -in: - - - Paul Brossier, _[Automatic annotation of musical audio for interactive - systems](https://aubio.org/phd)_, PhD thesis, Centre for Digital music, -Queen Mary University of London, London, UK, 2006. - -Additional results obtained with this software were discussed in the following -papers: - - - P. M. Brossier and J. P. Bello and M. D. Plumbley, [Real-time temporal - segmentation of note objects in music signals](https://aubio.org/articles/brossier04fastnotes.pdf), -in _Proceedings of the International Computer Music Conference_, 2004, Miami, -Florida, ICMA - - - P. M. Brossier and J. P. Bello and M. D. 
Plumbley, [Fast labelling of note - objects in music signals] (https://aubio.org/articles/brossier04fastnotes.pdf), -in _Proceedings of the International Symposium on Music Information Retrieval_, -2004, Barcelona, Spain - - -Contact Info and Mailing List ------------------------------ +Homepage +-------- The home page of this project can be found at: https://aubio.org/ -Questions, comments, suggestions, and contributions are welcome. Use the -mailing list: <aubio-user@aubio.org>. - -To subscribe to the list, use the mailman form: -http://lists.aubio.org/listinfo/aubio-user/ - -Alternatively, feel free to contact directly the author. - - -Copyright and License Information ---------------------------------- - -Copyright (C) 2003-2013 Paul Brossier <piem@aubio.org> +License +------- aubio is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. + +Contributing +------------ + +Patches are welcome: please fork the latest git repository and create a feature +branch. Submitted requests should pass all continuous integration tests. 
@@ -1,7 +1,7 @@ AUBIO_MAJOR_VERSION=0 AUBIO_MINOR_VERSION=4 -AUBIO_PATCH_VERSION=3 +AUBIO_PATCH_VERSION=9 AUBIO_VERSION_STATUS='' LIBAUBIO_LT_CUR=5 -LIBAUBIO_LT_REV=0 -LIBAUBIO_LT_AGE=4 +LIBAUBIO_LT_REV=4 +LIBAUBIO_LT_AGE=8 diff --git a/aubio-0.4.9.tar.bz2.md5 b/aubio-0.4.9.tar.bz2.md5 new file mode 100644 index 0000000..216a0b9 --- /dev/null +++ b/aubio-0.4.9.tar.bz2.md5 @@ -0,0 +1 @@ +5b6570d0990cf1f289e6390bfe93102c aubio-0.4.9.tar.bz2 diff --git a/aubio-0.4.9.tar.bz2.sha1 b/aubio-0.4.9.tar.bz2.sha1 new file mode 100644 index 0000000..58dd71a --- /dev/null +++ b/aubio-0.4.9.tar.bz2.sha1 @@ -0,0 +1 @@ +5535d1ee6d9c5a0e2369a5f18dcdcbe1a9066c26 aubio-0.4.9.tar.bz2 diff --git a/aubio-0.4.9.tar.bz2.sha256 b/aubio-0.4.9.tar.bz2.sha256 new file mode 100644 index 0000000..edacf10 --- /dev/null +++ b/aubio-0.4.9.tar.bz2.sha256 @@ -0,0 +1 @@ +baa201689de28fa9d2e24e1133b9f73c85332ed947d3f7d1112b3c68209280a9 aubio-0.4.9.tar.bz2 diff --git a/aubio.pc.in b/aubio.pc.in index 301a1b5..16a892f 100644 --- a/aubio.pc.in +++ b/aubio.pc.in @@ -7,4 +7,4 @@ Name: aubio Description: a library for audio labelling Version: @VERSION@ Libs: -L${libdir} -laubio -Cflags: -I${includedir} +Cflags: -I${includedir} diff --git a/debian/aubio-tools.manpages b/debian/aubio-tools.manpages index d8c7027..d5a1e98 100644 --- a/debian/aubio-tools.manpages +++ b/debian/aubio-tools.manpages @@ -1 +1 @@ -build/doc/*.1 +usr/share/man/*/* diff --git a/debian/changelog b/debian/changelog index f8d92ea..97f0f46 100644 --- a/debian/changelog +++ b/debian/changelog @@ -1,3 +1,109 @@ +aubio (0.4.9-4) unstable; urgency=medium + + * debian/tests/control: remove py2 tests + + -- Paul Brossier <piem@debian.org> Wed, 08 Jan 2020 13:39:53 +0100 + +aubio (0.4.9-3) unstable; urgency=medium + + * debian/control: remove python-aubio package (closes: #936161) + * debian/rules: use only python3, cleaner py3 clean + * debian/patches/wscript_py3.patch: add patch to build with current python + * debian/upstream/metadata: add 
basic upstream description and reference + * debian/control: bump to debhelper-compat 12, remove debian/compat, + bump S-V to 4.4.1, add Rules-Requires-Root: no + * debian/aubio-tools.manpages: install from debian/tmp + * debian/libaubio-doc.*: move html doc to /u/s/d/libaubio-dev/api + * debian/libaubio-doc.doc-base: update author email + * debian/libaubio-doc.links: fix jquery link + + -- Paul Brossier <piem@debian.org> Sat, 04 Jan 2020 18:34:15 +0100 + +aubio (0.4.9-2) unstable; urgency=medium + + * Add upstream patches to fix FTBFS on powerpc and i386 + + -- Paul Brossier <piem@debian.org> Fri, 21 Jun 2019 14:46:30 +0200 + +aubio (0.4.9-1) unstable; urgency=medium + + * New upstream version 0.4.9 (closes: #480018, #930186) + * Fixes security issues (CVE-2018-19800, CVE-2018-19801, CVE-2018-19802) + * debian/tests/control: also install built binaries + * debian/patches: remove patches integrated upstream + * debian/control: bump to S-V 4.3.0 + * debian/control, debian/tests: switch b-d from nose2 to pytest + * debian/rules: switch to pytest, update clean target + * debian/patches/fixtypos.patch: fix typos + * debian/patches/series: remove patches integrated upstream + * debian/libaubio-doc.dob-base: move documentation to api folder + * debian/libaubio5.symbols: add new symbols since 0.4.8, remove unused + symbols previously exported, add Build-Depends-Package field + * debian/upstream/signing-key.asc: minimize key export + + -- Paul Brossier <piem@debian.org> Thu, 20 Jun 2019 12:01:41 +0200 + +aubio (0.4.6-2) unstable; urgency=medium + + * debian/tests/control: add minimal dependencies + * debian/rules: use hardening=+all, enabling bindnow + * debian/control: + - python3-aubio: use Python 3 for long description + - libaubio-dev: add Multi-Arch: same + - libaubio-doc: add Multi-Arch: foreign + * debian/copyright: use https urls + * debian/patches/examples_proto.patch: fix gcc 8 warning + * debian/libaubio5.symbols: add list of symbols since 0.4.3 + + -- Paul 
Brossier <piem@debian.org> Fri, 14 Sep 2018 16:57:22 +0200 + +aubio (0.4.6-1) unstable; urgency=medium + + * New upstream version 0.4.6 + * Acknowledge NMU (thanks to Sebastian Ramacher, closes: #888336) + * debian/watch: use https + * debian/copyright: fix file path + * debian/control: + - remove duplicate Section from aubio-tools + - capitalize Python in short descriptions + - remove obsolete X-Python fields + - bump Standards-Version to 4.2.1 + - move Vcs-Git and Browser to salsa.d.o + * debian/rules: + - add a comment to enable bindnow hardening + - add -Wl,--as-needed to LDFLAGS + - clean waf_gensyms and python/tests/sounds + * debian/patches: + - add upstream patches to fix security issues + - add avoid_deprecated to omit av_register_all() where deprecated + * CVE-2017-17054 div by zero, thx to my123px (closes: #883355) + * CVE-2017-17554 null pointer dereference, thx to IvanCql (closes: #884237) + * CVE-2017-17555 denial of service, thx to IvanCql (closes: #884232) + * CVE-2018-14521 SEGV in aubiomfcc, thx to fCorleone (closes: #904908) + * CVE-2018-14522 SEGV in aubionotes, thx to fCorleone (closes: #904907) + * CVE-2018-14523 global buffer overflow, thx to fCorleone (closes: #904906) + + -- Paul Brossier <piem@debian.org> Mon, 10 Sep 2018 16:20:59 +0200 + +aubio (0.4.5-1.1) unstable; urgency=medium + + * Non-maintainer upload. + * debian/patches: Fix build with ffmpeg 4.0. (Closes: #888336) + + -- Sebastian Ramacher <sramacher@debian.org> Wed, 11 Jul 2018 20:52:59 +0200 + +aubio (0.4.5-1) unstable; urgency=medium + + * New upstream version 0.4.5 + * Apply patch for 0.4.3-4.1 (thanks to Dr. 
Tobias Quathamer, closes: #862804) + * debian/control: update homepage to https + * debian/patches: delete all patches merged upstream + * debian/rules: clean sphinx doc, this_version.pyc, and aubio.egg-info + * debian/rules: add /usr/bin/aubio to aubio-tools + * debian/watch: check upstream signature + + -- Paul Brossier <piem@debian.org> Mon, 24 Jul 2017 15:11:01 +0200 + aubio (0.4.3-4.1) unstable; urgency=medium * Non-maintainer upload. diff --git a/debian/compat b/debian/compat deleted file mode 100644 index ec63514..0000000 --- a/debian/compat +++ /dev/null @@ -1 +0,0 @@ -9 diff --git a/debian/control b/debian/control index 4c0fca3..d89d324 100644 --- a/debian/control +++ b/debian/control @@ -2,7 +2,7 @@ Source: aubio Section: sound Priority: optional Maintainer: Paul Brossier <piem@debian.org> -Build-Depends: debhelper (>= 9.0.0), +Build-Depends: debhelper-compat (= 12), libtool, libjack-dev | libjack-jackd2-dev, libavcodec-dev, @@ -14,31 +14,26 @@ Build-Depends: debhelper (>= 9.0.0), libasound2-dev, libfftw3-dev, dh-python, - python-all-dev, - python-setuptools, - libpython2.7-dev, - python-numpy, python3-all-dev, python3-setuptools, libpython3-dev, python3-numpy, sox, - python-nose2, - python3-nose2, + python3-pytest, txt2man, Build-Depends-Indep: doxygen, libjs-jquery, libjs-mathjax, -Standards-Version: 3.9.8 -X-Python-Version: >= 2.6 -X-Python3-Version: >= 3.2 -Homepage: http://aubio.org -Vcs-Git: https://anonscm.debian.org/git/collab-maint/aubio.git -Vcs-Browser: https://anonscm.debian.org/cgit/collab-maint/aubio.git/ +Standards-Version: 4.4.1 +Homepage: https://aubio.org +Vcs-Git: https://salsa.debian.org/piem/aubio.git +Vcs-Browser: https://salsa.debian.org/piem/aubio +Rules-Requires-Root: no Package: libaubio-dev Section: libdevel Architecture: any +Multi-Arch: same Depends: libaubio5 (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends} Description: library for audio and music analysis, synthesis, and effects aubio gathers a set of functions for 
audio signal segmentation and labelling. @@ -61,7 +56,6 @@ Description: library for audio segmentation This package provides the shared library libaubio. Package: aubio-tools -Section: sound Architecture: any Depends: python3-aubio (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends}, ${python3:Depends} Description: library for audio segmentation -- utilities @@ -74,6 +68,7 @@ Description: library for audio segmentation -- utilities Package: libaubio-doc Section: doc Architecture: all +Multi-Arch: foreign Depends: ${misc:Depends}, libjs-jquery, libjs-mathjax Description: library for audio segmentation -- documentation aubio gathers a set of functions for audio signal segmentation and labelling. @@ -82,24 +77,12 @@ Description: library for audio segmentation -- documentation . This package provides the documentation for the C interface. -Package: python-aubio -Section: python -Architecture: any -Depends: ${shlibs:Depends}, ${misc:Depends}, ${python:Depends} -Suggests: python-matplotlib -Description: python interface for aubio, a library for audio segmentation - aubio gathers a set of functions for audio signal segmentation and labelling. - The library contains a phase vocoder, onset and pitch detection functions, a - beat tracking algorithm and other sound processing utilities. - . - This package provides the aubio module for Python 2. - Package: python3-aubio Section: python Architecture: any Depends: ${shlibs:Depends}, ${misc:Depends}, ${python3:Depends} Suggests: python3-matplotlib -Description: python interface for aubio, a library for audio segmentation +Description: Python 3 interface for aubio, a library for audio segmentation aubio gathers a set of functions for audio signal segmentation and labelling. The library contains a phase vocoder, onset and pitch detection functions, a beat tracking algorithm and other sound processing utilities. 
diff --git a/debian/copyright b/debian/copyright index 64b3716..fe6229d 100644 --- a/debian/copyright +++ b/debian/copyright @@ -1,6 +1,6 @@ -Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ +Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ Upstream-Name: aubio -Source: http://aubio.org +Source: https://aubio.org Files: * Copyright: 2003-2013 Paul Brossier <piem@aubio.org> @@ -19,12 +19,12 @@ Copyright: 2003-2013 Paul Brossier <piem@aubio.org> 2004-2005 Mario Lang <mlang@delysid.org> License: GPL-3+ -Files: src/spec{tral/filterbank.c - src/spec{tral/filterbank.h - src/spec{tral/filterbank_mel.c - src/spec{tral/filterbank_mel.h - src/spec{tral/mfcc.c - src/spec{tral/mfcc.h +Files: src/spectral/filterbank.c + src/spectral/filterbank.h + src/spectral/filterbank_mel.c + src/spectral/filterbank_mel.h + src/spectral/mfcc.c + src/spectral/mfcc.h Copyright: 2007-2013 Paul Brossier <piem@aubio.org> 2007-2009 Amaury Hazan <ahazan@iua.upf.edu> License: GPL-3+ diff --git a/debian/libaubio-doc.doc-base b/debian/libaubio-doc.doc-base index 3ed027f..f0cbe00 100644 --- a/debian/libaubio-doc.doc-base +++ b/debian/libaubio-doc.doc-base @@ -1,10 +1,10 @@ Document: libaubio Title: Aubio Manual -Author: Paul Brossier <piem@altern.org> +Author: Paul Brossier <piem@aubio.org> Abstract: This is a programming manual for the libaubio library. 
Section: Programming Format: HTML -Index: /usr/share/doc/libaubio-doc/html/index.html -Files: /usr/share/doc/libaubio-doc/html/*.html +Index: /usr/share/doc/libaubio-dev/api/index.html +Files: /usr/share/doc/libaubio-dev/api/*.html diff --git a/debian/libaubio-doc.docs b/debian/libaubio-doc.docs index 2f2d0ec..c4dde58 100644 --- a/debian/libaubio-doc.docs +++ b/debian/libaubio-doc.docs @@ -1 +1 @@ -doc/web/html +usr/share/doc/libaubio-doc/api diff --git a/debian/libaubio-doc.links b/debian/libaubio-doc.links index 8338a73..6563044 100644 --- a/debian/libaubio-doc.links +++ b/debian/libaubio-doc.links @@ -1 +1 @@ -/usr/share/javascript/jquery/jquery.js /usr/share/doc/libaubio-doc/html/jquery.js +/usr/share/javascript/jquery/jquery.js /usr/share/doc/libaubio-dev/api/jquery.js diff --git a/debian/libaubio5.symbols b/debian/libaubio5.symbols new file mode 100644 index 0000000..c15db0a --- /dev/null +++ b/debian/libaubio5.symbols @@ -0,0 +1,486 @@ +libaubio.so.5 libaubio5 #MINVER# +* Build-Depends-Package: libaubio-dev + aubio_autocorr@Base 0.4.3 + aubio_beattracking_checkstate@Base 0.4.3 + aubio_beattracking_do@Base 0.4.3 + aubio_beattracking_get_bpm@Base 0.4.3 + aubio_beattracking_get_confidence@Base 0.4.3 + aubio_beattracking_get_period@Base 0.4.3 + aubio_beattracking_get_period_s@Base 0.4.3 + aubio_bintofreq@Base 0.4.3 + aubio_bintomidi@Base 0.4.3 + aubio_cleanup@Base 0.4.3 + aubio_db_spl@Base 0.4.3 + aubio_dct_do@Base 0.4.8 + aubio_dct_fftw_do@Base 0.4.8 + aubio_dct_fftw_rdo@Base 0.4.8 + aubio_dct_plain_do@Base 0.4.8 + aubio_dct_plain_rdo@Base 0.4.8 + aubio_dct_rdo@Base 0.4.8 + aubio_default_log@Base 0.4.5 + aubio_fft_do@Base 0.4.3 + aubio_fft_do_complex@Base 0.4.3 + aubio_fft_get_imag@Base 0.4.3 + aubio_fft_get_norm@Base 0.4.3 + aubio_fft_get_phas@Base 0.4.3 + aubio_fft_get_real@Base 0.4.3 + aubio_fft_get_realimag@Base 0.4.3 + aubio_fft_get_spectrum@Base 0.4.3 + aubio_fft_rdo@Base 0.4.3 + aubio_fft_rdo_complex@Base 0.4.3 + aubio_fftw_mutex@Base 0.4.3 + 
aubio_filter_do@Base 0.4.3 + aubio_filter_do_filtfilt@Base 0.4.3 + aubio_filter_do_outplace@Base 0.4.3 + aubio_filter_do_reset@Base 0.4.3 + aubio_filter_get_feedback@Base 0.4.3 + aubio_filter_get_feedforward@Base 0.4.3 + aubio_filter_get_order@Base 0.4.3 + aubio_filter_get_samplerate@Base 0.4.3 + aubio_filter_set_a_weighting@Base 0.4.3 + aubio_filter_set_biquad@Base 0.4.3 + aubio_filter_set_c_weighting@Base 0.4.3 + aubio_filter_set_samplerate@Base 0.4.3 + aubio_filterbank_do@Base 0.4.3 + aubio_filterbank_get_coeffs@Base 0.4.3 + aubio_filterbank_get_norm@Base 0.4.8 + aubio_filterbank_get_power@Base 0.4.8 + aubio_filterbank_set_coeffs@Base 0.4.3 + aubio_filterbank_set_mel_coeffs@Base 0.4.8 + aubio_filterbank_set_mel_coeffs_htk@Base 0.4.8 + aubio_filterbank_set_mel_coeffs_slaney@Base 0.4.3 + aubio_filterbank_set_norm@Base 0.4.8 + aubio_filterbank_set_power@Base 0.4.8 + aubio_filterbank_set_triangle_bands@Base 0.4.3 + aubio_freqtobin@Base 0.4.3 + aubio_freqtomidi@Base 0.4.3 + aubio_hist_do@Base 0.4.3 + aubio_hist_do_notnull@Base 0.4.3 + aubio_hist_dyn_notnull@Base 0.4.3 + aubio_hist_mean@Base 0.4.3 + aubio_hist_weight@Base 0.4.3 + aubio_hztomel@Base 0.4.8 + aubio_hztomel_htk@Base 0.4.8 + aubio_io_validate_channels@Base 0.4.5 + aubio_io_validate_samplerate@Base 0.4.5 + aubio_is_power_of_two@Base 0.4.3 + aubio_level_detection@Base 0.4.3 + aubio_level_lin@Base 0.4.3 + aubio_log@Base 0.4.5 + aubio_log_reset@Base 0.4.5 + aubio_log_set_function@Base 0.4.5 + aubio_log_set_level_function@Base 0.4.5 + aubio_meltohz@Base 0.4.8 + aubio_meltohz_htk@Base 0.4.8 + aubio_mfcc_do@Base 0.4.3 + aubio_mfcc_get_power@Base 0.4.8 + aubio_mfcc_get_scale@Base 0.4.8 + aubio_mfcc_set_mel_coeffs@Base 0.4.8 + aubio_mfcc_set_mel_coeffs_htk@Base 0.4.8 + aubio_mfcc_set_mel_coeffs_slaney@Base 0.4.8 + aubio_mfcc_set_power@Base 0.4.8 + aubio_mfcc_set_scale@Base 0.4.8 + aubio_miditobin@Base 0.4.3 + aubio_miditofreq@Base 0.4.3 + aubio_next_power_of_two@Base 0.4.3 + aubio_notes_do@Base 0.4.3 + 
aubio_notes_get_minioi_ms@Base 0.4.5
+ aubio_notes_get_release_drop@Base 0.4.8
+ aubio_notes_get_silence@Base 0.4.5
+ aubio_notes_set_minioi_ms@Base 0.4.5
+ aubio_notes_set_release_drop@Base 0.4.8
+ aubio_notes_set_silence@Base 0.4.5
+ aubio_onset_do@Base 0.4.3
+ aubio_onset_get_awhitening@Base 0.4.5
+ aubio_onset_get_compression@Base 0.4.5
+ aubio_onset_get_delay@Base 0.4.3
+ aubio_onset_get_delay_ms@Base 0.4.3
+ aubio_onset_get_delay_s@Base 0.4.3
+ aubio_onset_get_descriptor@Base 0.4.3
+ aubio_onset_get_last@Base 0.4.3
+ aubio_onset_get_last_ms@Base 0.4.3
+ aubio_onset_get_last_s@Base 0.4.3
+ aubio_onset_get_minioi@Base 0.4.3
+ aubio_onset_get_minioi_ms@Base 0.4.3
+ aubio_onset_get_minioi_s@Base 0.4.3
+ aubio_onset_get_silence@Base 0.4.3
+ aubio_onset_get_threshold@Base 0.4.3
+ aubio_onset_get_thresholded_descriptor@Base 0.4.3
+ aubio_onset_reset@Base 0.4.5
+ aubio_onset_set_awhitening@Base 0.4.5
+ aubio_onset_set_compression@Base 0.4.5
+ aubio_onset_set_default_parameters@Base 0.4.5
+ aubio_onset_set_delay@Base 0.4.3
+ aubio_onset_set_delay_ms@Base 0.4.3
+ aubio_onset_set_delay_s@Base 0.4.3
+ aubio_onset_set_minioi@Base 0.4.3
+ aubio_onset_set_minioi_ms@Base 0.4.3
+ aubio_onset_set_minioi_s@Base 0.4.3
+ aubio_onset_set_silence@Base 0.4.3
+ aubio_onset_set_threshold@Base 0.4.3
+ aubio_ooura_cdft@Base 0.4.5
+ aubio_ooura_ddct@Base 0.4.5
+ aubio_ooura_ddst@Base 0.4.5
+ aubio_ooura_dfct@Base 0.4.5
+ aubio_ooura_dfst@Base 0.4.5
+ aubio_ooura_rdft@Base 0.4.5
+ aubio_parameter_get_current_value@Base 0.4.3
+ aubio_parameter_get_max_value@Base 0.4.3
+ aubio_parameter_get_min_value@Base 0.4.3
+ aubio_parameter_get_next_value@Base 0.4.3
+ aubio_parameter_get_steps@Base 0.4.3
+ aubio_parameter_set_current_value@Base 0.4.3
+ aubio_parameter_set_max_value@Base 0.4.3
+ aubio_parameter_set_min_value@Base 0.4.3
+ aubio_parameter_set_steps@Base 0.4.3
+ aubio_parameter_set_target_value@Base 0.4.3
+ aubio_peakpicker_do@Base 0.4.3
+ aubio_peakpicker_get_threshold@Base 0.4.3
+ aubio_peakpicker_get_thresholded_input@Base 0.4.3
+ aubio_peakpicker_get_thresholdfn@Base 0.4.3
+ aubio_peakpicker_set_threshold@Base 0.4.3
+ aubio_peakpicker_set_thresholdfn@Base 0.4.3
+ aubio_pitch_do@Base 0.4.3
+ aubio_pitch_get_confidence@Base 0.4.3
+ aubio_pitch_get_silence@Base 0.4.3
+ aubio_pitch_get_tolerance@Base 0.4.5
+ aubio_pitch_set_silence@Base 0.4.3
+ aubio_pitch_set_tolerance@Base 0.4.3
+ aubio_pitch_set_unit@Base 0.4.3
+ aubio_pitch_slideblock@Base 0.4.3
+ aubio_pitchfcomb_do@Base 0.4.3
+ aubio_pitchmcomb_combdet@Base 0.4.3
+ aubio_pitchmcomb_do@Base 0.4.3
+ aubio_pitchmcomb_get_root_peak@Base 0.4.3
+ aubio_pitchmcomb_quadpick@Base 0.4.3
+ aubio_pitchmcomb_spectral_pp@Base 0.4.3
+ aubio_pitchschmitt_do@Base 0.4.3
+ aubio_pitchspecacf_do@Base 0.4.3
+ aubio_pitchspecacf_get_confidence@Base 0.4.3
+ aubio_pitchspecacf_get_tolerance@Base 0.4.3
+ aubio_pitchspecacf_set_tolerance@Base 0.4.3
+ aubio_pitchyin_do@Base 0.4.3
+ aubio_pitchyin_get_confidence@Base 0.4.3
+ aubio_pitchyin_get_tolerance@Base 0.4.3
+ aubio_pitchyin_set_tolerance@Base 0.4.3
+ aubio_pitchyinfast_do@Base 0.4.6
+ aubio_pitchyinfast_get_confidence@Base 0.4.6
+ aubio_pitchyinfast_get_tolerance@Base 0.4.6
+ aubio_pitchyinfast_set_tolerance@Base 0.4.6
+ aubio_pitchyinfft_do@Base 0.4.3
+ aubio_pitchyinfft_get_confidence@Base 0.4.3
+ aubio_pitchyinfft_get_tolerance@Base 0.4.3
+ aubio_pitchyinfft_set_tolerance@Base 0.4.3
+ aubio_power_of_two_order@Base 0.4.6
+ aubio_pvoc_do@Base 0.4.3
+ aubio_pvoc_get_hop@Base 0.4.8
+ aubio_pvoc_get_win@Base 0.4.8
+ aubio_pvoc_rdo@Base 0.4.3
+ aubio_pvoc_set_window@Base 0.4.6
+ aubio_quadfrac@Base 0.4.3
+ aubio_resampler_do@Base 0.4.3
+ aubio_sampler_do@Base 0.4.3
+ aubio_sampler_do_multi@Base 0.4.3
+ aubio_sampler_get_playing@Base 0.4.3
+ aubio_sampler_load@Base 0.4.3
+ aubio_sampler_play@Base 0.4.3
+ aubio_sampler_set_playing@Base 0.4.3
+ aubio_sampler_stop@Base 0.4.3
+ aubio_scale_do@Base 0.4.3
+ aubio_scale_set_limits@Base 0.4.3
+ aubio_schmittS16LE@Base 0.4.3
+ aubio_silence_detection@Base 0.4.3
+ aubio_sink_close@Base 0.4.3
+ aubio_sink_do@Base 0.4.3
+ aubio_sink_do_multi@Base 0.4.3
+ aubio_sink_get_channels@Base 0.4.3
+ aubio_sink_get_samplerate@Base 0.4.3
+ aubio_sink_preset_channels@Base 0.4.3
+ aubio_sink_preset_samplerate@Base 0.4.3
+ aubio_sink_sndfile_close@Base 0.4.3
+ aubio_sink_sndfile_do@Base 0.4.3
+ aubio_sink_sndfile_do_multi@Base 0.4.3
+ aubio_sink_sndfile_get_channels@Base 0.4.3
+ aubio_sink_sndfile_get_samplerate@Base 0.4.3
+ aubio_sink_sndfile_open@Base 0.4.3
+ aubio_sink_sndfile_preset_channels@Base 0.4.3
+ aubio_sink_sndfile_preset_samplerate@Base 0.4.3
+ aubio_sink_validate_input_channels@Base 0.4.9
+ aubio_sink_validate_input_length@Base 0.4.9
+ aubio_sink_wavwrite_close@Base 0.4.3
+ aubio_sink_wavwrite_do@Base 0.4.3
+ aubio_sink_wavwrite_do_multi@Base 0.4.3
+ aubio_sink_wavwrite_get_channels@Base 0.4.3
+ aubio_sink_wavwrite_get_samplerate@Base 0.4.3
+ aubio_sink_wavwrite_open@Base 0.4.3
+ aubio_sink_wavwrite_preset_channels@Base 0.4.3
+ aubio_sink_wavwrite_preset_samplerate@Base 0.4.3
+ aubio_source_avcodec_close@Base 0.4.3
+ aubio_source_avcodec_do@Base 0.4.3
+ aubio_source_avcodec_do_multi@Base 0.4.3
+ aubio_source_avcodec_get_channels@Base 0.4.3
+ aubio_source_avcodec_get_duration@Base 0.4.3
+ aubio_source_avcodec_get_samplerate@Base 0.4.3
+ aubio_source_avcodec_has_network_url@Base 0.4.3
+ aubio_source_avcodec_readframe@Base 0.4.3
+ aubio_source_avcodec_reset_resampler@Base 0.4.3
+ aubio_source_avcodec_seek@Base 0.4.3
+ aubio_source_close@Base 0.4.3
+ aubio_source_do@Base 0.4.3
+ aubio_source_do_multi@Base 0.4.3
+ aubio_source_get_channels@Base 0.4.3
+ aubio_source_get_duration@Base 0.4.3
+ aubio_source_get_samplerate@Base 0.4.3
+ aubio_source_pad_multi_output@Base 0.4.9
+ aubio_source_pad_output@Base 0.4.9
+ aubio_source_seek@Base 0.4.3
+ aubio_source_sndfile_close@Base 0.4.3
+ aubio_source_sndfile_do@Base 0.4.3
+ aubio_source_sndfile_do_multi@Base 0.4.3
+ aubio_source_sndfile_get_channels@Base 0.4.3
+ aubio_source_sndfile_get_duration@Base 0.4.3
+ aubio_source_sndfile_get_samplerate@Base 0.4.3
+ aubio_source_sndfile_seek@Base 0.4.3
+ aubio_source_validate_input_channels@Base 0.4.9
+ aubio_source_validate_input_length@Base 0.4.9
+ aubio_source_wavread_close@Base 0.4.3
+ aubio_source_wavread_do@Base 0.4.3
+ aubio_source_wavread_do_multi@Base 0.4.3
+ aubio_source_wavread_get_channels@Base 0.4.3
+ aubio_source_wavread_get_duration@Base 0.4.3
+ aubio_source_wavread_get_samplerate@Base 0.4.3
+ aubio_source_wavread_readframe@Base 0.4.3
+ aubio_source_wavread_seek@Base 0.4.3
+ aubio_specdesc_centroid@Base 0.4.3
+ aubio_specdesc_complex@Base 0.4.3
+ aubio_specdesc_decrease@Base 0.4.3
+ aubio_specdesc_do@Base 0.4.3
+ aubio_specdesc_energy@Base 0.4.3
+ aubio_specdesc_hfc@Base 0.4.3
+ aubio_specdesc_kl@Base 0.4.3
+ aubio_specdesc_kurtosis@Base 0.4.3
+ aubio_specdesc_mkl@Base 0.4.3
+ aubio_specdesc_phase@Base 0.4.3
+ aubio_specdesc_rolloff@Base 0.4.3
+ aubio_specdesc_skewness@Base 0.4.3
+ aubio_specdesc_slope@Base 0.4.3
+ aubio_specdesc_specdiff@Base 0.4.3
+ aubio_specdesc_specflux@Base 0.4.3
+ aubio_specdesc_spread@Base 0.4.3
+ aubio_specdesc_wphase@Base 0.4.5
+ aubio_spectral_whitening_do@Base 0.4.5
+ aubio_spectral_whitening_get_floor@Base 0.4.5
+ aubio_spectral_whitening_get_relax_time@Base 0.4.5
+ aubio_spectral_whitening_reset@Base 0.4.5
+ aubio_spectral_whitening_set_floor@Base 0.4.5
+ aubio_spectral_whitening_set_relax_time@Base 0.4.5
+ aubio_tempo_do@Base 0.4.3
+ aubio_tempo_get_bpm@Base 0.4.3
+ aubio_tempo_get_confidence@Base 0.4.3
+ aubio_tempo_get_delay@Base 0.4.3
+ aubio_tempo_get_delay_ms@Base 0.4.3
+ aubio_tempo_get_delay_s@Base 0.4.3
+ aubio_tempo_get_last@Base 0.4.3
+ aubio_tempo_get_last_ms@Base 0.4.3
+ aubio_tempo_get_last_s@Base 0.4.3
+ aubio_tempo_get_last_tatum@Base 0.4.3
+ aubio_tempo_get_period@Base 0.4.3
+ aubio_tempo_get_period_s@Base 0.4.3
+ aubio_tempo_get_silence@Base 0.4.3
+ aubio_tempo_get_threshold@Base 0.4.3
+ aubio_tempo_set_delay@Base 0.4.3
+ aubio_tempo_set_delay_ms@Base 0.4.3
+ aubio_tempo_set_delay_s@Base 0.4.3
+ aubio_tempo_set_silence@Base 0.4.3
+ aubio_tempo_set_tatum_signature@Base 0.4.3
+ aubio_tempo_set_threshold@Base 0.4.3
+ aubio_tempo_was_tatum@Base 0.4.3
+ aubio_tss_do@Base 0.4.3
+ aubio_tss_set_alpha@Base 0.4.3
+ aubio_tss_set_beta@Base 0.4.3
+ aubio_tss_set_threshold@Base 0.4.3
+ aubio_unwrap2pi@Base 0.4.3
+ aubio_wavetable_do@Base 0.4.3
+ aubio_wavetable_do_multi@Base 0.4.3
+ aubio_wavetable_get_amp@Base 0.4.3
+ aubio_wavetable_get_freq@Base 0.4.3
+ aubio_wavetable_get_playing@Base 0.4.3
+ aubio_wavetable_load@Base 0.4.8
+ aubio_wavetable_play@Base 0.4.3
+ aubio_wavetable_set_amp@Base 0.4.3
+ aubio_wavetable_set_freq@Base 0.4.3
+ aubio_wavetable_set_playing@Base 0.4.3
+ aubio_wavetable_stop@Base 0.4.3
+ aubio_zero_crossing_rate@Base 0.4.3
+ cvec_centroid@Base 0.4.3
+ cvec_copy@Base 0.4.3
+ cvec_logmag@Base 0.4.5
+ cvec_mean@Base 0.4.3
+ cvec_moment@Base 0.4.3
+ cvec_norm_get_data@Base 0.4.3
+ cvec_norm_get_sample@Base 0.4.3
+ cvec_norm_ones@Base 0.4.3
+ cvec_norm_set_all@Base 0.4.3
+ cvec_norm_set_sample@Base 0.4.3
+ cvec_norm_zeros@Base 0.4.3
+ cvec_phas_get_data@Base 0.4.3
+ cvec_phas_get_sample@Base 0.4.3
+ cvec_phas_ones@Base 0.4.3
+ cvec_phas_set_all@Base 0.4.3
+ cvec_phas_set_sample@Base 0.4.3
+ cvec_phas_zeros@Base 0.4.3
+ cvec_print@Base 0.4.3
+ cvec_sum@Base 0.4.3
+ cvec_zeros@Base 0.4.3
+ del_aubio_beattracking@Base 0.4.3
+ del_aubio_dct@Base 0.4.8
+ del_aubio_dct_fftw@Base 0.4.8
+ del_aubio_dct_plain@Base 0.4.8
+ del_aubio_fft@Base 0.4.3
+ del_aubio_filter@Base 0.4.3
+ del_aubio_filterbank@Base 0.4.3
+ del_aubio_hist@Base 0.4.3
+ del_aubio_mfcc@Base 0.4.3
+ del_aubio_notes@Base 0.4.3
+ del_aubio_onset@Base 0.4.3
+ del_aubio_parameter@Base 0.4.3
+ del_aubio_peakpicker@Base 0.4.3
+ del_aubio_pitch@Base 0.4.3
+ del_aubio_pitchfcomb@Base 0.4.3
+ del_aubio_pitchmcomb@Base 0.4.3
+ del_aubio_pitchschmitt@Base 0.4.3
+ del_aubio_pitchspecacf@Base 0.4.3
+ del_aubio_pitchyin@Base 0.4.3
+ del_aubio_pitchyinfast@Base 0.4.6
+ del_aubio_pitchyinfft@Base 0.4.3
+ del_aubio_pvoc@Base 0.4.3
+ del_aubio_resampler@Base 0.4.3
+ del_aubio_sampler@Base 0.4.3
+ del_aubio_scale@Base 0.4.3
+ del_aubio_sink@Base 0.4.3
+ del_aubio_sink_sndfile@Base 0.4.3
+ del_aubio_sink_wavwrite@Base 0.4.3
+ del_aubio_source@Base 0.4.3
+ del_aubio_source_avcodec@Base 0.4.3
+ del_aubio_source_sndfile@Base 0.4.3
+ del_aubio_source_wavread@Base 0.4.3
+ del_aubio_specdesc@Base 0.4.3
+ del_aubio_spectral_whitening@Base 0.4.5
+ del_aubio_tempo@Base 0.4.3
+ del_aubio_tss@Base 0.4.3
+ del_aubio_wavetable@Base 0.4.3
+ del_cvec@Base 0.4.3
+ del_fmat@Base 0.4.3
+ del_fvec@Base 0.4.3
+ del_lvec@Base 0.4.3
+ fmat_copy@Base 0.4.3
+ fmat_get_channel@Base 0.4.3
+ fmat_get_channel_data@Base 0.4.3
+ fmat_get_data@Base 0.4.3
+ fmat_get_sample@Base 0.4.3
+ fmat_ones@Base 0.4.3
+ fmat_print@Base 0.4.3
+ fmat_rev@Base 0.4.3
+ fmat_set@Base 0.4.3
+ fmat_set_sample@Base 0.4.3
+ fmat_vecmul@Base 0.4.3
+ fmat_weight@Base 0.4.3
+ fmat_zeros@Base 0.4.3
+ fvec_abs@Base 0.4.3
+ fvec_adapt_thres@Base 0.4.3
+ fvec_add@Base 0.4.3
+ fvec_alpha_norm@Base 0.4.3
+ fvec_alpha_normalise@Base 0.4.3
+ fvec_ceil@Base 0.4.3
+ fvec_clamp@Base 0.4.5
+ fvec_copy@Base 0.4.3
+ fvec_cos@Base 0.4.3
+ fvec_exp@Base 0.4.3
+ fvec_floor@Base 0.4.3
+ fvec_get_data@Base 0.4.3
+ fvec_get_sample@Base 0.4.3
+ fvec_gettimesig@Base 0.4.3
+ fvec_ishift@Base 0.4.3
+ fvec_local_hfc@Base 0.4.3
+ fvec_log10@Base 0.4.3
+ fvec_log@Base 0.4.3
+ fvec_max@Base 0.4.3
+ fvec_max_elem@Base 0.4.3
+ fvec_mean@Base 0.4.3
+ fvec_median@Base 0.4.3
+ fvec_min@Base 0.4.3
+ fvec_min_elem@Base 0.4.3
+ fvec_min_removal@Base 0.4.3
+ fvec_moving_thres@Base 0.4.3
+ fvec_mul@Base 0.4.8
+ fvec_ones@Base 0.4.3
+ fvec_peakpick@Base 0.4.3
+ fvec_pow@Base 0.4.3
+ fvec_print@Base 0.4.3
+ fvec_push@Base 0.4.5
+ fvec_quadratic_peak_mag@Base 0.4.3
+ fvec_quadratic_peak_pos@Base 0.4.3
+ fvec_rev@Base 0.4.3
+ fvec_round@Base 0.4.3
+ fvec_set_all@Base 0.4.3
+ fvec_set_sample@Base 0.4.3
+ fvec_set_window@Base 0.4.3
+ fvec_shift@Base 0.4.3
+ fvec_sin@Base 0.4.3
+ fvec_sqrt@Base 0.4.3
+ fvec_sum@Base 0.4.3
+ fvec_weight@Base 0.4.3
+ fvec_weighted_copy@Base 0.4.3
+ fvec_zeros@Base 0.4.3
+ lvec_get_data@Base 0.4.3
+ lvec_get_sample@Base 0.4.3
+ lvec_ones@Base 0.4.3
+ lvec_print@Base 0.4.3
+ lvec_set_all@Base 0.4.3
+ lvec_set_sample@Base 0.4.3
+ lvec_zeros@Base 0.4.3
+ new_aubio_beattracking@Base 0.4.3
+ new_aubio_dct@Base 0.4.8
+ new_aubio_dct_fftw@Base 0.4.8
+ new_aubio_dct_plain@Base 0.4.8
+ new_aubio_fft@Base 0.4.3
+ new_aubio_filter@Base 0.4.3
+ new_aubio_filter_a_weighting@Base 0.4.3
+ new_aubio_filter_biquad@Base 0.4.3
+ new_aubio_filter_c_weighting@Base 0.4.3
+ new_aubio_filterbank@Base 0.4.3
+ new_aubio_hist@Base 0.4.3
+ new_aubio_mfcc@Base 0.4.3
+ new_aubio_notes@Base 0.4.3
+ new_aubio_onset@Base 0.4.3
+ new_aubio_parameter@Base 0.4.3
+ new_aubio_peakpicker@Base 0.4.3
+ new_aubio_pitch@Base 0.4.3
+ new_aubio_pitchfcomb@Base 0.4.3
+ new_aubio_pitchmcomb@Base 0.4.3
+ new_aubio_pitchschmitt@Base 0.4.3
+ new_aubio_pitchspecacf@Base 0.4.3
+ new_aubio_pitchyin@Base 0.4.3
+ new_aubio_pitchyinfast@Base 0.4.6
+ new_aubio_pitchyinfft@Base 0.4.3
+ new_aubio_pvoc@Base 0.4.3
+ new_aubio_resampler@Base 0.4.3
+ new_aubio_sampler@Base 0.4.3
+ new_aubio_scale@Base 0.4.3
+ new_aubio_sink@Base 0.4.3
+ new_aubio_sink_sndfile@Base 0.4.3
+ new_aubio_sink_wavwrite@Base 0.4.3
+ new_aubio_source@Base 0.4.3
+ new_aubio_source_avcodec@Base 0.4.3
+ new_aubio_source_sndfile@Base 0.4.3
+ new_aubio_source_wavread@Base 0.4.3
+ new_aubio_specdesc@Base 0.4.3
+ new_aubio_spectral_whitening@Base 0.4.5
+ new_aubio_tempo@Base 0.4.3
+ new_aubio_tss@Base 0.4.3
+ new_aubio_wavetable@Base 0.4.3
+ new_aubio_window@Base 0.4.3
+ new_cvec@Base 0.4.3
+ new_fmat@Base 0.4.3
+ new_fvec@Base 0.4.3
+ new_lvec@Base 0.4.3
diff --git a/debian/patches/alpha_norm-88c89e3.diff
b/debian/patches/alpha_norm-88c89e3.diff deleted file mode 100644 index bb2fea3..0000000 --- a/debian/patches/alpha_norm-88c89e3.diff +++ /dev/null @@ -1,19 +0,0 @@ -commit 88c89e3b2981c430d61079c9517fa37407ba2f58 -Author: Paul Brossier <piem@piem.org> -Date: Thu Sep 22 13:46:16 2016 +0200 - - python/tests/test_fvec.py: reduce alpha norm precision to 10.-4 - -diff --git a/python/tests/test_fvec.py b/python/tests/test_fvec.py -index 4ea5533..4e50f0f 100755 ---- a/python/tests/test_fvec.py -+++ b/python/tests/test_fvec.py -@@ -98,7 +98,7 @@ class aubio_alpha_norm(TestCase): - x = np.random.rand(1024).astype(float_type) - alpha = np.random.rand() * 5. - x_alpha_norm = (np.sum(np.abs(x)**alpha)/len(x))**(1/alpha) -- assert_almost_equal(alpha_norm(x, alpha), x_alpha_norm, decimal = 5) -+ assert_almost_equal(alpha_norm(x, alpha), x_alpha_norm, decimal = 4) - - class aubio_zero_crossing_rate_test(TestCase): - diff --git a/debian/patches/avcodec_update.diff b/debian/patches/avcodec_update.diff deleted file mode 100644 index c26eac2..0000000 --- a/debian/patches/avcodec_update.diff +++ /dev/null @@ -1,91 +0,0 @@ -diff --git a/src/io/source_avcodec.c b/src/io/source_avcodec.c -index a4cbf6d..faf8015 100644 ---- a/src/io/source_avcodec.c -+++ b/src/io/source_avcodec.c -@@ -150,7 +150,11 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa - uint_t i; - sint_t selected_stream = -1; - for (i = 0; i < avFormatCtx->nb_streams; i++) { -+#if FF_API_LAVF_AVCTX -+ if (avFormatCtx->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_AUDIO) { -+#else - if (avFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO) { -+#endif - if (selected_stream == -1) { - selected_stream = i; - } else { -@@ -167,13 +171,39 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa - s->selected_stream = selected_stream; - - AVCodecContext *avCodecCtx = s->avCodecCtx; -+#if FF_API_LAVF_AVCTX -+ AVCodecParameters *codecpar = 
avFormatCtx->streams[selected_stream]->codecpar; -+ if (codecpar == NULL) { -+ AUBIO_ERR("source_avcodec: Could not find decoder for %s", s->path); -+ goto beach; -+ } -+ AVCodec *codec = avcodec_find_decoder(codecpar->codec_id); -+ -+ /* Allocate a codec context for the decoder */ -+ avCodecCtx = avcodec_alloc_context3(codec); -+ if (!avCodecCtx) { -+ AUBIO_ERR("source_avcodec: Failed to allocate the %s codec context for path %s\n", -+ av_get_media_type_string(AVMEDIA_TYPE_AUDIO), s->path); -+ goto beach; -+ } -+#else - avCodecCtx = avFormatCtx->streams[selected_stream]->codec; - AVCodec *codec = avcodec_find_decoder(avCodecCtx->codec_id); -+#endif - if (codec == NULL) { - AUBIO_ERR("source_avcodec: Could not find decoder for %s", s->path); - goto beach; - } - -+#if FF_API_LAVF_AVCTX -+ /* Copy codec parameters from input stream to output codec context */ -+ if ((err = avcodec_parameters_to_context(avCodecCtx, codecpar)) < 0) { -+ AUBIO_ERR("source_avcodec: Failed to copy %s codec parameters to decoder context for %s\n", -+ av_get_media_type_string(AVMEDIA_TYPE_AUDIO), s->path); -+ goto beach; -+ } -+#endif -+ - if ( ( err = avcodec_open2(avCodecCtx, codec, NULL) ) < 0) { - char errorstr[256]; - av_strerror (err, errorstr, sizeof(errorstr)); -@@ -290,12 +320,34 @@ void aubio_source_avcodec_readframe(aubio_source_avcodec_t *s, uint_t * read_sam - } while (avPacket.stream_index != s->selected_stream); - - int got_frame = 0; -+#if FF_API_LAVF_AVCTX -+ int ret = avcodec_send_packet(avCodecCtx, &avPacket); -+ if (ret < 0 && ret != AVERROR_EOF) { -+ AUBIO_ERR("source_avcodec: error when sending packet for %s\n", s->path); -+ goto beach; -+ } -+ ret = avcodec_receive_frame(avCodecCtx, avFrame); -+ if (ret >= 0) { -+ got_frame = 1; -+ } -+ if (ret < 0) { -+ if (ret == AVERROR(EAGAIN)) { -+ AUBIO_WRN("source_avcodec: output is not available right now - user must try to send new input\n"); -+ } else if (ret == AVERROR_EOF) { -+ AUBIO_WRN("source_avcodec: the decoder has 
been fully flushed, and there will be no more output frames\n"); -+ } else { -+ AUBIO_ERR("source_avcodec: decoding errors on %s\n", s->path); -+ goto beach; -+ } -+ } -+#else - int len = avcodec_decode_audio4(avCodecCtx, avFrame, &got_frame, &avPacket); - - if (len < 0) { - AUBIO_ERR("Error while decoding %s\n", s->path); - goto beach; - } -+#endif - if (got_frame == 0) { - //AUBIO_ERR("Could not get frame for (%s)\n", s->path); - goto beach; diff --git a/debian/patches/cleaner_clean.diff b/debian/patches/cleaner_clean.diff deleted file mode 100644 index 4ac8bae..0000000 --- a/debian/patches/cleaner_clean.diff +++ /dev/null @@ -1,15 +0,0 @@ -diff --git a/python/lib/moresetuptools.py b/python/lib/moresetuptools.py -index 906b871..6318c1e 100644 ---- a/python/lib/moresetuptools.py -+++ b/python/lib/moresetuptools.py -@@ -117,7 +117,9 @@ def add_system_aubio(ext): - - class CleanGenerated(distutils.command.clean.clean): - def run(self): -- distutils.dir_util.remove_tree(output_path) -+ if os.path.isdir(output_path): -+ distutils.dir_util.remove_tree(output_path) -+ config = os.path.join('python', 'ext', 'config.h') - distutils.command.clean.clean.run(self) - - class GenerateCommand(distutils.cmd.Command): diff --git a/debian/patches/fixi386.patch b/debian/patches/fixi386.patch new file mode 100644 index 0000000..a3f6097 --- /dev/null +++ b/debian/patches/fixi386.patch @@ -0,0 +1,77 @@ +Description: relax precision for tests to pass on i386 + This patch include upstream commits 5039244..099237f to relax precision + tests so that they pass on i386. 
+Author: Paul Brossier <piem@debian.org> +Forwarded: not-needed +Last-Update: 2019-06-21 + +diff --git a/python/tests/test_hztomel.py b/python/tests/test_hztomel.py +index fcd8fa1d..a1f4f8e9 100755 +--- a/python/tests/test_hztomel.py ++++ b/python/tests/test_hztomel.py +@@ -4,23 +4,28 @@ from unittest import main + from numpy.testing import TestCase + from numpy.testing import assert_equal, assert_almost_equal + from _tools import assert_warns ++from utils import is32bit + import numpy as np + import aubio + + from aubio import hztomel, meltohz + from aubio import hztomel_htk, meltohz_htk + +- + class aubio_hztomel_test_case(TestCase): + + def test_hztomel(self): + assert_equal(hztomel(0.), 0.) + assert_almost_equal(hztomel(400. / 3.), 2., decimal=5) + assert_almost_equal(hztomel(1000. / 3), 5.) +- assert_equal(hztomel(200.), 3.) ++ # on 32bit, some of these tests fails unless compiling with -ffloat-store ++ try: ++ assert_equal(hztomel(200.), 3.) ++ except AssertionError: ++ if not is32bit(): raise ++ assert_almost_equal(hztomel(200.), 3., decimal=5) + assert_almost_equal(hztomel(1000.), 15) +- assert_almost_equal(hztomel(6400), 42) +- assert_almost_equal(hztomel(40960), 69) ++ assert_almost_equal(hztomel(6400), 42, decimal=5) ++ assert_almost_equal(hztomel(40960), 69, decimal=5) + + for m in np.linspace(0, 1000, 100): + assert_almost_equal(hztomel(meltohz(m)) - m, 0, decimal=3) +@@ -28,7 +33,11 @@ class aubio_hztomel_test_case(TestCase): + def test_meltohz(self): + assert_equal(meltohz(0.), 0.) + assert_almost_equal(meltohz(2), 400. / 3., decimal=4) +- assert_equal(meltohz(3.), 200.) ++ try: ++ assert_equal(meltohz(3.), 200.) ++ except AssertionError: ++ if not is32bit(): raise ++ assert_almost_equal(meltohz(3.), 200., decimal=5) + assert_almost_equal(meltohz(5), 1000. 
/ 3., decimal=4) + assert_almost_equal(meltohz(15), 1000., decimal=4) + assert_almost_equal(meltohz(42), 6400., decimal=2) +diff --git a/python/tests/utils.py b/python/tests/utils.py +index 4b414883..76064042 100644 +--- a/python/tests/utils.py ++++ b/python/tests/utils.py +@@ -3,11 +3,15 @@ + import os + import re + import glob ++import struct + import numpy as np + from tempfile import mkstemp + + DEFAULT_SOUND = '22050Hz_5s_brownnoise.wav' + ++def is32bit(): ++ return struct.calcsize("P") * 8 == 32 ++ + def array_from_text_file(filename, dtype = 'float'): + realpathname = os.path.join(os.path.dirname(__file__), filename) + return np.loadtxt(realpathname, dtype = dtype) diff --git a/debian/patches/fixpowerpc.patch b/debian/patches/fixpowerpc.patch new file mode 100644 index 0000000..5bd35c3 --- /dev/null +++ b/debian/patches/fixpowerpc.patch @@ -0,0 +1,28 @@ +commit c60e048f3aba852710b9763a6fb24adad2017f40 +Author: Paul Brossier <piem@piem.org> +Date: Thu Jun 20 19:38:39 2019 +0200 + + [py] fix pvoc tests on powerpc + +diff --git a/python/tests/test_phasevoc.py b/python/tests/test_phasevoc.py +index cf3b7ac8..b228269e 100755 +--- a/python/tests/test_phasevoc.py ++++ b/python/tests/test_phasevoc.py +@@ -1,7 +1,7 @@ + #! /usr/bin/env python + + from numpy.testing import TestCase, assert_equal, assert_array_less +-from _tools import parametrize ++from _tools import parametrize, skipTest + from aubio import fvec, cvec, pvoc, float_type + import numpy as np + +@@ -51,7 +51,7 @@ class Test_aubio_pvoc_test_case(object): + assert_equal (s.phas[s.phas > 0], +np.pi) + assert_equal (s.phas[s.phas < 0], -np.pi) + assert_equal (np.abs(s.phas[np.abs(s.phas) != np.pi]), 0) +- self.skipTest('pvoc(fvec(%d)).phas != +0, ' % win_s \ ++ skipTest('pvoc(fvec(%d)).phas != +0, ' % win_s \ + + 'This is expected when using fftw3 on powerpc.') + assert_equal ( r, 0.) 
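The `is32bit()` helper that fixi386.patch adds to `python/tests/utils.py` detects the interpreter's pointer width so that exact-equality assertions can be relaxed only on 32-bit builds. A minimal standalone sketch of the same technique (the `print` demo line is illustrative, not part of the patch):

```python
import struct

def is32bit():
    # struct.calcsize("P") is the size of a C pointer in bytes:
    # 4 on a 32-bit interpreter, 8 on a 64-bit one.
    return struct.calcsize("P") * 8 == 32

print("32-bit interpreter:", is32bit())
```

A test can then fall back to an approximate comparison when `is32bit()` is true, as the patch does for `hztomel(200.)`, instead of skipping the assertion entirely.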
+ diff --git a/debian/patches/fixtests-5d8cc71.patch b/debian/patches/fixtests-5d8cc71.patch deleted file mode 100644 index 065001d..0000000 --- a/debian/patches/fixtests-5d8cc71.patch +++ /dev/null @@ -1,26 +0,0 @@ -diff --git a/python/tests/utils.py b/python/tests/utils.py -index 2e3c31e..b0963fc 100644 ---- a/python/tests/utils.py -+++ b/python/tests/utils.py -@@ -5,6 +5,8 @@ import glob - import numpy as np - from tempfile import mkstemp - -+DEFAULT_SOUND = '22050Hz_5s_brownnoise.wav' -+ - def array_from_text_file(filename, dtype = 'float'): - filename = os.path.join(os.path.dirname(__file__), filename) - with open(filename) as f: -@@ -21,7 +23,11 @@ def get_default_test_sound(TestCase, rel_dir = 'sounds'): - if len(all_sounds) == 0: - TestCase.skipTest("please add some sounds in \'python/tests/sounds\'") - else: -- return all_sounds[0] -+ default_sound = all_sounds[0] -+ if DEFAULT_SOUND in map(os.path.basename, all_sounds): -+ while os.path.basename(default_sound) != DEFAULT_SOUND: -+ default_sound = all_sounds.pop(0) -+ return default_sound - - def get_tmp_sink_path(): - fd, path = mkstemp() diff --git a/debian/patches/fixtypos.patch b/debian/patches/fixtypos.patch new file mode 100644 index 0000000..b60d9c5 --- /dev/null +++ b/debian/patches/fixtypos.patch @@ -0,0 +1,45 @@ +Description: fix typos + This patch includes upstream commits 98927ca..8e84c46 to fix typos. 
+Author: Paul Brossier <piem@debian.org> +Forwarded: not-needed +Last-Update: 2019-06-20 + +diff --git a/python/ext/aubio-docstrings.h b/python/ext/aubio-docstrings.h +index 2929ee12..26cada03 100644 +--- a/python/ext/aubio-docstrings.h ++++ b/python/ext/aubio-docstrings.h +@@ -1,7 +1,7 @@ + #define PYAUBIO_dct_doc \ + "dct(size=1024)\n"\ + "\n"\ +- "Compute Discrete Fourier Transorms of Type-II.\n"\ ++ "Compute Discrete Fourier Transforms of Type-II.\n"\ + "\n"\ + "Parameters\n"\ + "----------\n"\ +diff --git a/python/ext/py-fft.c b/python/ext/py-fft.c +index a08af4e7..53dfbbfd 100644 +--- a/python/ext/py-fft.c ++++ b/python/ext/py-fft.c +@@ -3,7 +3,7 @@ + static char Py_fft_doc[] = "" + "fft(size=1024)\n" + "\n" +-"Compute Fast Fourier Transorms.\n" ++"Compute Fast Fourier Transforms.\n" + "\n" + "Parameters\n" + "----------\n" +diff --git a/python/ext/py-sink.c b/python/ext/py-sink.c +index 4fc514f2..83c7ddd1 100644 +--- a/python/ext/py-sink.c ++++ b/python/ext/py-sink.c +@@ -81,7 +81,7 @@ static char Py_sink_close_doc[] = "" + "Close this sink now.\n" + "\n" + "By default, the sink will be closed before being deleted.\n" +-"Explicitely closing a sink can be useful to control the number\n" ++"Explicitly closing a sink can be useful to control the number\n" + "of files simultaneously opened.\n" + ""; + diff --git a/debian/patches/local_mathjax.patch b/debian/patches/local_mathjax.patch index ae2deba..037051a 100644 --- a/debian/patches/local_mathjax.patch +++ b/debian/patches/local_mathjax.patch @@ -7,11 +7,11 @@ Last-Update: 2013-12-29 --- a/doc/web.cfg +++ b/doc/web.cfg -@@ -1483,7 +1483,7 @@ +@@ -1526,7 +1526,7 @@ # The default value is: http://cdn.mathjax.org/mathjax/latest. # This tag requires that the tag USE_MATHJAX is set to YES. 
--MATHJAX_RELPATH = http://cdn.mathjax.org/mathjax/latest +-MATHJAX_RELPATH = https://cdn.mathjax.org/mathjax/latest +MATHJAX_RELPATH = file:///usr/share/javascript/mathjax # The MATHJAX_EXTENSIONS tag can be used to specify one or more MathJax diff --git a/debian/patches/pytest_local_import.diff b/debian/patches/pytest_local_import.diff deleted file mode 100644 index 1b80e52..0000000 --- a/debian/patches/pytest_local_import.diff +++ /dev/null @@ -1,72 +0,0 @@ ---- a/python/tests/eval_pitch -+++ b/python/tests/eval_pitch -@@ -24,7 +24,7 @@ - import time - import os.path - import numpy --from utils import array_from_text_file, array_from_yaml_file -+from .utils import array_from_text_file, array_from_yaml_file - from aubio import source, pitch, freqtomidi - - start = time.time() ---- a/python/tests/test_filter.py -+++ b/python/tests/test_filter.py -@@ -3,7 +3,7 @@ - from unittest import main - from numpy.testing import TestCase, assert_equal, assert_almost_equal - from aubio import fvec, digital_filter --from utils import array_from_text_file -+from .utils import array_from_text_file - - class aubio_filter_test_case(TestCase): - ---- a/python/tests/test_filterbank.py -+++ b/python/tests/test_filterbank.py -@@ -5,7 +5,7 @@ - from numpy.testing import assert_equal, assert_almost_equal - import numpy as np - from aubio import cvec, filterbank, float_type --from utils import array_from_text_file -+from .utils import array_from_text_file - - class aubio_filterbank_test_case(TestCase): - ---- a/python/tests/test_sink.py -+++ b/python/tests/test_sink.py -@@ -4,7 +4,7 @@ - from nose2.tools import params - from numpy.testing import TestCase - from aubio import fvec, source, sink --from utils import list_all_sounds, get_tmp_sink_path, del_tmp_sink_path -+from .utils import list_all_sounds, get_tmp_sink_path, del_tmp_sink_path - - list_of_sounds = list_all_sounds('sounds') - samplerates = [0, 44100, 8000, 32000] ---- a/python/tests/test_slicing.py -+++ 
b/python/tests/test_slicing.py -@@ -3,8 +3,8 @@ - from unittest import main - from numpy.testing import TestCase, assert_equal - from aubio import slice_source_at_stamps --from utils import count_files_in_directory, get_default_test_sound --from utils import count_samples_in_directory, count_samples_in_file -+from .utils import count_files_in_directory, get_default_test_sound -+from .utils import count_samples_in_directory, count_samples_in_file - - import tempfile - import shutil ---- a/python/tests/test_source.py -+++ b/python/tests/test_source.py -@@ -4,7 +4,7 @@ - from nose2.tools import params - from numpy.testing import TestCase - from aubio import source --from utils import list_all_sounds -+from .utils import list_all_sounds - - list_of_sounds = list_all_sounds('sounds') - samplerates = [0, 44100, 8000, 32000] ---- /dev/null -+++ b/python/tests/__init__.py -@@ -0,0 +1 @@ -+ diff --git a/debian/patches/series b/debian/patches/series index 11190f9..106a693 100644 --- a/debian/patches/series +++ b/debian/patches/series @@ -1,9 +1,5 @@ local_mathjax.patch -fixtests-5d8cc71.patch -skip_ppc64.diff -sort_pysrc.diff -alpha_norm-88c89e3.diff -avcodec_update.diff -waf_cstlib_inst-8698499.diff -pytest_local_import.diff -cleaner_clean.diff +fixtypos.patch +fixpowerpc.patch +fixi386.patch +wscript_py3.patch diff --git a/debian/patches/skip_ppc64.diff b/debian/patches/skip_ppc64.diff deleted file mode 100644 index e730a22..0000000 --- a/debian/patches/skip_ppc64.diff +++ /dev/null @@ -1,40 +0,0 @@ -diff --git a/python/tests/test_fft.py b/python/tests/test_fft.py -index fa349e5..a8f82b9 100755 ---- a/python/tests/test_fft.py -+++ b/python/tests/test_fft.py -@@ -33,7 +33,14 @@ class aubio_fft_test_case(TestCase): - f = fft (win_s) - fftgrain = f (timegrain) - assert_equal ( fftgrain.norm, 0 ) -- assert_equal ( fftgrain.phas, 0 ) -+ try: -+ assert_equal ( fftgrain.phas, 0 ) -+ except AssertionError: -+ assert_equal (fftgrain.phas[fftgrain.phas > 0], +pi) -+ assert_equal 
(fftgrain.phas[fftgrain.phas < 0], -pi) -+ assert_equal (np.abs(fftgrain.phas[np.abs(fftgrain.phas) != pi]), 0) -+ self.skipTest('fft(fvec(%d)).phas != +0, ' % win_s \ -+ + 'This is expected when using fftw3 on powerpc.') - - def test_impulse(self): - """ check the transform of one impulse at a random place """ -diff --git a/python/tests/test_phasevoc.py b/python/tests/test_phasevoc.py -index 23cbad5..957d3b1 100755 ---- a/python/tests/test_phasevoc.py -+++ b/python/tests/test_phasevoc.py -@@ -46,7 +46,14 @@ class aubio_pvoc_test_case(TestCase): - r = f.rdo(s) - assert_equal ( t, 0.) - assert_equal ( s.norm, 0.) -- assert_equal ( s.phas, 0.) -+ try: -+ assert_equal ( s.phas, 0 ) -+ except AssertionError: -+ assert_equal (s.phas[s.phas > 0], +np.pi) -+ assert_equal (s.phas[s.phas < 0], -np.pi) -+ assert_equal (np.abs(s.phas[np.abs(s.phas) != np.pi]), 0) -+ self.skipTest('pvoc(fvec(%d)).phas != +0, ' % win_s \ -+ + 'This is expected when using fftw3 on powerpc.') - assert_equal ( r, 0.) - - @params( diff --git a/debian/patches/sort_pysrc.diff b/debian/patches/sort_pysrc.diff deleted file mode 100644 index ac28e3f..0000000 --- a/debian/patches/sort_pysrc.diff +++ /dev/null @@ -1,44 +0,0 @@ ---- a/python/lib/moresetuptools.py -+++ b/python/lib/moresetuptools.py -@@ -58,8 +58,8 @@ - # create an empty header, macros will be passed on the command line - fake_config_header = os.path.join('python', 'ext', 'config.h') - distutils.file_util.write_file(fake_config_header, "") -- aubio_sources = glob.glob(os.path.join('src', '**.c')) -- aubio_sources += glob.glob(os.path.join('src', '*', '**.c')) -+ aubio_sources = sorted(glob.glob(os.path.join('src', '**.c'))) -+ aubio_sources += sorted(glob.glob(os.path.join('src', '*', '**.c'))) - ext.sources += aubio_sources - # define macros (waf puts them in build/src/config.h) - for define_macro in ['HAVE_STDLIB_H', 'HAVE_STDIO_H', ---- a/setup.py -+++ b/setup.py -@@ -37,7 +37,7 @@ - if sys.platform.startswith('darwin'): - 
extra_link_args += ['-framework','CoreFoundation', '-framework','AudioToolbox'] - --sources = glob.glob(os.path.join('python', 'ext', '*.c')) -+sources = sorted(glob.glob(os.path.join('python', 'ext', '*.c'))) - - aubio_extension = Extension("aubio._aubio", - sources, ---- a/python/lib/gen_external.py -+++ b/python/lib/gen_external.py -@@ -123,7 +123,7 @@ - - def generate_external(header=header, output_path=output_path, usedouble=False, overwrite=True): - if not os.path.isdir(output_path): os.mkdir(output_path) -- elif not overwrite: return glob.glob(os.path.join(output_path, '*.c')) -+ elif not overwrite: return sorted(glob.glob(os.path.join(output_path, '*.c'))) - sources_list = [] - cpp_output, cpp_objects = get_cpp_objects(header) - lib = {} -@@ -241,7 +241,7 @@ - print ("wrote %s" % output_file ) - # no need to add header to list of sources - -- return sources_list -+ return sorted(sources_list) - - if __name__ == '__main__': - if len(sys.argv) > 1: header = sys.argv[1] diff --git a/debian/patches/waf_cstlib_inst-8698499.diff b/debian/patches/waf_cstlib_inst-8698499.diff deleted file mode 100644 index 7738f2d..0000000 --- a/debian/patches/waf_cstlib_inst-8698499.diff +++ /dev/null @@ -1,25 +0,0 @@ -commit 8698499e0619b3c2cd0dcf7c880f3bec64bbb876 -Author: Paul Brossier <piem@piem.org> -Date: Sat Nov 26 14:52:37 2016 +0100 - - src/wscript_build: also install static library - - See this post from waf author: - https://groups.google.com/forum/#!msg/waf-users/GBHPrmO_lDg/34VWYEaks40J - -diff --git a/src/wscript_build b/src/wscript_build -index f2bd2ba..c55d5f2 100644 ---- a/src/wscript_build -+++ b/src/wscript_build -@@ -29,6 +29,11 @@ elif ctx.env['DEST_OS'] in ['emscripten']: - else: #linux, darwin, android, mingw, ... 
- build_features = ['cstlib', 'cshlib'] - -+# also install static lib -+from waflib.Tools.c import cstlib -+from waflib.Tools.fc import fcstlib -+fcstlib.inst_to = cstlib.inst_to = '${LIBDIR}' -+ - for target in build_features: - ctx(features = 'c ' + target, - use = uselib + ['lib_objects'], diff --git a/debian/patches/wscript_py3.patch b/debian/patches/wscript_py3.patch new file mode 100644 index 0000000..7dbb9a4 --- /dev/null +++ b/debian/patches/wscript_py3.patch @@ -0,0 +1,26 @@ +Description: use current interpreter to create test sound files + This patch ensures the current interpreter is used to run create_tests_source. +Author: Paul Brossier <piem@debian.org> +Forwarded: not-needed +Last-Update: 2020-01-02 + +Index: aubio/tests/wscript_build +=================================================================== +--- aubio.orig/tests/wscript_build ++++ aubio/tests/wscript_build +@@ -1,5 +1,6 @@ + # vim:set syntax=python: + ++import sys + import os.path + + uselib = ['aubio'] +@@ -13,7 +14,7 @@ test_sound_abspath = bld.path.get_bld(). 
+ test_sound_abspath = str(test_sound_abspath).replace('\\', '\\\\') + + b = bld(name='create_tests_source', +- rule='python ${SRC} ${TGT}', ++ rule=sys.executable + ' ${SRC} ${TGT}', + source='create_tests_source.py', + target=test_sound_target) + # use post() to create the task, keep a reference to it diff --git a/debian/rules b/debian/rules index 4f2ff4d..ce08d2d 100755 --- a/debian/rules +++ b/debian/rules @@ -2,39 +2,42 @@ # -*- makefile -*- #export DH_VERBOSE=1 +export DEB_BUILD_MAINT_OPTIONS = hardening=+all DEB_HOST_MULTIARCH ?= $(shell dpkg-architecture -qDEB_HOST_MULTIARCH) -# set environment for waf -export LINKFLAGS=-Wl,--as-needed +LDFLAGS += -Wl,--as-needed + WAF_OPTIONS = --verbose --destdir=debian/tmp --prefix=/usr --enable-fftw3f WAF_OPTIONS += --libdir=/usr/lib/$(DEB_HOST_MULTIARCH) -WAF_CMD = ./waf +WAF_CMD = python3 ./waf export PYBUILD_NAME=aubio export PYBUILD_AFTER_INSTALL_python2=rm -vrf '{destdir}/usr/bin' -export PYBUILD_AFTER_INSTALL_python3=mv '{destdir}/usr/bin/aubiocut' '{dir}/debian/tmp/usr/bin' - -export PYBUILD_BEFORE_TEST=make create_test_sounds; cp -prv '{dir}/python/tests' '{build_dir}' -export PYBUILD_AFTER_TEST=rm -rf '{build_dir}/tests'; rm -rf '{dir}/python/tests/sounds' -export PYBUILD_TEST_ARGS_python2=cd '{build_dir}'; nose2-2.7 --verbose -export PYBUILD_TEST_ARGS_python3=cd '{build_dir}'; python{version} `which nose2-3` --verbose +export PYBUILD_AFTER_INSTALL_python3=mv '{destdir}/usr/bin/aubio' '{destdir}/usr/bin/aubiocut' '{dir}/debian/tmp/usr/bin' +export PYBUILD_BEFORE_TEST=make create_test_sounds +export PYBUILD_AFTER_TEST=rm -rf '{dir}/python/tests/sounds' %: - dh $@ --with python2,python3 --buildsystem=pybuild + dh $@ --with python3 --buildsystem=pybuild override_dh_auto_clean: dh_auto_clean --buildsystem=pybuild -$(WAF_CMD) distclean - rm -rf doc/web/ rm -rf python/ext/config.h + rm -rf aubio.egg-info/ + rm -f this_version.pyc + rm -f waf_gensyms.pyc + rm -rf python/tests/sounds + rm -rf .cache/ .pytest_cache/ + 
rm -rf waflib/*/__pycache__/ + rm -rf ./__pycache__/ # extra rules to remove files manually #-find waf -name '*.pyc' -delete #rm -rf .waf* .lock-waf* #rm -rf build/ dist/ .waf* .lock-waf* #rm -rf python/gen/ aubio.egg-info #rm -rf python/lib/*.pyc python/lib/aubio/*.so - #rm -rf python/tests/sounds #rm -rf debian/tmp #rm -rf manpages.refs manpages.links @@ -47,9 +50,8 @@ override_dh_auto_build: override_dh_auto_test: # run tests - PYBUILD_SYSTEM=custom \ LD_LIBRARY_PATH=$(CURDIR)/build/src:$(LD_LIBRARY_PATH) \ - dh_auto_test --buildsystem=pybuild + dh_auto_test -- --test-pytest --test-args $(CURDIR)/python/tests/ override_dh_auto_install: # library diff --git a/debian/tests/control b/debian/tests/control new file mode 100644 index 0000000..32ad067 --- /dev/null +++ b/debian/tests/control @@ -0,0 +1,3 @@ +Tests: py3xtests +Depends: @, sox, python3-pytest +Restrictions: allow-stderr diff --git a/debian/tests/py2xtests b/debian/tests/py2xtests new file mode 100644 index 0000000..ea20ba6 --- /dev/null +++ b/debian/tests/py2xtests @@ -0,0 +1,2 @@ +make create_test_sounds +pytest --verbose diff --git a/debian/tests/py3xtests b/debian/tests/py3xtests new file mode 100644 index 0000000..c051a67 --- /dev/null +++ b/debian/tests/py3xtests @@ -0,0 +1,2 @@ +make create_test_sounds +pytest-3 --verbose diff --git a/debian/upstream/metadata b/debian/upstream/metadata new file mode 100644 index 0000000..c5009cc --- /dev/null +++ b/debian/upstream/metadata @@ -0,0 +1,12 @@ +Name: aubio +Documentation: https://aubio.org/doc +Repository: https://git.aubio.org/aubio/aubio/ +Repository-Browse: https://git.aubio.org/ +Bug-Submit: https://aubio.org/development +Reference: +- Author: Paul Brossier + Title: Automatic Annotation of Musical Audio for Interactive Applications + Type: phdthesis + Year: 2006 + URL: https://aubio.org/phd/ + Eprint: https://aubio.org/phd/thesis/brossier06thesis.pdf diff --git a/debian/upstream/signing-key.asc b/debian/upstream/signing-key.asc new file mode 100644 
index 0000000..62f971a --- /dev/null +++ b/debian/upstream/signing-key.asc @@ -0,0 +1,114 @@ +-----BEGIN PGP PUBLIC KEY BLOCK----- + +mQINBFM0j78BEACwNXTxOf2SArnsxHrJMhadgYRbFPrhaEa0oQCGs2aKFM55esi6 +Vm+UU/r3ILDFerNRCIl3+Ww7X9NiXx5mSE1HL+sStHu/uQ+ujdL0w0WfkL6QxFO1 ++A1UMpp4zQ9GOinwEkrAVHAwbAqw7spmQ3sUoVuou4xXNQXELyxNdL2ccQ0RU0RX +KSahKtqAh7Vxel28wEAOA//MFWy/Q8OrlKu/hat+gAAxhPcLO4pDrHkop/QBs4Cc +pjh8CHIIh2rQgLZTbuqttQwOd3KM8+J/zOzLrPIebSsClr+/XLS+4Z+OGMLM1GB6 +QfQGnJNQt0hviAQ3k7wwRLnkL2uypwWXAolspGwR7I0WdiZACAfJy/v+akw6PEnk +cReBYK8pmHJhAoaU+RU60DkoY8iZsUgWMqXRPvV2w8Re1Crcp8i5eqJGOEEISzAp +GTW+huhMr835Oe8bHbBk3CfMkEVXZsu1ohd0pLZljipSmyLBuYIkdP2XR0zz1GXh +pt5Uk/YNhWYRd3pmrOcdnCgH2b5VR71smcELCf9QXaOVqVwy9jW2hnRWytMD5H9s +Zkv2OQnz8rkci0JdUEfBS5hmgpiCgn8Nx1NZ105JBW58mXw9eysmME31fyxifiV7 +JhAoZfoYHb8bbcwuzKrx4LQLvDj4cxDHa7qPAv02nyV95/ul4SuZJnMBRQARAQAB +tB1QYXVsIEJyb3NzaWVyIDxwaWVtQHBpZW0ub3JnPokCVwQTAQoAQQIbAwULCQgH +AwUVCgkICwUWAwIBAAIeAQIXgAIZARYhBLiKUHLUkVrs+BokNGpJsZcoq92SBQJc +inAHBQkNGEdIAAoJEGpJsZcoq92S1s8P/jTu3OllJZDQmjOqaAOVrrhJRxaX6Lzo +ksyZ7c3H21Uk1KiShz9Y84sskDsWdZENOy0v3zrzEzkdQKSu1CKO0E7jcIZYwuUw +l3hfTOKXXR5r0kfQuHbqMpbLJ7lmI2oWNAiUAFlcx/ghoCMN6claiIp24gyD1gyj +JWOnTknuLnvEthVH5gqDrp/vf+OOpNU2Jp/kDx3tV0CcYGmSYd5FoPZJjtMRLqsT +TPXGvTYJf0UfqBFB+IK92ChwACbW77a2OuRY0b3GvZy2fnL+FC8b2geAQjNQnmUh +PxQuf9a+GBf+7YZd9dyOPX9gZOwVJQ49ONzkuSxYDYShgkcMbNWddhINglSKN2I0 +1RaC8xz4CSBVXkMaOFmkBZ4Q22iI+iXfVldC3aaGWndVtDYhwKWTeZUkIiszaMcr +r8tjiSURCKfCN42zK9C9ng2G6iHcJnSRl6LFRMEIve6iyCCwaSfEXAJAvkStwX4R +HyJJfVEJQrTKHGZU9vf9t8FQPf7NWRxQLTa1ERV6TVIMQh0Qxa5DvFzZbV3ES41q +xfRPvfNdQ+Jl++VG734juIzp6UM57oEQ95xkSxPb5TVPZ2OK7edMgSoz66uyQB9e +3FT8SdBViBFD0EOO2TnpLa3enFMa4nr9X/CWx/UUA72CYEyqCYlyqxwQ778Bj33u +TuFjoraBeA9htB9QYXVsIEJyb3NzaWVyIDxwaWVtQGFsdGVybi5vcmc+iQJUBBMB +CgA+AhsDBQsJCAcDBRUKCQgLBRYDAgEAAh4BAheAFiEEuIpQctSRWuz4GiQ0akmx +lyir3ZIFAlyKcAcFCQ0YR0gACgkQakmxlyir3ZLrWw//e/BK9rgTrGVlD5gbsOHG +rBHmYjQHFxeerjg9W5MAf+opgGtdqT2rVbBCghcaVuNHxLr3I+FqaKy9/llJy0Oi 
+lw8BF20/iHcuNIGTjMatiF1ios+gNPrz6BQTQySgMPy/6d/jjxxUmVOGn3guY3P6 +yJ93wGaopkElLo7DELCtXopF3w32pJbu1SUoA/c6mfuJVAHhhpRmBendJ2NvV0eb +M9bHb2m4xkO3Pqg/m02DltuefDL5onF/w8bTc+1bgBwcoO+LJSlYjN17FC4oBRMj +PuCCvHOT3GJFG2Zj21QcFw8othKIzenlo3cDJs/hd2OuM9Xw2OXfdTPfbMiQYCiw +Ndv4AYZTqUP0NEHbJD27PVoUUOf3MFBHco1JgaofpLmcOzMbqjw6n/gT/bTOVvjn +qef7q9c5ZLdew2Jz6pdQvDwReu19EiaKtKIQlrsC+ayN2Hmp4PVMPBMMX2rZj1kS +m9QT/Epuqp7jFuCSnYt7ttRQ8yZio1nUxAsiDYgfCi76hA36iTMZcLRK/pK9q6zW +nn666u1IcJ++sSrTxfJ3PcH67+Wg/wSnwhVYZcFQnpO5xbJDMfnfJarAOVDt0Zpg +XKvgCJBz5kzco3yN7awDv1N387grRJ3gbMZNYUCHnu66UzcZgjfVCcr2X+r8SYUo +RMESSydxSXwqacvDZVTSnFS0HlBhdWwgQnJvc3NpZXIgPHBpZW1AYXViaW8ub3Jn +PokCVAQTAQoAPgIbAwULCQgHAwUVCgkICwUWAwIBAAIeAQIXgBYhBLiKUHLUkVrs ++BokNGpJsZcoq92SBQJcinAHBQkNGEdIAAoJEGpJsZcoq92SPyEQAJC1jDZxeWAu +d+KsTOUz6fZpVufmYsxo9rbuZWnU5iiqQD1Aq9qa6LOaoix1+Bb5aa+rp84QckOZ +zkRs3jG7wb5rFTx1P+Ktfw3ZXJSXeo83aa0wW2qFVKNYj/TfuvvM6ZyfK2Ix5OpO +lGr07Kuc4fg9odLg/CipawKgOmCIU6NPLIFpBCXskmaoDm3tsz/v0mJ3eVbCaGx9 +Um5dOs+jFXWHygRBdQJ5GI1J9v4oySwPJphm8ZPbBycGUfnRrpl58YFQoNBBCHgA +SpV47ZXBzSRN4d2upkdDEhRUKJKF4NSa6E0lWJnr8Ibc5LnOKCf3Tetd4yNVI82S +gLob0N99fJ+rhmG07UYe5S6gX1tswPt41TosZ1AYSKDmA7wdifPqpicm+fznRTDj +4IQbpbeelWO8SdSUB6TgOXOzyA7vVdYGNajpufvXqHnI6V1Q++Fr7NzLAGSYvykE +KjUhpJr3JbwfpwZSpDEgOe69BljZSRqgPoyh5vc/3PeazB4rt/NQDcnm8o+D4389 +az+D/nFovZwOmQrrVhZ5EM5rzQmg10NqLJGKiHmtA4bjTRK7mLXJTUxuPm8J6Rw+ +SYxm1lQ7vf33gSeWjZDHdeowAnyBokH1eiA1pek7MwOHXveuRd80ZGncmquJ+7UA +ohE2jNT46A8XbX4o6NR/Hgt8K2jEqnlhtB9QYXVsIEJyb3NzaWVyIDxwaWVtQGRl +Ymlhbi5vcmc+iQJUBBMBCgA+AhsDBQsJCAcDBRUKCQgLBRYDAgEAAh4BAheAFiEE +uIpQctSRWuz4GiQ0akmxlyir3ZIFAlyKcAcFCQ0YR0gACgkQakmxlyir3ZLNbRAA +nqY6bxS3Q4mwn7MlrTRAZyifoAL4/1BVHHlVHw5x4QjzLGCdir9j6jw1eBn+amEv +HMYT1KFMziIAs4P20VKJFsRAjeLSj69oYKMBrI/MdaC3UqXhDC3/iJPWKuzcq7oi +qqezyGbvONBcm/wLg6h/mr1Ns5s+bbEZ/GUsaYTHzC1z7AvPT8LJhdYBo7EPPKOh +q+uCaPIdbHRn+dEjU6NMyEiPdCOcZ83+TsiqEIWe8KGbJsp5zcG2JEyMdue0fRMd +nGfOaiGCtGppxlZFL1Vfks0JFilojQq9KBREchvxp0JJLc0SDS9dfs3MNL7Erw+S 
+s7JeqfsGvf6bdKxl1pNzHOE7yA1DfTmtPDFis5+StLJYSNa+2qrgj7Q6hWuG8yvN +0T1ROqChDaElCGiDkKic0D4ozCZbqvXaYrLbwROxq9IiFrH1ARzQ5+VLv7wpTDW1 +X7WOeV5ZmOl9Kv2aP3HqlU5ZsCzwB0vKlXWeivWAMNbW9gHUdBD5W4+Q71NkJFID +iRCQDW8l6qIzmehSjqQVTwIl9TE0Aiu0ufwwWBJKG6dsozsT1TQLtNLnhY4+VHqE +18OdFh3iRX632+fp4vjFhK+pq/rmbyNBZWxTITaxmnapTVmAkDlherrq4B92QoYA +hYwNqXHL7NoiaYZ/hPwviYL+zSG0Riy63p9AdxPXrpi5Ag0EUzSPvwEQAKjvUxi8 +7soGPPgp3SpYkZOJg/UN+Knl6wVHSFMxfb071ed4khkY4ExfSIJTp3q6jlrNLVE5 +xoYgo112FLpml23wA/6mnKSKkTu5s3VMeuzHHEZMnWPtBHKBxo2yzNpjGv+4yRGc +MqykDmZ62BzlyeOpQOwMoowpPrpRvtiRB+GYrCLce60UYr3ve2d/bKHImLw8rZ0y +OI2mEGNInKUR0RQqPKtGWgwk5dcqJKKHIjzDTDRYg6UpEy20ARbczi3iV6PqbEPF +G3Xi7v7K/DBX4sjbbjg1ZlTpkUNfNBq3/XjkZ+OmlbaUDpK7GS7WamICJKfN2L14 +HYe/qZss8OAzyn/bSfMOO139X8CD9sA5KZLugfQ7D4vh9MO/8rjWrbcMwkBmPvgf +HK4RRWNLYTSckyTi+01imVQ7FktZJiw3OY4vaAgDM907DBWsGSyRc8pBRmYtjdq/ +vyADyxWwiU9Zlrnt+JF8LAejOEx9GmZDsrFAXiQezTfW20MSxfdq0w85SDa0yKc+ +NkItk75v0ahGV1mqG/KdYebrebLSqIDpKoxkPhTjERuMTQ35ciwj/3zjxbzD08CR +q5vMY14yoZt4ZHluuD/hSTlKTCuNxRkovv9PO/WNO62pBBY3BCqJlkF8hpMp6LLe +BNyqmq9nXLYSFOm1ag11zhFP2XGvZ82GQDX5ABEBAAGJAjwEGAEKACYCGwwWIQS4 +ilBy1JFa7PgaJDRqSbGXKKvdkgUCXIpwQwUJDRhHhAAKCRBqSbGXKKvdkqDvEACB +ainX9ZgGgU7U7afZAtOu6/7Y/fRFQgfvFL3dxvlJ8KR21IZQVMOtYc948Zs+m4wv +WmDIYOqlY+YF3vm2KalD0SOzUIHGQVMPQu/XB20ys9EUd7ZHfZgfnh7NCiEfgcAL +D00TeOqx9Sb44DhJi/Ptff/fVWEkUEpAv0ngnHnmpJOqHW9Sf8rMkKT3h9OhNhRA +8tDLiSwwVRWvhaiJMNuFF13FrYvFmJJQd6D24h8a8U6SRlUYslzG8a8VS8+fdpO1 +k4o0FT0ugkzqA/F1ioR+5S/0uivF1/r/tX60isG2KSfHN/4bJxCeJ5x7WonEDHee +HF83ktCcbIcbcyZs9oBdm5NRYuf2sdp9kC8W/lT4qBVA/xhcIPvM/fVjhNfyGWUu +mM2OIPKsbTKmeO0auOZLzIsSXApmqIv40x0eXaynoJTev3tXB1Mso0bL1IvHb7GA +q2FbTtPiajMtmaL3O0T+6ytKt3R0jVRHtZMNaXCucNFHe9WZoS57o/BdqdDGhm6m +e6DFgUYT6tdICvLFTnW55xYG+Ayc42yEl9aRJebRRSK0k8rIPs+g1oSA+Jn441RA +QF60Ux9foWqGzcn4SVNi5aL4K50a7U5+eLLMbd9eZjqeoZPO7epOPJPknWaGSbcK +f1SOwCLTKoV0OTe3Dy0wESrae2G6Kxv2km017XvnqLkCDQRTNJGnARAAtQSn8zJf +Zv94DyxRrNz1YCuoyM5WoUjKp2J7UX8Xj3Ll56YAcjSvCKq/bFxQF6e1+wR1XVzO 
+DqYHfTaDugAV/+Y25KYTimrEqNjaPir2rkDJId5yThkhkSfjt6QwEouMDCRocC8G +roYNsV+SWL/K2wUK47mqmSw5XzV4n9BsYlU8vMVxJqaI2lV5EQGplLwPFFo1MTpK +V4buJZ8wiDU3BvBmyqgjEd3dbVoRQ12svzb7S+icTUYmmRPFPRkUHz3kes4C9Nmo +ANY45FxspYyMPrWF43689PJER9TVFaBynJYumK9We0RHPAV5Dt3XMNkxy8uu2b2p +OEpYM8mzz/2MEFfbodmjyEAU65X1xcVqVaJxPFD+zCvUt0SBQ7kJHGE8HAAyprYZ +DyAX2xvh/iBpeNBYqJe2KMjp3Ha+do7+aQVvbSYx0INgCTCPMmCkxylEFAxOalGT +vvw7CUrhPo7q+ewN+REiBLRIBcR07OSBAsewlvz/QLNai5gKKzKVX3LMfpnWVSwe +8PxWAad3anXPfaFiETxgm/xlCgozk0YWkrbFXOpKBnFtjaZMKL+flBvzq5OQdFKj +mi0stRutS189SyH1mMu+yvAIq2PYV0VdLXtCVbNB11UsvOkzNxnVe2LpnjOLUJ/C +49tNsXlb/+BSfimoiNmsXGgIQt/L0slvdssAEQEAAYkCPAQYAQoAJgIbDBYhBLiK +UHLUkVrs+BokNGpJsZcoq92SBQJcinBDBQkNGEWcAAoJEGpJsZcoq92SN3oP/iXO +U4OxGyHQgSc6jrbHC3iZyU160I/VmFRPFysj0TpXOYUbiYk5Hq91N+oQi9XRabai +gGnHYS8+MURVfC7B2On4+O/NVogHIqTf799IX03VV7X13uEQR3Z410HAnLWaDN6S +B+guHJbXyyP9JdZlyi27rZnYH5zu4xruypyCtlxRnOgKS7VEtB/Xt+gxHvKngIvL +Asb0iy/utezmoE+QG1lCDgaGMSP4RcppbkGcNeHxTQQbLbp5xnwiqhLpg1/FM8Zz +HboRgauUngAQeqMzNsP8jB1qwamTk3QwdHGVtM2+oGYzCdhXsMLM7p1hX5kB8yE0 +WOryV9hIYTVfaEJqdNp7CqEJlQRQ0UuzGE+v2V46kcTb2frglGS3mWy6TCGQwryj +VQT89ccBbxS31XCjCSsUmCteoIqC5U6sV3f2/2ZTICyMOi56C/yXvzevGCZ/XeHp +SYCCWkUgjangzV+n0S7oOTjQGeDdWlRkcrZUK45cdZ+KHXwoGtcpDJBT8tRhNOBk +S8kCtyInWxBzth6Pcut9j09VzSFLhCK6vTe4J622/H90UyRQgHXOfBzUky9mOQLp +vlqXyXDbGbTeiVdml+w/LffuyTS+r0DBbY6yvIxCQFfn8kawjCBkVTJjQyFrRv2w +nO4uOZNZheesRhcLECYCaI3W4iB87cx63tL7kjrV +=A5x/ +-----END PGP PUBLIC KEY BLOCK----- diff --git a/debian/watch b/debian/watch index ef3628e..c85db65 100644 --- a/debian/watch +++ b/debian/watch @@ -1,2 +1,3 @@ version=2 -http://aubio.org/pub/ aubio-([0-9.]*)\.tar\.bz2 +opts="pgpsigurlmangle=s/$/.asc/" \ + https://aubio.org/pub/ aubio-([0-9.]*)\.tar\.bz2 diff --git a/doc/Makefile b/doc/Makefile index 8b0b81d..b93571d 100644 --- a/doc/Makefile +++ b/doc/Makefile @@ -39,9 +39,11 @@ help: @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: + -rm -rf _static -rm -rf 
$(BUILDDIR)/* html: + mkdir -p _static $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." diff --git a/doc/about.rst b/doc/about.rst new file mode 100644 index 0000000..65083f8 --- /dev/null +++ b/doc/about.rst @@ -0,0 +1,73 @@ +About +===== + +This library gathers a collection of music signal processing algorithms written +by several people. The documentation of each algorithm contains a brief +description and references to the corresponding papers. + +Credits +------- + +Many thanks to everyone who contributed to aubio, including: + + - Martin Hermant (`MartinHN <https://github.com/MartinHN>`_) + - Eduard Müller (`emuell <https://github.com/emuell>`_) + - Nils Philippsen (`nphilipp <https://github.com/nphilipp>`_) + - Tres Seaver (`tseaver <https://github.com/tseaver>`_) + - Dirkjan Rijnders (`dirkjankrijnders <https://github.com/dirkjankrijnders>`_) + - Jeffrey Kern (`anwserman <https://github.com/anwserman>`_) + - Sam Alexander (`sxalexander <https://github.com/sxalexander>`_) + +Special thanks to Juan Pablo Bello, Chris Duxbury, Samer Abdallah, Alain de +Cheveigne for their help. Also many thanks to Miguel Ramirez and Nicolas Wack +for their advice and help fixing bugs. + +Publications +------------ + +Substantial information about several of the algorithms and their evaluation +is gathered in: + + - Paul Brossier, `Automatic annotation of musical audio for interactive + systems <https://aubio.org/phd>`_, PhD thesis, Centre for Digital music, + Queen Mary University of London, London, UK, 2006. + +Additional results obtained with this software were discussed in the following +papers: + + - P. M. Brossier and J. P. Bello and M. D. Plumbley, `Real-time temporal + segmentation of note objects in music signals + <https://aubio.org/articles/brossier04fastnotes.pdf>`_, in *Proceedings of + the International Computer Music Conference*, 2004, Miami, Florida, ICMA + + - P. M.
Brossier and J. P. Bello and M. D. Plumbley, `Fast labelling of note + objects in music signals + <https://aubio.org/articles/brossier04fastnotes.pdf>`_, in *Proceedings of + the International Symposium on Music Information Retrieval*, 2004, + Barcelona, Spain + +Citation +-------- + +Please refer to the Zenodo link in the file README.md to cite this release. + +Copyright +--------- + +Copyright © 2003-2017 Paul Brossier <piem@aubio.org> + +License +------- + +aubio is `free <https://www.debian.org/intro/free>`_ and `open source +<http://www.opensource.org/docs/definition.php>`_ software; **you** can +redistribute it and/or modify it under the terms of the `GNU +<https://www.gnu.org/>`_ `General Public License +<https://www.gnu.org/licenses/gpl.html>`_ as published by the `Free Software +Foundation <https://fsf.org>`_, either version 3 of the License, or (at your +option) any later version. + +.. note:: + + aubio is not MIT or BSD licensed. Contact us if you need it in your + commercial product. diff --git a/doc/android.rst b/doc/android.rst new file mode 100644 index 0000000..732cfe5 --- /dev/null +++ b/doc/android.rst @@ -0,0 +1,9 @@ +.. _android: + +Android build +------------- + +To compile aubio for Android, you will need to get the `Android Native +Development Kit (NDK) <https://developer.android.com/ndk/>`_, prepare a +standalone toolchain, and tell waf to use the NDK toolchain. An example script +to complete these tasks is available in ``scripts/build_android``. diff --git a/doc/aubio.txt b/doc/aubio.txt new file mode 100644 index 0000000..cd1138f --- /dev/null +++ b/doc/aubio.txt @@ -0,0 +1,141 @@ +NAME + aubio - a command line tool to extract information from sound files + +SYNOPSIS + + aubio [-h] [-V] <command> ... + +COMMANDS + + The general syntax is "aubio <command> <soundfile> [options]".
The following + commands are available: + + onset get onset times + pitch extract fundamental frequency + beat get locations of beats + tempo get overall tempo in bpm + notes get midi-like notes + mfcc extract mel-frequency cepstrum coefficients + melbands extract mel-frequency energies per band + + For a list of available commands, use "aubio -h". For more info about each + command, use "aubio <command> --help". + +GENERAL OPTIONS + + These options can be used before any command has been specified. + + -h, --help show help message and exit + + -V, --version show version + +COMMON OPTIONS + + The following options can be used with all commands: + + <source_uri>, -i <source_uri>, --input <source_uri> input sound file to + analyse (required) + + -r <freq>, --samplerate <freq> samplerate at which the file should be + represented (default: 0, i.e. the samplerate of the input sound) + + -H <size>, --hopsize <size> overlap size, number of samples between two + consecutive analyses (default: 256) + + -B <size>, --bufsize <size> buffer size, number of samples used for each + analysis (e.g. FFT length, default: 512) + + -h, --help show help message and exit + + -T format, --time-format format select time values output format (samples, + ms, seconds) (default: seconds) + + -v, --verbose be verbose (increment verbosity by 1, default: 1) + + -q, --quiet be quiet (set verbosity to 0) + +ONSET + + The following additional options can be used with the "onset" subcommand. + + -m <method>, --method <method> onset novelty function + <default|energy|hfc|complex|phase|specdiff|kl|mkl|specflux> (default: + default) + + -t <threshold>, --threshold <threshold> threshold (default: unset) + + -s <value>, --silence <value> silence threshold, in dB (default: -70) + + -M <value>, --minioi <value> minimum Inter-Onset Interval (default: 12ms) + +PITCH + + The following additional options can be used with the "pitch" subcommand.
+ + -m <method>, --method <method> pitch detection method + <default|yinfft|yin|mcomb|fcomb|schmitt> (default: default, i.e. yinfft) + + -t <threshold>, --threshold <threshold> tolerance (default: unset) + + -s <value>, --silence <value> silence threshold, in dB (default: -70) + + The default buffer size for the pitch algorithm is 2048. The default hop size + is 256. + +BEAT + + The "beat" command accepts all common options and no additional options. + + The default buffer size for the beat algorithm is 1024. The default hop size + is 512. + +TEMPO + + The "tempo" command accepts all common options and no additional options. + + The default buffer size for the beat algorithm is 1024. The default hop size + is 512. + +NOTES + + The following additional options can be used with the "notes" subcommand. + + -s <value>, --silence <value> silence threshold, in dB (default: -70) + + -d <value>, --release-drop <value> release drop level, in dB. If the level + drops more than this amount since the last note started, the note will be + turned off (default: 10). + +MFCC + + The "mfcc" command accepts all common options and no additional options. + +MELBANDS + + The "melbands" command accepts all common options and no additional options. + +EXAMPLES + + Extract onsets using a minimum inter-onset interval of 30ms: + + aubio onset /path/to/input_file -M 30ms + + Extract pitch with method "mcomb" and a silence threshold of -90dB: + + aubio pitch /path/to/input_file -m mcomb -s -90.0 + + Extract MFCC using the standard Slaney implementation: + + aubio mfcc /path/to/input_file -r 44100 + + +SEE ALSO + + aubiocut(1) + +AUTHOR + + This manual page was written by Paul Brossier <piem@aubio.org>. Permission is + granted to copy, distribute and/or modify this document under the terms of + the GNU General Public License as published by the Free Software Foundation, + either version 3 of the License, or (at your option) any later version.
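The ``-T, --time-format`` option of the commands above selects whether timestamps are printed in samples, milliseconds, or seconds. A minimal sketch of the conversion this implies, assuming timestamps are internally positions in samples at a known samplerate (``format_time`` is a hypothetical helper, not aubio's actual code):

```python
def format_time(n_samples, samplerate, fmt="seconds"):
    """Convert a position in samples to the requested time format."""
    if fmt == "samples":
        return n_samples
    if fmt == "ms":
        return 1000.0 * n_samples / samplerate
    if fmt == "seconds":
        return n_samples / samplerate
    raise ValueError("unknown time format: %s" % fmt)

# 22050 samples at 44.1 kHz is half a second
print(format_time(22050, 44100, "seconds"))  # -> 0.5
print(format_time(22050, 44100, "ms"))       # -> 500.0
```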
diff --git a/doc/aubiocut.txt b/doc/aubiocut.txt index be19dad..c578133 100644 --- a/doc/aubiocut.txt +++ b/doc/aubiocut.txt @@ -34,9 +34,9 @@ OPTIONS -b, --beat Use beat locations instead of onset locations. -t, --onset-threshold thres Set the threshold value for the onset peak - picking. Typical values are typically within 0.001 and 0.900. Defaults to - 0.1. Lower threshold values imply more onsets detected. Try 0.5 in case of - over-detections. Defaults to 0.3. + picking. Values are typically in the range [0.001, 0.900]. Lower threshold + values imply more onsets detected. Increasing this threshold should reduce + the number of incorrect detections. Defaults to 0.3. -c, --cut Cut input sound file at detected labels. A new sound files for each slice will be created in the current directory. @@ -50,6 +50,8 @@ OPTIONS --cut-until-nslices n How many extra slices should be added at the end of each slice (default 0). + --create-first Always create first slice. + -h, --help Print a short help message and exit. -v, --verbose Be verbose. diff --git a/doc/aubiomfcc.txt b/doc/aubiomfcc.txt index afeafe3..7ef93f7 100644 --- a/doc/aubiomfcc.txt +++ b/doc/aubiomfcc.txt @@ -6,6 +6,7 @@ SYNOPSIS aubiomfcc source aubiomfcc [[-i] source] [-r rate] [-B win] [-H hop] + [-T time-format] [-v] [-h] DESCRIPTION @@ -14,7 +15,7 @@ DESCRIPTION MFCCs are coefficients that make up for the mel-frequency spectrum, a representation of the short-term power spectrum of a sound. By default, 13 - coefficents are computed using 40 filters. + coefficients are computed using 40 filters. When started with an input source (-i/--input), the coefficients are given on the console, prefixed by their timestamps in seconds. @@ -37,6 +38,9 @@ OPTIONS -H, --hopsize hop The number of samples between two consecutive analysis. Defaults to 256. + -T, --timeformat format Set time format (samples, ms, seconds). Defaults to + seconds. + -h, --help Print a short help message and exit. -v, --verbose Be verbose.
@@ -47,7 +51,7 @@ REFERENCES according to Malcolm Slaney's Auditory Toolbox, available at the following url: - http://cobweb.ecn.purdue.edu/~malcolm/interval/1998-010/ (see file mfcc.m) + https://engineering.purdue.edu/~malcolm/interval/1998-010/ (see file mfcc.m) SEE ALSO diff --git a/doc/aubionotes.txt b/doc/aubionotes.txt index 190fc72..3701056 100644 --- a/doc/aubionotes.txt +++ b/doc/aubionotes.txt @@ -6,8 +6,9 @@ SYNOPSIS aubionotes source aubionotes [[-i] source] [-r rate] [-B win] [-H hop] - [-O method] [-t thres] + [-O method] [-t thres] [-d drop] [-p method] [-u unit] [-l thres] + [-T time-format] [-s sil] [-j] [-v] [-h] @@ -49,6 +50,9 @@ OPTIONS 0.1. Lower threshold values imply more onsets detected. Try 0.5 in case of over-detections. Defaults to 0.3. + -M, --minioi value Set the minimum inter-onset interval, in seconds, the + shortest interval between two consecutive notes. Defaults to 0.030 + -p, --pitch method The pitch detection method to use. See PITCH METHODS below. Defaults to 'default'. @@ -64,6 +68,13 @@ OPTIONS will not be detected. A value of -20.0 would eliminate most onsets but the loudest ones. A value of -90.0 would select all onsets. Defaults to -90.0. + -d, --release-drop Set the release drop threshold, in dB. If the level drops + more than this amount since the last note started, the note will be turned + off. Defaults to 10. + + -T, --timeformat format Set time format (samples, ms, seconds). Defaults to + seconds. + -j, --jack Use Jack input/output. You will need a Jack connection controller to feed aubio some signal and listen to its output. @@ -80,7 +91,8 @@ ONSET METHODS PITCH METHODS - Available methods: default, schmitt, fcomb, mcomb, specacf, yin, yinfft. + Available methods: default, schmitt, fcomb, mcomb, specacf, yin, yinfft, + yinfast. See aubiopitch(1) for details about these methods. 
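The ``--release-drop`` behaviour documented in the aubionotes diff above (turn a note off once the level falls a given number of dB below the level at note start) can be sketched as follows. This is an illustrative reimplementation of the described rule, not aubio's source code:

```python
def note_off_due_to_release_drop(onset_level_db, current_level_db,
                                 release_drop_db=10.0):
    """True once the level has dropped more than release_drop_db (dB)
    below the level measured when the note started."""
    return (onset_level_db - current_level_db) > release_drop_db

# note started at -30 dB; a 5 dB drop keeps it on, a 15 dB drop ends it
assert not note_off_due_to_release_drop(-30.0, -35.0)
assert note_off_due_to_release_drop(-30.0, -45.0)
```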
diff --git a/doc/aubioonset.txt b/doc/aubioonset.txt index f9d3783..e3cb560 100644 --- a/doc/aubioonset.txt +++ b/doc/aubioonset.txt @@ -7,8 +7,10 @@ SYNOPSIS aubioonset [[-i] source] [-o sink] [-r rate] [-B win] [-H hop] [-O method] [-t thres] + [-T time-format] [-s sil] [-m] [-f] - [-j] [-v] [-h] + [-j] [-N miditap-note] [-V miditap-velo] + [-v] [-h] DESCRIPTION @@ -47,14 +49,20 @@ OPTIONS below. Defaults to 'default'. -t, --onset-threshold thres Set the threshold value for the onset peak - picking. Typical values are typically within 0.001 and 0.900. Defaults to - 0.1. Lower threshold values imply more onsets detected. Try 0.5 in case of - over-detections. Defaults to 0.3. + picking. Values are typically in the range [0.001, 0.900]. Lower threshold + values imply more onsets detected. Increasing this threshold should reduce + the number of incorrect detections. Defaults to 0.3. - -s, --silence sil Set the silence threshold, in dB, under which the pitch + -M, --minioi value Set the minimum inter-onset interval, in seconds, the + shortest interval between two consecutive onsets. Defaults to 0.020 + + -s, --silence sil Set the silence threshold, in dB, under which the onset will not be detected. A value of -20.0 would eliminate most onsets but the loudest ones. A value of -90.0 would select all onsets. Defaults to -90.0. + -T, --timeformat format Set time format (samples, ms, seconds). Defaults to + seconds. + -m, --mix-input Mix source signal to the output signal before writing to sink. @@ -63,6 +71,10 @@ OPTIONS -j, --jack Use Jack input/output. You will need a Jack connection controller to feed aubio some signal and listen to its output. + -N, --miditap-note Override note value for MIDI tap. Defaults to 69. + + -V, --miditap-velo Override velocity value for MIDI tap. Defaults to 65. + -h, --help Print a short help message and exit. -v, --verbose Be verbose.
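The effect of ``--onset-threshold`` described in the aubioonset diff above — lower values yield more detections — can be illustrated with a naive peak picker over an onset novelty curve. This is a simplified stand-in for aubio's actual peak-picking, for illustration only:

```python
def pick_peaks(novelty, threshold):
    """Return indices of local maxima above threshold in a novelty curve."""
    peaks = []
    for i in range(1, len(novelty) - 1):
        if (novelty[i] > threshold
                and novelty[i] >= novelty[i - 1]
                and novelty[i] > novelty[i + 1]):
            peaks.append(i)
    return peaks

novelty = [0.0, 0.2, 0.05, 0.6, 0.1, 0.35, 0.0]
print(pick_peaks(novelty, 0.3))  # higher threshold, fewer onsets: [3, 5]
print(pick_peaks(novelty, 0.1))  # lower threshold, more onsets: [1, 3, 5]
```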
diff --git a/doc/aubiopitch.txt b/doc/aubiopitch.txt index 1fc8205..a521ce1 100644 --- a/doc/aubiopitch.txt +++ b/doc/aubiopitch.txt @@ -7,6 +7,7 @@ SYNOPSIS aubiopitch [[-i] source] [-o sink] [-r rate] [-B win] [-H hop] [-p method] [-u unit] [-l thres] + [-T time-format] [-s sil] [-f] [-v] [-h] [-j] @@ -59,6 +60,9 @@ OPTIONS will not be detected. A value of -20.0 would eliminate most onsets but the loudest ones. A value of -90.0 would select all onsets. Defaults to -90.0. + -T, --timeformat format Set time format (samples, ms, seconds). Defaults to + seconds. + -m, --mix-input Mix source signal to the output signal before writing to sink. @@ -116,6 +120,12 @@ PITCH METHODS Chapter 3, Pitch Analysis, PhD thesis, Centre for Digital music, Queen Mary University of London, London, UK, 2006. + yinfast YIN algorithm (accelerated) + + An optimised implementation of the YIN algorithm, yielding results identical + to the original YIN algorithm, while reducing its computational cost from + O(n^2) to O(n log(n)). + SEE ALSO aubioonset(1), diff --git a/doc/aubioquiet.txt b/doc/aubioquiet.txt index eb11ae0..9428315 100644 --- a/doc/aubioquiet.txt +++ b/doc/aubioquiet.txt @@ -6,6 +6,7 @@ SYNOPSIS aubioquiet source aubioquiet [[-i] source] [-r rate] [-B win] [-H hop] + [-T time-format] [-s sil] [-v] [-h] @@ -38,6 +39,9 @@ OPTIONS -s, --silence sil Set the silence threshold, in dB, under which the pitch will not be detected. Defaults to -90.0. + -T, --timeformat format Set time format (samples, ms, seconds). Defaults to + seconds. + -h, --help Print a short help message and exit. -v, --verbose Be verbose. 
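The speed-up claimed for ``yinfast`` above comes from computing the YIN difference function with FFT-based correlation instead of a direct O(n^2) loop. A rough numpy sketch of that equivalence (illustrative only, not aubio's implementation): the difference d(tau) = sum_j (x[j] - x[j+tau])^2 expands into two energy terms, computed in O(n) with a cumulative sum, minus twice a cross term, computed in O(n log n) with FFTs.

```python
import numpy as np

def yin_diff_naive(x, max_tau):
    # d(tau) = sum_j (x[j] - x[j+tau])^2, direct loop, O(n * max_tau)
    w = len(x) - max_tau
    return np.array([np.sum((x[:w] - x[tau:tau + w]) ** 2)
                     for tau in range(max_tau)])

def yin_diff_fft(x, max_tau):
    # same quantity via FFT-based correlation, O(n log n)
    w = len(x) - max_tau
    c = np.concatenate(([0.0], np.cumsum(x ** 2)))
    energy = c[np.arange(max_tau) + w] - c[np.arange(max_tau)]  # E(tau)
    nfft = 1 << (len(x) + w - 1).bit_length()
    r = np.fft.irfft(np.fft.rfft(x, nfft) * np.fft.rfft(x[:w][::-1], nfft), nfft)
    cross = r[w - 1:w - 1 + max_tau]  # sum_j x[j] * x[j + tau]
    return energy[0] + energy - 2.0 * cross

# both paths agree on a 440 Hz sine frame
x = np.sin(2 * np.pi * 440.0 * np.arange(1024) / 44100.0)
assert np.allclose(yin_diff_naive(x, 256), yin_diff_fft(x, 256))
```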
diff --git a/doc/aubiotrack.txt b/doc/aubiotrack.txt index 753e97f..4b7aba4 100644 --- a/doc/aubiotrack.txt +++ b/doc/aubiotrack.txt @@ -6,8 +6,10 @@ SYNOPSIS aubiotrack source aubiotrack [[-i] source] [-o sink] [-r rate] [-B win] [-H hop] + [-T time-format] [-s sil] [-m] - [-j] [-v] [-h] + [-j] [-N miditap-note] [-V miditap-velo] + [-v] [-h] DESCRIPTION @@ -53,6 +55,13 @@ OPTIONS -j, --jack Use Jack input/output. You will need a Jack connection controller to feed aubio some signal and listen to its output. + -N, --miditap-note Override note value for MIDI tap. Defaults to 69. + + -V, --miditap-velo Override velocity value for MIDI tap. Defaults to 65. + + -T, --timeformat format Set time format (samples, ms, seconds). Defaults to + seconds. + -h, --help Print a short help message and exit. -v, --verbose Be verbose. @@ -68,7 +77,7 @@ BEAT TRACKING METHODS Matthew E. P. Davies, Paul Brossier, and Mark D. Plumbley. Beat tracking towards automatic musical accompaniment. In Proceedings of the Audio - Engeeniring Society 118th Convention, Barcelona, Spain, May 2005. + Engineering Society 118th Convention, Barcelona, Spain, May 2005. SEE ALSO diff --git a/doc/binaries.rst b/doc/binaries.rst new file mode 100644 index 0000000..28ae31e --- /dev/null +++ b/doc/binaries.rst @@ -0,0 +1,12 @@ +Pre-compiled binaries +--------------------- + +`Pre-compiled binaries <https://aubio.org/download>`_ +are available for +`macOS <https://aubio.org/download#osx>`_, +`iOS <https://aubio.org/download#ios>`_, +and +`Windows <https://aubio.org/download#win>`_. + +To use aubio in a macOS or iOS application, see :ref:`xcode-frameworks-label`. + diff --git a/doc/building.rst b/doc/building.rst new file mode 100644 index 0000000..3a11e4e --- /dev/null +++ b/doc/building.rst @@ -0,0 +1,129 @@ +.. highlight:: bash + +.. _building: + +Building aubio +============== + +.. note:: + To download a prebuilt version of aubio, see :ref:`download`.
+ +aubio uses `waf`_ to configure, compile, and test the source. +A copy of waf is included in the aubio tarball, so all you need is a terminal, +a compiler, and a recent version of python installed. + +.. note:: + Make sure you have installed all the :ref:`requirements` you need before building. + +Latest release +-------------- + +The **latest stable release** can be downloaded from https://aubio.org/download:: + + $ curl -O https://aubio.org/pub/aubio-<version>.tar.bz2 + $ tar xf aubio-<version>.tar.bz2 + $ cd aubio-<version>/ + +Git repository +-------------- + +The **latest git branch** can be obtained with:: + + $ git clone git://git.aubio.org/git/aubio + $ cd aubio/ + +The following command will fetch the correct `waf`_ version (not included in +aubio's git):: + + $ ./scripts/get_waf.sh + +.. note:: + + Windows users without `Git Bash`_ installed will want to use the following + commands instead: + + .. code:: bash + + $ curl -fsS -o waf https://waf.io/waf-1.8.22 + $ curl -fsS -o waf.bat https://raw.githubusercontent.com/waf-project/waf/master/utils/waf.bat + + +Compiling +--------- + +To compile the C library, example programs, and tests, run:: + + $ ./waf configure + +Check out the available options using ``./waf configure --help``. Once +you are done with configuration, you can start building:: + + $ ./waf build + +To install the freshly built C library and tools, simply run the following +command:: + + $ sudo ./waf install + +.. note:: + Windows users should simply run ``waf``, without the leading ``./``. For + instance: + + .. code:: bash + + $ waf configure build + + +Running as a user +----------------- + +To use aubio without actually installing it, for instance if you don't have root +access to install libaubio on your system, a few options are available. + +On Linux or macOS, sourcing the script ``scripts/setenv_local.sh`` should help:: + + $ source ./scripts/setenv_local.sh + +This script sets ``LD_LIBRARY_PATH`` for libaubio, and ``PYTHONPATH`` for the +python module.
+ +On Linux, you should be able to set ``LD_LIBRARY_PATH`` with:: + + $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PWD/build/src + +On Mac OS X, a copy or a symlink can be made in ``~/lib``:: + + $ mkdir -p ~/lib + $ ln -sf $PWD/build/src/libaubio*.dylib ~/lib/ + +Note that on Mac OS X systems older than El Capitan (10.11), the ``DYLD_LIBRARY_PATH`` +variable can be set as follows:: + + $ export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:$PWD/build/src + +Cleaning +-------- + +If you wish to uninstall the files installed by the ``install`` command, use +``uninstall``:: + + $ sudo ./waf uninstall + +To clean the source directory, use the ``clean`` command:: + + $ ./waf clean + +To also forget the options previously passed to the last ``./waf configure`` +invocation, use the ``distclean`` command:: + + $ ./waf distclean + +.. _waf: https://waf.io/ + +.. _Git Bash: https://git-for-windows.github.io/ + +.. _xcode-frameworks-label: + +.. include:: xcode_frameworks.rst + +.. include:: android.rst diff --git a/doc/cli.rst b/doc/cli.rst new file mode 100644 index 0000000..5df3b53 --- /dev/null +++ b/doc/cli.rst @@ -0,0 +1,73 @@ +.. _manpages: + +Command line tools +================== + +The python module comes with the following tools: + + - ``aubio`` estimates and extracts descriptors from sound files + - ``aubiocut`` slices sound files at onset or beat timestamps + +More command line tools are included along with the library. + + - ``aubioonset`` outputs the time stamp of detected note onsets + - ``aubiopitch`` attempts to identify a fundamental frequency, or pitch, for + each frame of the input sound + - ``aubiomfcc`` computes Mel-frequency Cepstrum Coefficients + - ``aubiotrack`` outputs the time stamp of detected beats + - ``aubionotes`` emits midi-like notes, with an onset, a pitch, and a duration + - ``aubioquiet`` extracts quiet and loud regions + + +``aubio`` +--------- + +.. literalinclude:: aubio.txt + :language: text + + +``aubiocut`` +-------------- + +..
literalinclude:: aubiocut.txt + :language: text + + +``aubioonset`` +-------------- + +.. literalinclude:: aubioonset.txt + :language: text + +``aubiopitch`` +-------------- + +.. literalinclude:: aubiopitch.txt + :language: text + +``aubiomfcc`` +-------------- + +.. literalinclude:: aubiomfcc.txt + :language: text + +``aubiotrack`` +-------------- + +.. literalinclude:: aubiotrack.txt + :language: text + +``aubionotes`` +-------------- + +.. literalinclude:: aubionotes.txt + :language: text + +``aubioquiet`` +-------------- + +.. literalinclude:: aubioquiet.txt + :language: text + + +.. include:: cli_features.rst diff --git a/doc/cli_features.rst b/doc/cli_features.rst new file mode 100644 index 0000000..770a027 --- /dev/null +++ b/doc/cli_features.rst @@ -0,0 +1,42 @@ +Command line features +--------------------- + ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| feat vs. prg | onset | pitch | mfcc | track | notes | quiet | cut1 | short options | ++==============+=======+=======+======+=======+=======+=======+======+==================+ +| input | Y | Y | Y | Y | Y | Y | Y | -i | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| output | Y | Y | N | Y | Y | N | Y!1 | -o,-m,-f | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| Hz/buf/hop | Y | Y | Y | Y | Y | Y!2 | Y | -r,-B,-H | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| jack | Y | Y | N | Y | Y | N!3 | N | -j | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| onset | Y | N | N | Y!8 | Y!6 | N | Y | -O,-t,-M | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| pitch | N | Y | N | N | Y!6 | N | N!5 | -p,-u,-l | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| silence | Y | Y | N | Y | Y!7 | Y | N!4 |
-s | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| timefmt | Y | Y | Y | Y | Y | Y | ! | -T | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| help | Y | Y | Y | Y | Y | Y | Y | -h | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ +| verbose | Y | Y | Y | Y | Y | Y | Y | -v | ++--------------+-------+-------+------+-------+-------+-------+------+------------------+ + +1. ``aubiocut --output`` is used to specify a directory, not a file. + +2. Option ``--bufsize`` is useless for ``aubioquiet`` + +3. ``aubioquiet`` could have a jack output + +4. Regression, re-add slicing at silences to ``aubiocut`` + +5. ``aubiocut`` could cut on notes + +6. ``aubionotes`` needs onset/pitch setters. + +7. Silence was different for pitch and onset, test. + +8. Some ``aubiotrack`` options should be disabled (minioi, threshold). diff --git a/doc/conf.py b/doc/conf.py index 48e5a4e..7d491b0 100644 --- a/doc/conf.py +++ b/doc/conf.py @@ -13,10 +13,14 @@ import sys, os +# get version using this_version.py +sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..')) +from this_version import get_aubio_version + # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. -sys.path.insert(0, os.path.abspath('../../python/build/lib.macosx-10.6-intel-2.7')) +#sys.path.insert(0, os.path.abspath('../../python/build/...')) # -- General configuration ----------------------------------------------------- @@ -25,7 +29,14 @@ sys.path.insert(0, os.path.abspath('../../python/build/lib.macosx-10.6-intel-2.7 # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. 
-extensions = ['sphinx.ext.viewcode', 'sphinx.ext.autodoc'] +extensions = ['sphinx.ext.viewcode', 'sphinx.ext.autodoc', + 'sphinx.ext.napoleon', 'sphinx.ext.intersphinx'] + +autodoc_member_order = 'groupwise' + +intersphinx_mapping = { + 'numpy': ('https://docs.scipy.org/doc/numpy/', None), + } # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] @@ -41,16 +52,17 @@ master_doc = 'index' # General information about the project. project = u'aubio' -copyright = u'2014, Paul Brossier' +copyright = u'2018, Paul Brossier' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. -version = '0.4' + +version = get_aubio_version()[:3] # The full version, including alpha/beta/rc tags. -release = 'latest' +release = get_aubio_version() # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. @@ -64,7 +76,17 @@ release = 'latest' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. -exclude_patterns = ['_build'] +exclude_patterns = ['_build', + 'statuslinks.rst', + 'download.rst', + 'binaries.rst', + 'debian_packages.rst', + 'building.rst', + 'android.rst', + 'xcode_frameworks.rst', + 'requirements.rst', + 'cli_features.rst', + ] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None @@ -91,7 +113,10 @@ modindex_common_prefix = ['aubio.'] # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. -html_theme = 'default' +#html_theme = 'agogo' +#html_theme = 'default' +#html_theme = 'haiku' +html_theme = 'pyramid' # Theme options are theme-specific and customize the look and feel of a theme # further. 
For a list of options available for each theme, see the @@ -120,7 +145,7 @@ html_theme = 'default' # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". -html_static_path = ['_static'] +html_static_path = [] #['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. @@ -150,7 +175,7 @@ html_static_path = ['_static'] #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. -#html_show_sphinx = True +html_show_sphinx = False # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True @@ -240,3 +265,6 @@ texinfo_documents = [ # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' + +def setup(app): + if release.endswith('~alpha'): app.tags.add('devel') diff --git a/doc/debian_packages.rst b/doc/debian_packages.rst new file mode 100644 index 0000000..9bc7ab7 --- /dev/null +++ b/doc/debian_packages.rst @@ -0,0 +1,16 @@ +Debian/Ubuntu packages +---------------------- + +For the latest Debian packages, see https://packages.debian.org/src:aubio. + +For the latest Ubuntu packages, see http://packages.ubuntu.com/src:aubio. + +For the latest version of the packages, see +https://anonscm.debian.org/cgit/collab-maint/aubio.git/. Use +``git-buildpackage`` to build from the git repository. For instance: + +.. code-block:: console + + $ git clone git://anonscm.debian.org/collab-maint/aubio.git + $ cd aubio + $ git buildpackage diff --git a/doc/develop.rst b/doc/develop.rst new file mode 100644 index 0000000..fa27db0 --- /dev/null +++ b/doc/develop.rst @@ -0,0 +1,154 @@ +.. _develop: + +Developing with aubio +===================== + +Here is a brief overview of the C library. 
+
+For a more detailed list of available functions, see the `API documentation
+<https://aubio.org/doc/latest/>`_.
+
+To report issues, ask questions, and request new features, use `Github Issues
+<https://github.com/aubio/aubio/issues>`_.
+
+Design Basics
+-------------
+
+The library is written in C and is optimised for speed and portability.
+
+All memory allocations take place in the `new_` methods. Each successful call
+to `new_` should have a matching call to `del_` to deallocate the object.
+
+.. code-block:: C
+
+  // new_ to create an object foobar
+  aubio_foobar_t * new_aubio_foobar(void * args);
+  // del_ to delete foobar
+  void del_aubio_foobar (aubio_foobar_t * foobar);
+
+The main computations are done in the `_do` methods.
+
+.. code-block:: C
+
+  // _do to process output = foobar(input)
+  aubio_foobar_do (aubio_foobar_t * foobar, fvec_t * input, cvec_t * output);
+
+Most parameters can be read and written at any time:
+
+.. code-block:: C
+
+  // _get_param to get foobar.param
+  smpl_t aubio_foobar_get_a_parameter (aubio_foobar_t * foobar);
+  // _set_param to set foobar.param
+  uint_t aubio_foobar_set_a_parameter (aubio_foobar_t * foobar, smpl_t a_parameter);
+
+In some cases, more functions are available:
+
+.. code-block:: C
+
+  // non-real time functions
+  uint_t aubio_foobar_reset(aubio_foobar_t * t);
+
+Basic Types
+-----------
+
+.. code-block:: C
+
+  // integers
+  uint_t n = 10; // unsigned
+  sint_t delay = -90; // signed
+
+  // floats
+  smpl_t a = -90.; // single precision
+  lsmp_t f = 0.024; // double precision
+
+  // vector of floats (single precision)
+  fvec_t * vec = new_fvec(n);
+  vec->data[0] = 1;
+  vec->data[vec->length-1] = 1.; // vec->data has n elements
+  fvec_print(vec);
+  del_fvec(vec);
+
+  // complex data
+  cvec_t * fftgrain = new_cvec(n);
+  fftgrain->norm[0] = 1.; // fftgrain->norm has n/2+1 elements
+  fftgrain->phas[n/2] = 3.1415; // fftgrain->phas as well
+  del_cvec(fftgrain);
+
+  // matrix
+  fmat_t * mat = new_fmat (height, length);
+  mat->data[height-1][0] = 1; // mat->data has height rows
+  mat->data[0][length-1] = 10; // mat->data[0] has length columns
+  del_fmat(mat);
+
+
+Reading a sound file
+--------------------
+
+In this example, `aubio_source <https://aubio.org/doc/latest/source_8h.html>`_
+is used to read a media file.
+
+First, define a few variables and allocate some memory.
+
+.. literalinclude:: ../tests/src/io/test-source.c
+   :language: C
+   :lines: 22-24, 30-32, 34
+
+.. note::
+   With ``samplerate = 0``, ``aubio_source`` will be created with the file's
+   original samplerate.
+
+Now for the processing loop:
+
+.. literalinclude:: ../tests/src/io/test-source.c
+   :language: C
+   :lines: 40-44
+
+At the end of the processing loop, memory is deallocated:
+
+.. literalinclude:: ../tests/src/io/test-source.c
+   :language: C
+   :lines: 55-56
+
+See the complete example: :download:`test-source.c
+<../tests/src/io/test-source.c>`.
+
+Computing a spectrum
+--------------------
+
+Now let's create a phase vocoder:
+
+.. literalinclude:: ../tests/src/spectral/test-phasevoc.c
+   :language: C
+   :lines: 6-11
+
+The processing loop could now look like:
+
+.. literalinclude:: ../tests/src/spectral/test-phasevoc.c
+   :language: C
+   :lines: 20-37
+
+Time to clean up the previously allocated memory:
+
+..
literalinclude:: ../tests/src/spectral/test-phasevoc.c + :language: C + :lines: 39-44 + +See the complete example: :download:`test-phasevoc.c +<../tests/src/spectral/test-phasevoc.c>`. + +.. _doxygen-documentation: + +Doxygen documentation +--------------------- + +The latest version of the API documentation is built using `Doxygen +<http://www.doxygen.org/>`_ and is available at: + + https://aubio.org/doc/latest/ + +Contribute +---------- + +Please report any issue and feature request at the `Github issue tracker +<https://github.com/aubio/aubio/issues>`_. Patches and pull-requests welcome! diff --git a/doc/download.rst b/doc/download.rst new file mode 100644 index 0000000..ceb3377 --- /dev/null +++ b/doc/download.rst @@ -0,0 +1,16 @@ +.. _download: + +Downloading aubio +================= + +A number of distributions already include aubio. Check your favorite package +management system, or have a look at the `aubio download page +<https://aubio.org/download>`_ for more options. + +To use aubio in an android project, see :ref:`android`. + +To compile aubio from source, read :ref:`building`. + +.. include:: binaries.rst + +.. include:: debian_packages.rst diff --git a/doc/full.cfg b/doc/full.cfg deleted file mode 100644 index ce341b3..0000000 --- a/doc/full.cfg +++ /dev/null @@ -1,2354 +0,0 @@ -# Doxyfile 1.8.8 - -# This file describes the settings to be used by the documentation system -# doxygen (www.doxygen.org) for a project. -# -# All text after a double hash (##) is considered a comment and is placed in -# front of the TAG it is preceding. -# -# All text after a single hash (#) is considered a comment and will be ignored. -# The format is: -# TAG = value [value, ...] -# For lists, items can also be appended using: -# TAG += value [value, ...] -# Values that contain spaces should be placed between quotes (\" \"). 
- -#--------------------------------------------------------------------------- -# Project related configuration options -#--------------------------------------------------------------------------- - -# This tag specifies the encoding used for all characters in the config file -# that follow. The default is UTF-8 which is also the encoding used for all text -# before the first occurrence of this tag. Doxygen uses libiconv (or the iconv -# built into libc) for the transcoding. See http://www.gnu.org/software/libiconv -# for the list of possible encodings. -# The default value is: UTF-8. - -DOXYFILE_ENCODING = UTF-8 - -# The PROJECT_NAME tag is a single word (or a sequence of words surrounded by -# double-quotes, unless you are using Doxywizard) that should identify the -# project for which the documentation is generated. This name is used in the -# title of most generated pages and in a few other places. -# The default value is: My Project. - -PROJECT_NAME = aubio - -# The PROJECT_NUMBER tag can be used to enter a project or revision number. This -# could be handy for archiving the generated documentation or if some version -# control system is used. - -PROJECT_NUMBER = "0.4.2~alpha full" - -# Using the PROJECT_BRIEF tag one can provide an optional one line description -# for a project that appears at the top of each page and should give viewer a -# quick idea about the purpose of the project. Keep the description short. - -PROJECT_BRIEF = - -# With the PROJECT_LOGO tag one can specify an logo or icon that is included in -# the documentation. The maximum height of the logo should not exceed 55 pixels -# and the maximum width should not exceed 200 pixels. Doxygen will copy the logo -# to the output directory. - -PROJECT_LOGO = - -# The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute) path -# into which the generated documentation will be written. If a relative path is -# entered, it will be relative to the location where doxygen was started. 
If -# left blank the current directory will be used. - -OUTPUT_DIRECTORY = full - -# If the CREATE_SUBDIRS tag is set to YES, then doxygen will create 4096 sub- -# directories (in 2 levels) under the output directory of each output format and -# will distribute the generated files over these directories. Enabling this -# option can be useful when feeding doxygen a huge amount of source files, where -# putting all generated files in the same directory would otherwise causes -# performance problems for the file system. -# The default value is: NO. - -CREATE_SUBDIRS = NO - -# If the ALLOW_UNICODE_NAMES tag is set to YES, doxygen will allow non-ASCII -# characters to appear in the names of generated files. If set to NO, non-ASCII -# characters will be escaped, for example _xE3_x81_x84 will be used for Unicode -# U+3044. -# The default value is: NO. - -ALLOW_UNICODE_NAMES = NO - -# The OUTPUT_LANGUAGE tag is used to specify the language in which all -# documentation generated by doxygen is written. Doxygen will use this -# information to generate all constant output in the proper language. -# Possible values are: Afrikaans, Arabic, Armenian, Brazilian, Catalan, Chinese, -# Chinese-Traditional, Croatian, Czech, Danish, Dutch, English (United States), -# Esperanto, Farsi (Persian), Finnish, French, German, Greek, Hungarian, -# Indonesian, Italian, Japanese, Japanese-en (Japanese with English messages), -# Korean, Korean-en (Korean with English messages), Latvian, Lithuanian, -# Macedonian, Norwegian, Persian (Farsi), Polish, Portuguese, Romanian, Russian, -# Serbian, Serbian-Cyrillic, Slovak, Slovene, Spanish, Swedish, Turkish, -# Ukrainian and Vietnamese. -# The default value is: English. - -OUTPUT_LANGUAGE = English - -# If the BRIEF_MEMBER_DESC tag is set to YES doxygen will include brief member -# descriptions after the members that are listed in the file and class -# documentation (similar to Javadoc). Set to NO to disable this. -# The default value is: YES. 
- -BRIEF_MEMBER_DESC = YES - -# If the REPEAT_BRIEF tag is set to YES doxygen will prepend the brief -# description of a member or function before the detailed description -# -# Note: If both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the -# brief descriptions will be completely suppressed. -# The default value is: YES. - -REPEAT_BRIEF = YES - -# This tag implements a quasi-intelligent brief description abbreviator that is -# used to form the text in various listings. Each string in this list, if found -# as the leading text of the brief description, will be stripped from the text -# and the result, after processing the whole list, is used as the annotated -# text. Otherwise, the brief description is used as-is. If left blank, the -# following values are used ($name is automatically replaced with the name of -# the entity):The $name class, The $name widget, The $name file, is, provides, -# specifies, contains, represents, a, an and the. - -ABBREVIATE_BRIEF = - -# If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then -# doxygen will generate a detailed section even if there is only a brief -# description. -# The default value is: NO. - -ALWAYS_DETAILED_SEC = NO - -# If the INLINE_INHERITED_MEMB tag is set to YES, doxygen will show all -# inherited members of a class in the documentation of that class as if those -# members were ordinary class members. Constructors, destructors and assignment -# operators of the base classes will not be shown. -# The default value is: NO. - -INLINE_INHERITED_MEMB = NO - -# If the FULL_PATH_NAMES tag is set to YES doxygen will prepend the full path -# before files name in the file list and in the header files. If set to NO the -# shortest path that makes the file name unique will be used -# The default value is: YES. - -FULL_PATH_NAMES = YES - -# The STRIP_FROM_PATH tag can be used to strip a user-defined part of the path. 
-# Stripping is only done if one of the specified strings matches the left-hand -# part of the path. The tag can be used to show relative paths in the file list. -# If left blank the directory from which doxygen is run is used as the path to -# strip. -# -# Note that you can specify absolute paths here, but also relative paths, which -# will be relative from the directory where doxygen is started. -# This tag requires that the tag FULL_PATH_NAMES is set to YES. - -STRIP_FROM_PATH = ../src - -# The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of the -# path mentioned in the documentation of a class, which tells the reader which -# header file to include in order to use a class. If left blank only the name of -# the header file containing the class definition is used. Otherwise one should -# specify the list of include paths that are normally passed to the compiler -# using the -I flag. - -STRIP_FROM_INC_PATH = - -# If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter (but -# less readable) file names. This can be useful is your file systems doesn't -# support long names like on DOS, Mac, or CD-ROM. -# The default value is: NO. - -SHORT_NAMES = NO - -# If the JAVADOC_AUTOBRIEF tag is set to YES then doxygen will interpret the -# first line (until the first dot) of a Javadoc-style comment as the brief -# description. If set to NO, the Javadoc-style will behave just like regular Qt- -# style comments (thus requiring an explicit @brief command for a brief -# description.) -# The default value is: NO. - -JAVADOC_AUTOBRIEF = YES - -# If the QT_AUTOBRIEF tag is set to YES then doxygen will interpret the first -# line (until the first dot) of a Qt-style comment as the brief description. If -# set to NO, the Qt-style will behave just like regular Qt-style comments (thus -# requiring an explicit \brief command for a brief description.) -# The default value is: NO. 
- -QT_AUTOBRIEF = NO - -# The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make doxygen treat a -# multi-line C++ special comment block (i.e. a block of //! or /// comments) as -# a brief description. This used to be the default behavior. The new default is -# to treat a multi-line C++ comment block as a detailed description. Set this -# tag to YES if you prefer the old behavior instead. -# -# Note that setting this tag to YES also means that rational rose comments are -# not recognized any more. -# The default value is: NO. - -MULTILINE_CPP_IS_BRIEF = NO - -# If the INHERIT_DOCS tag is set to YES then an undocumented member inherits the -# documentation from any documented member that it re-implements. -# The default value is: YES. - -INHERIT_DOCS = YES - -# If the SEPARATE_MEMBER_PAGES tag is set to YES, then doxygen will produce a -# new page for each member. If set to NO, the documentation of a member will be -# part of the file/class/namespace that contains it. -# The default value is: NO. - -SEPARATE_MEMBER_PAGES = NO - -# The TAB_SIZE tag can be used to set the number of spaces in a tab. Doxygen -# uses this value to replace tabs by spaces in code fragments. -# Minimum value: 1, maximum value: 16, default value: 4. - -TAB_SIZE = 4 - -# This tag can be used to specify a number of aliases that act as commands in -# the documentation. An alias has the form: -# name=value -# For example adding -# "sideeffect=@par Side Effects:\n" -# will allow you to put the command \sideeffect (or @sideeffect) in the -# documentation, which will result in a user-defined paragraph with heading -# "Side Effects:". You can put \n's in the value part of an alias to insert -# newlines. - -ALIASES = - -# This tag can be used to specify a number of word-keyword mappings (TCL only). -# A mapping has the form "name=value". For example adding "class=itcl::class" -# will allow you to use the command class in the itcl::class meaning. 
- -TCL_SUBST = - -# Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C sources -# only. Doxygen will then generate output that is more tailored for C. For -# instance, some of the names that are used will be different. The list of all -# members will be omitted, etc. -# The default value is: NO. - -OPTIMIZE_OUTPUT_FOR_C = YES - -# Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java or -# Python sources only. Doxygen will then generate output that is more tailored -# for that language. For instance, namespaces will be presented as packages, -# qualified scopes will look different, etc. -# The default value is: NO. - -OPTIMIZE_OUTPUT_JAVA = NO - -# Set the OPTIMIZE_FOR_FORTRAN tag to YES if your project consists of Fortran -# sources. Doxygen will then generate output that is tailored for Fortran. -# The default value is: NO. - -OPTIMIZE_FOR_FORTRAN = NO - -# Set the OPTIMIZE_OUTPUT_VHDL tag to YES if your project consists of VHDL -# sources. Doxygen will then generate output that is tailored for VHDL. -# The default value is: NO. - -OPTIMIZE_OUTPUT_VHDL = NO - -# Doxygen selects the parser to use depending on the extension of the files it -# parses. With this tag you can assign which parser to use for a given -# extension. Doxygen has a built-in mapping, but you can override or extend it -# using this tag. The format is ext=language, where ext is a file extension, and -# language is one of the parsers supported by doxygen: IDL, Java, Javascript, -# C#, C, C++, D, PHP, Objective-C, Python, Fortran (fixed format Fortran: -# FortranFixed, free formatted Fortran: FortranFree, unknown formatted Fortran: -# Fortran. In the later case the parser tries to guess whether the code is fixed -# or free formatted code, this is the default for Fortran type files), VHDL. For -# instance to make doxygen treat .inc files as Fortran files (default is PHP), -# and .f files as C (default is Fortran), use: inc=Fortran f=C. 
-# -# Note For files without extension you can use no_extension as a placeholder. -# -# Note that for custom extensions you also need to set FILE_PATTERNS otherwise -# the files are not read by doxygen. - -EXTENSION_MAPPING = - -# If the MARKDOWN_SUPPORT tag is enabled then doxygen pre-processes all comments -# according to the Markdown format, which allows for more readable -# documentation. See http://daringfireball.net/projects/markdown/ for details. -# The output of markdown processing is further processed by doxygen, so you can -# mix doxygen, HTML, and XML commands with Markdown formatting. Disable only in -# case of backward compatibilities issues. -# The default value is: YES. - -MARKDOWN_SUPPORT = YES - -# When enabled doxygen tries to link words that correspond to documented -# classes, or namespaces to their corresponding documentation. Such a link can -# be prevented in individual cases by by putting a % sign in front of the word -# or globally by setting AUTOLINK_SUPPORT to NO. -# The default value is: YES. - -AUTOLINK_SUPPORT = YES - -# If you use STL classes (i.e. std::string, std::vector, etc.) but do not want -# to include (a tag file for) the STL sources as input, then you should set this -# tag to YES in order to let doxygen match functions declarations and -# definitions whose arguments contain STL classes (e.g. func(std::string); -# versus func(std::string) {}). This also make the inheritance and collaboration -# diagrams that involve STL classes more complete and accurate. -# The default value is: NO. - -BUILTIN_STL_SUPPORT = NO - -# If you use Microsoft's C++/CLI language, you should set this option to YES to -# enable parsing support. -# The default value is: NO. - -CPP_CLI_SUPPORT = NO - -# Set the SIP_SUPPORT tag to YES if your project consists of sip (see: -# http://www.riverbankcomputing.co.uk/software/sip/intro) sources only. 
Doxygen -# will parse them like normal C++ but will assume all classes use public instead -# of private inheritance when no explicit protection keyword is present. -# The default value is: NO. - -SIP_SUPPORT = NO - -# For Microsoft's IDL there are propget and propput attributes to indicate -# getter and setter methods for a property. Setting this option to YES will make -# doxygen to replace the get and set methods by a property in the documentation. -# This will only work if the methods are indeed getting or setting a simple -# type. If this is not the case, or you want to show the methods anyway, you -# should set this option to NO. -# The default value is: YES. - -IDL_PROPERTY_SUPPORT = YES - -# If member grouping is used in the documentation and the DISTRIBUTE_GROUP_DOC -# tag is set to YES, then doxygen will reuse the documentation of the first -# member in the group (if any) for the other members of the group. By default -# all members of a group must be documented explicitly. -# The default value is: NO. - -DISTRIBUTE_GROUP_DOC = NO - -# Set the SUBGROUPING tag to YES to allow class member groups of the same type -# (for instance a group of public functions) to be put as a subgroup of that -# type (e.g. under the Public Functions section). Set it to NO to prevent -# subgrouping. Alternatively, this can be done per class using the -# \nosubgrouping command. -# The default value is: YES. - -SUBGROUPING = YES - -# When the INLINE_GROUPED_CLASSES tag is set to YES, classes, structs and unions -# are shown inside the group in which they are included (e.g. using \ingroup) -# instead of on a separate page (for HTML and Man pages) or section (for LaTeX -# and RTF). -# -# Note that this feature does not work in combination with -# SEPARATE_MEMBER_PAGES. -# The default value is: NO. 
- -INLINE_GROUPED_CLASSES = NO - -# When the INLINE_SIMPLE_STRUCTS tag is set to YES, structs, classes, and unions -# with only public data fields or simple typedef fields will be shown inline in -# the documentation of the scope in which they are defined (i.e. file, -# namespace, or group documentation), provided this scope is documented. If set -# to NO, structs, classes, and unions are shown on a separate page (for HTML and -# Man pages) or section (for LaTeX and RTF). -# The default value is: NO. - -INLINE_SIMPLE_STRUCTS = NO - -# When TYPEDEF_HIDES_STRUCT tag is enabled, a typedef of a struct, union, or -# enum is documented as struct, union, or enum with the name of the typedef. So -# typedef struct TypeS {} TypeT, will appear in the documentation as a struct -# with name TypeT. When disabled the typedef will appear as a member of a file, -# namespace, or class. And the struct will be named TypeS. This can typically be -# useful for C code in case the coding convention dictates that all compound -# types are typedef'ed and only the typedef is referenced, never the tag name. -# The default value is: NO. - -TYPEDEF_HIDES_STRUCT = NO - -# The size of the symbol lookup cache can be set using LOOKUP_CACHE_SIZE. This -# cache is used to resolve symbols given their name and scope. Since this can be -# an expensive process and often the same symbol appears multiple times in the -# code, doxygen keeps a cache of pre-resolved symbols. If the cache is too small -# doxygen will become slower. If the cache is too large, memory is wasted. The -# cache size is given by this formula: 2^(16+LOOKUP_CACHE_SIZE). The valid range -# is 0..9, the default is 0, corresponding to a cache size of 2^16=65536 -# symbols. At the end of a run doxygen will report the cache usage and suggest -# the optimal cache size from a speed point of view. -# Minimum value: 0, maximum value: 9, default value: 0. 
- -LOOKUP_CACHE_SIZE = 0 - -#--------------------------------------------------------------------------- -# Build related configuration options -#--------------------------------------------------------------------------- - -# If the EXTRACT_ALL tag is set to YES doxygen will assume all entities in -# documentation are documented, even if no documentation was available. Private -# class members and static file members will be hidden unless the -# EXTRACT_PRIVATE respectively EXTRACT_STATIC tags are set to YES. -# Note: This will also disable the warnings about undocumented members that are -# normally produced when WARNINGS is set to YES. -# The default value is: NO. - -EXTRACT_ALL = YES - -# If the EXTRACT_PRIVATE tag is set to YES all private members of a class will -# be included in the documentation. -# The default value is: NO. - -EXTRACT_PRIVATE = YES - -# If the EXTRACT_PACKAGE tag is set to YES all members with package or internal -# scope will be included in the documentation. -# The default value is: NO. - -EXTRACT_PACKAGE = NO - -# If the EXTRACT_STATIC tag is set to YES all static members of a file will be -# included in the documentation. -# The default value is: NO. - -EXTRACT_STATIC = YES - -# If the EXTRACT_LOCAL_CLASSES tag is set to YES classes (and structs) defined -# locally in source files will be included in the documentation. If set to NO -# only classes defined in header files are included. Does not have any effect -# for Java sources. -# The default value is: YES. - -EXTRACT_LOCAL_CLASSES = YES - -# This flag is only useful for Objective-C code. When set to YES local methods, -# which are defined in the implementation section but not in the interface are -# included in the documentation. If set to NO only methods in the interface are -# included. -# The default value is: NO. 
- -EXTRACT_LOCAL_METHODS = NO - -# If this flag is set to YES, the members of anonymous namespaces will be -# extracted and appear in the documentation as a namespace called -# 'anonymous_namespace{file}', where file will be replaced with the base name of -# the file that contains the anonymous namespace. By default anonymous namespace -# are hidden. -# The default value is: NO. - -EXTRACT_ANON_NSPACES = NO - -# If the HIDE_UNDOC_MEMBERS tag is set to YES, doxygen will hide all -# undocumented members inside documented classes or files. If set to NO these -# members will be included in the various overviews, but no documentation -# section is generated. This option has no effect if EXTRACT_ALL is enabled. -# The default value is: NO. - -HIDE_UNDOC_MEMBERS = NO - -# If the HIDE_UNDOC_CLASSES tag is set to YES, doxygen will hide all -# undocumented classes that are normally visible in the class hierarchy. If set -# to NO these classes will be included in the various overviews. This option has -# no effect if EXTRACT_ALL is enabled. -# The default value is: NO. - -HIDE_UNDOC_CLASSES = NO - -# If the HIDE_FRIEND_COMPOUNDS tag is set to YES, doxygen will hide all friend -# (class|struct|union) declarations. If set to NO these declarations will be -# included in the documentation. -# The default value is: NO. - -HIDE_FRIEND_COMPOUNDS = NO - -# If the HIDE_IN_BODY_DOCS tag is set to YES, doxygen will hide any -# documentation blocks found inside the body of a function. If set to NO these -# blocks will be appended to the function's detailed documentation block. -# The default value is: NO. - -HIDE_IN_BODY_DOCS = NO - -# The INTERNAL_DOCS tag determines if documentation that is typed after a -# \internal command is included. If the tag is set to NO then the documentation -# will be excluded. Set it to YES to include the internal documentation. -# The default value is: NO. 
- -INTERNAL_DOCS = NO - -# If the CASE_SENSE_NAMES tag is set to NO then doxygen will only generate file -# names in lower-case letters. If set to YES upper-case letters are also -# allowed. This is useful if you have classes or files whose names only differ -# in case and if your file system supports case sensitive file names. Windows -# and Mac users are advised to set this option to NO. -# The default value is: system dependent. - -CASE_SENSE_NAMES = NO - -# If the HIDE_SCOPE_NAMES tag is set to NO then doxygen will show members with -# their full class and namespace scopes in the documentation. If set to YES the -# scope will be hidden. -# The default value is: NO. - -HIDE_SCOPE_NAMES = NO - -# If the SHOW_INCLUDE_FILES tag is set to YES then doxygen will put a list of -# the files that are included by a file in the documentation of that file. -# The default value is: YES. - -SHOW_INCLUDE_FILES = YES - -# If the SHOW_GROUPED_MEMB_INC tag is set to YES then Doxygen will add for each -# grouped member an include statement to the documentation, telling the reader -# which file to include in order to use the member. -# The default value is: NO. - -SHOW_GROUPED_MEMB_INC = NO - -# If the FORCE_LOCAL_INCLUDES tag is set to YES then doxygen will list include -# files with double quotes in the documentation rather than with sharp brackets. -# The default value is: NO. - -FORCE_LOCAL_INCLUDES = NO - -# If the INLINE_INFO tag is set to YES then a tag [inline] is inserted in the -# documentation for inline members. -# The default value is: YES. - -INLINE_INFO = YES - -# If the SORT_MEMBER_DOCS tag is set to YES then doxygen will sort the -# (detailed) documentation of file and class members alphabetically by member -# name. If set to NO the members will appear in declaration order. -# The default value is: YES. 
- -SORT_MEMBER_DOCS = YES - -# If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the brief -# descriptions of file, namespace and class members alphabetically by member -# name. If set to NO the members will appear in declaration order. Note that -# this will also influence the order of the classes in the class list. -# The default value is: NO. - -SORT_BRIEF_DOCS = NO - -# If the SORT_MEMBERS_CTORS_1ST tag is set to YES then doxygen will sort the -# (brief and detailed) documentation of class members so that constructors and -# destructors are listed first. If set to NO the constructors will appear in the -# respective orders defined by SORT_BRIEF_DOCS and SORT_MEMBER_DOCS. -# Note: If SORT_BRIEF_DOCS is set to NO this option is ignored for sorting brief -# member documentation. -# Note: If SORT_MEMBER_DOCS is set to NO this option is ignored for sorting -# detailed member documentation. -# The default value is: NO. - -SORT_MEMBERS_CTORS_1ST = NO - -# If the SORT_GROUP_NAMES tag is set to YES then doxygen will sort the hierarchy -# of group names into alphabetical order. If set to NO the group names will -# appear in their defined order. -# The default value is: NO. - -SORT_GROUP_NAMES = NO - -# If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be sorted by -# fully-qualified names, including namespaces. If set to NO, the class list will -# be sorted only by class name, not including the namespace part. -# Note: This option is not very useful if HIDE_SCOPE_NAMES is set to YES. -# Note: This option applies only to the class list, not to the alphabetical -# list. -# The default value is: NO. 
- -SORT_BY_SCOPE_NAME = NO - -# If the STRICT_PROTO_MATCHING option is enabled and doxygen fails to do proper -# type resolution of all parameters of a function it will reject a match between -# the prototype and the implementation of a member function even if there is -# only one candidate or it is obvious which candidate to choose by doing a -# simple string match. By disabling STRICT_PROTO_MATCHING doxygen will still -# accept a match between prototype and implementation in such cases. -# The default value is: NO. - -STRICT_PROTO_MATCHING = NO - -# The GENERATE_TODOLIST tag can be used to enable ( YES) or disable ( NO) the -# todo list. This list is created by putting \todo commands in the -# documentation. -# The default value is: YES. - -GENERATE_TODOLIST = YES - -# The GENERATE_TESTLIST tag can be used to enable ( YES) or disable ( NO) the -# test list. This list is created by putting \test commands in the -# documentation. -# The default value is: YES. - -GENERATE_TESTLIST = YES - -# The GENERATE_BUGLIST tag can be used to enable ( YES) or disable ( NO) the bug -# list. This list is created by putting \bug commands in the documentation. -# The default value is: YES. - -GENERATE_BUGLIST = YES - -# The GENERATE_DEPRECATEDLIST tag can be used to enable ( YES) or disable ( NO) -# the deprecated list. This list is created by putting \deprecated commands in -# the documentation. -# The default value is: YES. - -GENERATE_DEPRECATEDLIST= YES - -# The ENABLED_SECTIONS tag can be used to enable conditional documentation -# sections, marked by \if <section_label> ... \endif and \cond <section_label> -# ... \endcond blocks. - -ENABLED_SECTIONS = - -# The MAX_INITIALIZER_LINES tag determines the maximum number of lines that the -# initial value of a variable or macro / define can have for it to appear in the -# documentation. If the initializer consists of more lines than specified here -# it will be hidden. Use a value of 0 to hide initializers completely. 
The -# appearance of the value of individual variables and macros / defines can be -# controlled using \showinitializer or \hideinitializer command in the -# documentation regardless of this setting. -# Minimum value: 0, maximum value: 10000, default value: 30. - -MAX_INITIALIZER_LINES = 30 - -# Set the SHOW_USED_FILES tag to NO to disable the list of files generated at -# the bottom of the documentation of classes and structs. If set to YES the list -# will mention the files that were used to generate the documentation. -# The default value is: YES. - -SHOW_USED_FILES = YES - -# Set the SHOW_FILES tag to NO to disable the generation of the Files page. This -# will remove the Files entry from the Quick Index and from the Folder Tree View -# (if specified). -# The default value is: YES. - -SHOW_FILES = YES - -# Set the SHOW_NAMESPACES tag to NO to disable the generation of the Namespaces -# page. This will remove the Namespaces entry from the Quick Index and from the -# Folder Tree View (if specified). -# The default value is: YES. - -SHOW_NAMESPACES = YES - -# The FILE_VERSION_FILTER tag can be used to specify a program or script that -# doxygen should invoke to get the current version for each file (typically from -# the version control system). Doxygen will invoke the program by executing (via -# popen()) the command command input-file, where command is the value of the -# FILE_VERSION_FILTER tag, and input-file is the name of an input file provided -# by doxygen. Whatever the program writes to standard output is used as the file -# version. For an example see the documentation. - -FILE_VERSION_FILTER = - -# The LAYOUT_FILE tag can be used to specify a layout file which will be parsed -# by doxygen. The layout file controls the global structure of the generated -# output files in an output format independent way. To create the layout file -# that represents doxygen's defaults, run doxygen with the -l option. 
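The FILE_VERSION_FILTER mechanism described above can be illustrated with a small sketch. This is a hypothetical example, not part of this configuration: it assumes the sources live in a git checkout and uses `git log` to print the short hash of the last commit that touched each input file, which doxygen then shows as that file's version.

```
# Hypothetical example (assumes a git checkout). Doxygen runs the command as
# `<command> <input-file>`, so each file is passed after the `--` pathspec
# separator and gets the short hash of the last commit that touched it.
FILE_VERSION_FILTER = "git log -n 1 --pretty=format:%h --"
```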
You can -# optionally specify a file name after the option, if omitted DoxygenLayout.xml -# will be used as the name of the layout file. -# -# Note that if you run doxygen from a directory containing a file called -# DoxygenLayout.xml, doxygen will parse it automatically even if the LAYOUT_FILE -# tag is left empty. - -LAYOUT_FILE = - -# The CITE_BIB_FILES tag can be used to specify one or more bib files containing -# the reference definitions. This must be a list of .bib files. The .bib -# extension is automatically appended if omitted. This requires the bibtex tool -# to be installed. See also http://en.wikipedia.org/wiki/BibTeX for more info. -# For LaTeX the style of the bibliography can be controlled using -# LATEX_BIB_STYLE. To use this feature you need bibtex and perl available in the -# search path. See also \cite for info how to create references. - -CITE_BIB_FILES = - -#--------------------------------------------------------------------------- -# Configuration options related to warning and progress messages -#--------------------------------------------------------------------------- - -# The QUIET tag can be used to turn on/off the messages that are generated to -# standard output by doxygen. If QUIET is set to YES this implies that the -# messages are off. -# The default value is: NO. - -QUIET = NO - -# The WARNINGS tag can be used to turn on/off the warning messages that are -# generated to standard error ( stderr) by doxygen. If WARNINGS is set to YES -# this implies that the warnings are on. -# -# Tip: Turn warnings on while writing the documentation. -# The default value is: YES. - -WARNINGS = YES - -# If the WARN_IF_UNDOCUMENTED tag is set to YES, then doxygen will generate -# warnings for undocumented members. If EXTRACT_ALL is set to YES then this flag -# will automatically be disabled. -# The default value is: YES. 
- -WARN_IF_UNDOCUMENTED = YES - -# If the WARN_IF_DOC_ERROR tag is set to YES, doxygen will generate warnings for -# potential errors in the documentation, such as not documenting some parameters -# in a documented function, or documenting parameters that don't exist or using -# markup commands wrongly. -# The default value is: YES. - -WARN_IF_DOC_ERROR = YES - -# This WARN_NO_PARAMDOC option can be enabled to get warnings for functions that -# are documented, but have no documentation for their parameters or return -# value. If set to NO doxygen will only warn about wrong or incomplete parameter -# documentation, but not about the absence of documentation. -# The default value is: NO. - -WARN_NO_PARAMDOC = NO - -# The WARN_FORMAT tag determines the format of the warning messages that doxygen -# can produce. The string should contain the $file, $line, and $text tags, which -# will be replaced by the file and line number from which the warning originated -# and the warning text. Optionally the format may contain $version, which will -# be replaced by the version of the file (if it could be obtained via -# FILE_VERSION_FILTER) -# The default value is: $file:$line: $text. - -WARN_FORMAT = "$file:$line: $text" - -# The WARN_LOGFILE tag can be used to specify a file to which warning and error -# messages should be written. If left blank the output is written to standard -# error (stderr). - -WARN_LOGFILE = - -#--------------------------------------------------------------------------- -# Configuration options related to the input files -#--------------------------------------------------------------------------- - -# The INPUT tag is used to specify the files and/or directories that contain -# documented source files. You may enter file names like myfile.cpp or -# directories like /usr/src/myproject. Separate the files or directories with -# spaces. -# Note: If this tag is empty the current directory is searched. 
- -INPUT = ../src - -# This tag can be used to specify the character encoding of the source files -# that doxygen parses. Internally doxygen uses the UTF-8 encoding. Doxygen uses -# libiconv (or the iconv built into libc) for the transcoding. See the libiconv -# documentation (see: http://www.gnu.org/software/libiconv) for the list of -# possible encodings. -# The default value is: UTF-8. - -INPUT_ENCODING = UTF-8 - -# If the value of the INPUT tag contains directories, you can use the -# FILE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp and -# *.h) to filter out the source-files in the directories. If left blank the -# following patterns are tested:*.c, *.cc, *.cxx, *.cpp, *.c++, *.java, *.ii, -# *.ixx, *.ipp, *.i++, *.inl, *.idl, *.ddl, *.odl, *.h, *.hh, *.hxx, *.hpp, -# *.h++, *.cs, *.d, *.php, *.php4, *.php5, *.phtml, *.inc, *.m, *.markdown, -# *.md, *.mm, *.dox, *.py, *.f90, *.f, *.for, *.tcl, *.vhd, *.vhdl, *.ucf, -# *.qsf, *.as and *.js. - -FILE_PATTERNS = *.h \ - *.c - -# The RECURSIVE tag can be used to specify whether or not subdirectories should -# be searched for input files as well. -# The default value is: NO. - -RECURSIVE = YES - -# The EXCLUDE tag can be used to specify files and/or directories that should be -# excluded from the INPUT source files. This way you can easily exclude a -# subdirectory from a directory tree whose root is specified with the INPUT tag. -# -# Note that relative paths are relative to the directory from which doxygen is -# run. - -EXCLUDE = - -# The EXCLUDE_SYMLINKS tag can be used to select whether or not files or -# directories that are symbolic links (a Unix file system feature) are excluded -# from the input. -# The default value is: NO. - -EXCLUDE_SYMLINKS = NO - -# If the value of the INPUT tag contains directories, you can use the -# EXCLUDE_PATTERNS tag to specify one or more wildcard patterns to exclude -# certain files from those directories. 
-# -# Note that the wildcards are matched against the file with absolute path, so to -# exclude all test directories for example use the pattern */test/* - -EXCLUDE_PATTERNS = - -# The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names -# (namespaces, classes, functions, etc.) that should be excluded from the -# output. The symbol name can be a fully qualified name, a word, or if the -# wildcard * is used, a substring. Examples: ANamespace, AClass, -# AClass::ANamespace, ANamespace::*Test -# -# Note that the wildcards are matched against the file with absolute path, so to -# exclude all test directories use the pattern */test/* - -EXCLUDE_SYMBOLS = - -# The EXAMPLE_PATH tag can be used to specify one or more files or directories -# that contain example code fragments that are included (see the \include -# command). - -EXAMPLE_PATH = ../examples \ - ../tests/src - -# If the value of the EXAMPLE_PATH tag contains directories, you can use the -# EXAMPLE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp and -# *.h) to filter out the source-files in the directories. If left blank all -# files are included. - -EXAMPLE_PATTERNS = - -# If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be -# searched for input files to be used with the \include or \dontinclude commands -# irrespective of the value of the RECURSIVE tag. -# The default value is: NO. - -EXAMPLE_RECURSIVE = NO - -# The IMAGE_PATH tag can be used to specify one or more files or directories -# that contain images that are to be included in the documentation (see the -# \image command). - -IMAGE_PATH = - -# The INPUT_FILTER tag can be used to specify a program that doxygen should -# invoke to filter each input file. Doxygen will invoke the filter program -# by executing (via popen()) the command: -# -# <filter> <input-file> -# -# where <filter> is the value of the INPUT_FILTER tag, and <input-file> is the -# name of an input file.
Doxygen will then use the output that the filter -# program writes to standard output. If FILTER_PATTERNS is specified, this tag -# will be ignored. -# -# Note that the filter must not add or remove lines; it is applied before the -# code is scanned, but not when the output code is generated. If lines are added -# or removed, the anchors will not be placed correctly. - -INPUT_FILTER = - -# The FILTER_PATTERNS tag can be used to specify filters on a per file pattern -# basis. Doxygen will compare the file name with each pattern and apply the -# filter if there is a match. The filters are a list of the form: pattern=filter -# (like *.cpp=my_cpp_filter). See INPUT_FILTER for further information on how -# filters are used. If the FILTER_PATTERNS tag is empty or if none of the -# patterns match the file name, INPUT_FILTER is applied. - -FILTER_PATTERNS = - -# If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using -# INPUT_FILTER ) will also be used to filter the input files that are used for -# producing the source files to browse (i.e. when SOURCE_BROWSER is set to YES). -# The default value is: NO. - -FILTER_SOURCE_FILES = NO - -# The FILTER_SOURCE_PATTERNS tag can be used to specify source filters per file -# pattern. A pattern will override the setting for FILTER_PATTERN (if any) and -# it is also possible to disable source filtering for a specific pattern using -# *.ext= (so without naming a filter). -# This tag requires that the tag FILTER_SOURCE_FILES is set to YES. - -FILTER_SOURCE_PATTERNS = - -# If the USE_MDFILE_AS_MAINPAGE tag refers to the name of a markdown file that -# is part of the input, its contents will be placed on the main page -# (index.html). This can be useful if you have a project on for instance GitHub -# and want to reuse the introduction page also for the doxygen output. 
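The filter mechanism described above can be sketched as follows. The script name fix_eol.sh is hypothetical, not part of this configuration; it stands for a filter that strips carriage returns with `sed 's/\r$//' "$1"`, which rewrites each line in place and therefore neither adds nor removes lines, as the INPUT_FILTER note requires.

```
# Hypothetical example: run only the headers through ./fix_eol.sh before
# scanning; files not matching any pattern fall back to INPUT_FILTER,
# which is left empty here, so they are scanned unfiltered.
INPUT_FILTER    =
FILTER_PATTERNS = *.h=./fix_eol.sh
```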
- -USE_MDFILE_AS_MAINPAGE = - -#--------------------------------------------------------------------------- -# Configuration options related to source browsing -#--------------------------------------------------------------------------- - -# If the SOURCE_BROWSER tag is set to YES then a list of source files will be -# generated. Documented entities will be cross-referenced with these sources. -# -# Note: To get rid of all source code in the generated output, make sure that -# also VERBATIM_HEADERS is set to NO. -# The default value is: NO. - -SOURCE_BROWSER = YES - -# Setting the INLINE_SOURCES tag to YES will include the body of functions, -# classes and enums directly into the documentation. -# The default value is: NO. - -INLINE_SOURCES = NO - -# Setting the STRIP_CODE_COMMENTS tag to YES will instruct doxygen to hide any -# special comment blocks from generated source code fragments. Normal C, C++ and -# Fortran comments will always remain visible. -# The default value is: YES. - -STRIP_CODE_COMMENTS = NO - -# If the REFERENCED_BY_RELATION tag is set to YES then for each documented -# function all documented functions referencing it will be listed. -# The default value is: NO. - -REFERENCED_BY_RELATION = YES - -# If the REFERENCES_RELATION tag is set to YES then for each documented function -# all documented entities called/used by that function will be listed. -# The default value is: NO. - -REFERENCES_RELATION = YES - -# If the REFERENCES_LINK_SOURCE tag is set to YES and SOURCE_BROWSER tag is set -# to YES, then the hyperlinks from functions in REFERENCES_RELATION and -# REFERENCED_BY_RELATION lists will link to the source code. Otherwise they will -# link to the documentation. -# The default value is: YES. 
- -REFERENCES_LINK_SOURCE = YES - -# If SOURCE_TOOLTIPS is enabled (the default) then hovering a hyperlink in the -# source code will show a tooltip with additional information such as prototype, -# brief description and links to the definition and documentation. Since this -# will make the HTML file larger and loading of large files a bit slower, you -# can opt to disable this feature. -# The default value is: YES. -# This tag requires that the tag SOURCE_BROWSER is set to YES. - -SOURCE_TOOLTIPS = YES - -# If the USE_HTAGS tag is set to YES then the references to source code will -# point to the HTML generated by the htags(1) tool instead of doxygen's built-in -# source browser. The htags tool is part of GNU's global source tagging system -# (see http://www.gnu.org/software/global/global.html). You will need version -# 4.8.6 or higher. -# -# To use it do the following: -# - Install the latest version of global -# - Enable SOURCE_BROWSER and USE_HTAGS in the config file -# - Make sure the INPUT points to the root of the source tree -# - Run doxygen as normal -# -# Doxygen will invoke htags (and that will in turn invoke gtags), so these -# tools must be available from the command line (i.e. in the search path). -# -# The result: instead of the source browser generated by doxygen, the links to -# source code will now point to the output of htags. -# The default value is: NO. -# This tag requires that the tag SOURCE_BROWSER is set to YES. - -USE_HTAGS = NO - -# If the VERBATIM_HEADERS tag is set to YES then doxygen will generate a -# verbatim copy of the header file for each class for which an include is -# specified. Set to NO to disable this. -# See also: Section \class. -# The default value is: YES. - -VERBATIM_HEADERS = YES - -# If the CLANG_ASSISTED_PARSING tag is set to YES, then doxygen will use the -# clang parser (see: http://clang.llvm.org/) for more accurate parsing at the -# cost of reduced performance.
This can be particularly helpful with template -# rich C++ code for which doxygen's built-in parser lacks the necessary type -# information. -# Note: The availability of this option depends on whether or not doxygen was -# compiled with the --with-libclang option. -# The default value is: NO. - -CLANG_ASSISTED_PARSING = NO - -# If clang assisted parsing is enabled you can provide the compiler with command -# line options that you would normally use when invoking the compiler. Note that -# the include paths will already be set by doxygen for the files and directories -# specified with INPUT and INCLUDE_PATH. -# This tag requires that the tag CLANG_ASSISTED_PARSING is set to YES. - -CLANG_OPTIONS = - -#--------------------------------------------------------------------------- -# Configuration options related to the alphabetical class index -#--------------------------------------------------------------------------- - -# If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index of all -# compounds will be generated. Enable this if the project contains a lot of -# classes, structs, unions or interfaces. -# The default value is: YES. - -ALPHABETICAL_INDEX = NO - -# The COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns in -# which the alphabetical index list will be split. -# Minimum value: 1, maximum value: 20, default value: 5. -# This tag requires that the tag ALPHABETICAL_INDEX is set to YES. - -COLS_IN_ALPHA_INDEX = 5 - -# In case all classes in a project start with a common prefix, all classes will -# be put under the same header in the alphabetical index. The IGNORE_PREFIX tag -# can be used to specify a prefix (or a list of prefixes) that should be ignored -# while generating the index headers. -# This tag requires that the tag ALPHABETICAL_INDEX is set to YES. 
- -IGNORE_PREFIX = - -#--------------------------------------------------------------------------- -# Configuration options related to the HTML output -#--------------------------------------------------------------------------- - -# If the GENERATE_HTML tag is set to YES doxygen will generate HTML output. -# The default value is: YES. - -GENERATE_HTML = YES - -# The HTML_OUTPUT tag is used to specify where the HTML docs will be put. If a -# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of -# it. -# The default directory is: html. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_OUTPUT = html - -# The HTML_FILE_EXTENSION tag can be used to specify the file extension for each -# generated HTML page (for example: .htm, .php, .asp). -# The default value is: .html. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_FILE_EXTENSION = .html - -# The HTML_HEADER tag can be used to specify a user-defined HTML header file for -# each generated HTML page. If the tag is left blank doxygen will generate a -# standard header. -# -# To get valid HTML, the header file must include any scripts and style sheets -# that doxygen needs, which depend on the configuration options used (e.g. -# the setting GENERATE_TREEVIEW). It is highly recommended to start with a -# default header using -# doxygen -w html new_header.html new_footer.html new_stylesheet.css -# YourConfigFile -# and then modify the file new_header.html. See also section "Doxygen usage" -# for information on how to generate the default header that doxygen normally -# uses. -# Note: The header is subject to change so you typically have to regenerate the -# default header when upgrading to a newer version of doxygen. For a description -# of the possible markers and block names see the documentation. -# This tag requires that the tag GENERATE_HTML is set to YES.
- -HTML_HEADER = - -# The HTML_FOOTER tag can be used to specify a user-defined HTML footer for each -# generated HTML page. If the tag is left blank doxygen will generate a standard -# footer. See HTML_HEADER for more information on how to generate a default -# footer and what special commands can be used inside the footer. See also -# section "Doxygen usage" for information on how to generate the default footer -# that doxygen normally uses. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_FOOTER = - -# The HTML_STYLESHEET tag can be used to specify a user-defined cascading style -# sheet that is used by each HTML page. It can be used to fine-tune the look of -# the HTML output. If left blank doxygen will generate a default style sheet. -# See also section "Doxygen usage" for information on how to generate the style -# sheet that doxygen normally uses. -# Note: It is recommended to use HTML_EXTRA_STYLESHEET instead of this tag, as -# it is more robust and this tag (HTML_STYLESHEET) will in the future become -# obsolete. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_STYLESHEET = - -# The HTML_EXTRA_STYLESHEET tag can be used to specify additional user-defined -# cascading style sheets that are included after the standard style sheets -# created by doxygen. Using this option one can overrule certain style aspects. -# This is preferred over using HTML_STYLESHEET since it does not replace the -# standard style sheet and is therefore more robust against future updates. -# Doxygen will copy the style sheet files to the output directory. -# Note: The order of the extra stylesheet files is of importance (e.g. the last -# stylesheet in the list overrules the setting of the previous ones in the -# list). For an example see the documentation. -# This tag requires that the tag GENERATE_HTML is set to YES.
- -HTML_EXTRA_STYLESHEET = - -# The HTML_EXTRA_FILES tag can be used to specify one or more extra images or -# other source files which should be copied to the HTML output directory. Note -# that these files will be copied to the base HTML output directory. Use the -# $relpath^ marker in the HTML_HEADER and/or HTML_FOOTER files to load these -# files. In the HTML_STYLESHEET file, use the file name only. Also note that the -# files will be copied as-is; there are no commands or markers available. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_EXTRA_FILES = - -# The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen -# will adjust the colors in the stylesheet and background images according to -# this color. Hue is specified as an angle on a colorwheel, see -# http://en.wikipedia.org/wiki/Hue for more information. For instance the value -# 0 represents red, 60 is yellow, 120 is green, 180 is cyan, 240 is blue, 300 -# purple, and 360 is red again. -# Minimum value: 0, maximum value: 359, default value: 220. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_COLORSTYLE_HUE = 220 - -# The HTML_COLORSTYLE_SAT tag controls the purity (or saturation) of the colors -# in the HTML output. For a value of 0 the output will use grayscales only. A -# value of 255 will produce the most vivid colors. -# Minimum value: 0, maximum value: 255, default value: 100. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_COLORSTYLE_SAT = 100 - -# The HTML_COLORSTYLE_GAMMA tag controls the gamma correction applied to the -# luminance component of the colors in the HTML output. Values below 100 -# gradually make the output lighter, whereas values above 100 make the output -# darker. The value divided by 100 is the actual gamma applied, so 80 represents -# a gamma of 0.8, The value 220 represents a gamma of 2.2, and 100 does not -# change the gamma. -# Minimum value: 40, maximum value: 240, default value: 80. 
-# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_COLORSTYLE_GAMMA = 80 - -# If the HTML_TIMESTAMP tag is set to YES then the footer of each generated HTML -# page will contain the date and time when the page was generated. Setting this -# to NO can help when comparing the output of multiple runs. -# The default value is: YES. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_TIMESTAMP = NO - -# If the HTML_DYNAMIC_SECTIONS tag is set to YES then the generated HTML -# documentation will contain sections that can be hidden and shown after the -# page has loaded. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_DYNAMIC_SECTIONS = NO - -# With HTML_INDEX_NUM_ENTRIES one can control the preferred number of entries -# shown in the various tree structured indices initially; the user can expand -# and collapse entries dynamically later on. Doxygen will expand the tree to -# such a level that at most the specified number of entries are visible (unless -# a fully collapsed tree already exceeds this amount). So setting the number of -# entries to 1 will produce a fully collapsed tree by default. 0 is a special value -# representing an infinite number of entries and will result in a fully expanded -# tree by default. -# Minimum value: 0, maximum value: 9999, default value: 100. -# This tag requires that the tag GENERATE_HTML is set to YES. - -HTML_INDEX_NUM_ENTRIES = 100 - -# If the GENERATE_DOCSET tag is set to YES, additional index files will be -# generated that can be used as input for Apple's Xcode 3 integrated development -# environment (see: http://developer.apple.com/tools/xcode/), introduced with -# OSX 10.5 (Leopard). To create a documentation set, doxygen will generate a -# Makefile in the HTML output directory.
Running make will produce the docset in -# that directory and running make install will install the docset in -# ~/Library/Developer/Shared/Documentation/DocSets so that Xcode will find it at -# startup. See http://developer.apple.com/tools/creatingdocsetswithdoxygen.html -# for more information. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -GENERATE_DOCSET = NO - -# This tag determines the name of the docset feed. A documentation feed provides -# an umbrella under which multiple documentation sets from a single provider -# (such as a company or product suite) can be grouped. -# The default value is: Doxygen generated docs. -# This tag requires that the tag GENERATE_DOCSET is set to YES. - -DOCSET_FEEDNAME = "Doxygen generated docs" - -# This tag specifies a string that should uniquely identify the documentation -# set bundle. This should be a reverse domain-name style string, e.g. -# com.mycompany.MyDocSet. Doxygen will append .docset to the name. -# The default value is: org.doxygen.Project. -# This tag requires that the tag GENERATE_DOCSET is set to YES. - -DOCSET_BUNDLE_ID = org.aubio.aubio - -# The DOCSET_PUBLISHER_ID tag specifies a string that should uniquely identify -# the documentation publisher. This should be a reverse domain-name style -# string, e.g. com.mycompany.MyDocSet.documentation. -# The default value is: org.doxygen.Publisher. -# This tag requires that the tag GENERATE_DOCSET is set to YES. - -DOCSET_PUBLISHER_ID = org.aubio.aubio.Maintainer - -# The DOCSET_PUBLISHER_NAME tag identifies the documentation publisher. -# The default value is: Publisher. -# This tag requires that the tag GENERATE_DOCSET is set to YES. - -DOCSET_PUBLISHER_NAME = Publisher - -# If the GENERATE_HTMLHELP tag is set to YES then doxygen generates three -# additional HTML index files: index.hhp, index.hhc, and index.hhk. 
The -# index.hhp is a project file that can be read by Microsoft's HTML Help Workshop -# (see: http://www.microsoft.com/en-us/download/details.aspx?id=21138) on -# Windows. -# -# The HTML Help Workshop contains a compiler that can convert all HTML output -# generated by doxygen into a single compiled HTML file (.chm). Compiled HTML -# files are now used as the Windows 98 help format, and will replace the old -# Windows help format (.hlp) on all Windows platforms in the future. Compressed -# HTML files also contain an index, a table of contents, and you can search for -# words in the documentation. The HTML workshop also contains a viewer for -# compressed HTML files. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -GENERATE_HTMLHELP = NO - -# The CHM_FILE tag can be used to specify the file name of the resulting .chm -# file. You can add a path in front of the file if the result should not be -# written to the html output directory. -# This tag requires that the tag GENERATE_HTMLHELP is set to YES. - -CHM_FILE = - -# The HHC_LOCATION tag can be used to specify the location (absolute path -# including file name) of the HTML help compiler ( hhc.exe). If non-empty -# doxygen will try to run the HTML help compiler on the generated index.hhp. -# The file has to be specified with full path. -# This tag requires that the tag GENERATE_HTMLHELP is set to YES. - -HHC_LOCATION = - -# The GENERATE_CHI flag controls if a separate .chi index file is generated ( -# YES) or that it should be included in the master .chm file ( NO). -# The default value is: NO. -# This tag requires that the tag GENERATE_HTMLHELP is set to YES. - -GENERATE_CHI = NO - -# The CHM_INDEX_ENCODING is used to encode HtmlHelp index ( hhk), content ( hhc) -# and project file content. -# This tag requires that the tag GENERATE_HTMLHELP is set to YES. 
- -CHM_INDEX_ENCODING = - -# The BINARY_TOC flag controls whether a binary table of contents is generated ( -# YES) or a normal table of contents ( NO) in the .chm file. Furthermore it -# enables the Previous and Next buttons. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTMLHELP is set to YES. - -BINARY_TOC = NO - -# The TOC_EXPAND flag can be set to YES to add extra items for group members to -# the table of contents of the HTML help documentation and to the tree view. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTMLHELP is set to YES. - -TOC_EXPAND = NO - -# If the GENERATE_QHP tag is set to YES and both QHP_NAMESPACE and -# QHP_VIRTUAL_FOLDER are set, an additional index file will be generated that -# can be used as input for Qt's qhelpgenerator to generate a Qt Compressed Help -# (.qch) of the generated HTML documentation. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -GENERATE_QHP = NO - -# If the QHG_LOCATION tag is specified, the QCH_FILE tag can be used to specify -# the file name of the resulting .qch file. The path specified is relative to -# the HTML output folder. -# This tag requires that the tag GENERATE_QHP is set to YES. - -QCH_FILE = - -# The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help -# Project output. For more information please see Qt Help Project / Namespace -# (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#namespace). -# The default value is: org.doxygen.Project. -# This tag requires that the tag GENERATE_QHP is set to YES. - -QHP_NAMESPACE = - -# The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating Qt -# Help Project output. For more information please see Qt Help Project / Virtual -# Folders (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#virtual- -# folders). -# The default value is: doc. -# This tag requires that the tag GENERATE_QHP is set to YES. 
- -QHP_VIRTUAL_FOLDER = doc - -# If the QHP_CUST_FILTER_NAME tag is set, it specifies the name of a custom -# filter to add. For more information please see Qt Help Project / Custom -# Filters (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#custom- -# filters). -# This tag requires that the tag GENERATE_QHP is set to YES. - -QHP_CUST_FILTER_NAME = - -# The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the -# custom filter to add. For more information please see Qt Help Project / Custom -# Filters (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#custom- -# filters). -# This tag requires that the tag GENERATE_QHP is set to YES. - -QHP_CUST_FILTER_ATTRS = - -# The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this -# project's filter section matches. Qt Help Project / Filter Attributes (see: -# http://qt-project.org/doc/qt-4.8/qthelpproject.html#filter-attributes). -# This tag requires that the tag GENERATE_QHP is set to YES. - -QHP_SECT_FILTER_ATTRS = - -# The QHG_LOCATION tag can be used to specify the location of Qt's -# qhelpgenerator. If non-empty doxygen will try to run qhelpgenerator on the -# generated .qhp file. -# This tag requires that the tag GENERATE_QHP is set to YES. - -QHG_LOCATION = - -# If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files will be -# generated, together with the HTML files, they form an Eclipse help plugin. To -# install this plugin and make it available under the help contents menu in -# Eclipse, the contents of the directory containing the HTML and XML files needs -# to be copied into the plugins directory of eclipse. The name of the directory -# within the plugins directory should be the same as the ECLIPSE_DOC_ID value. -# After copying Eclipse needs to be restarted before the help appears. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. 
- -GENERATE_ECLIPSEHELP = NO - -# A unique identifier for the Eclipse help plugin. When installing the plugin -# the directory name containing the HTML and XML files should also have this -# name. Each documentation set should have its own identifier. -# The default value is: org.doxygen.Project. -# This tag requires that the tag GENERATE_ECLIPSEHELP is set to YES. - -ECLIPSE_DOC_ID = org.aubio.aubio - -# If you want full control over the layout of the generated HTML pages it might -# be necessary to disable the index and replace it with your own. The -# DISABLE_INDEX tag can be used to turn on/off the condensed index (tabs) at top -# of each HTML page. A value of NO enables the index and the value YES disables -# it. Since the tabs in the index contain the same information as the navigation -# tree, you can set this option to YES if you also set GENERATE_TREEVIEW to YES. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -DISABLE_INDEX = NO - -# The GENERATE_TREEVIEW tag is used to specify whether a tree-like index -# structure should be generated to display hierarchical information. If the tag -# value is set to YES, a side panel will be generated containing a tree-like -# index structure (just like the one that is generated for HTML Help). For this -# to work a browser that supports JavaScript, DHTML, CSS and frames is required -# (i.e. any modern browser). Windows users are probably better off using the -# HTML help feature. Via custom stylesheets (see HTML_EXTRA_STYLESHEET) one can -# further fine-tune the look of the index. As an example, the default style -# sheet generated by doxygen has an example that shows how to put an image at -# the root of the tree instead of the PROJECT_NAME. Since the tree basically has -# the same information as the tab index, you could consider setting -# DISABLE_INDEX to YES when enabling this option. -# The default value is: NO. 
-# This tag requires that the tag GENERATE_HTML is set to YES. - -GENERATE_TREEVIEW = NO - -# The ENUM_VALUES_PER_LINE tag can be used to set the number of enum values that -# doxygen will group on one line in the generated HTML documentation. -# -# Note that a value of 0 will completely suppress the enum values from appearing -# in the overview section. -# Minimum value: 0, maximum value: 20, default value: 4. -# This tag requires that the tag GENERATE_HTML is set to YES. - -ENUM_VALUES_PER_LINE = 4 - -# If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be used -# to set the initial width (in pixels) of the frame in which the tree is shown. -# Minimum value: 0, maximum value: 1500, default value: 250. -# This tag requires that the tag GENERATE_HTML is set to YES. - -TREEVIEW_WIDTH = 250 - -# When the EXT_LINKS_IN_WINDOW option is set to YES doxygen will open links to -# external symbols imported via tag files in a separate window. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -EXT_LINKS_IN_WINDOW = NO - -# Use this tag to change the font size of LaTeX formulas included as images in -# the HTML documentation. When you change the font size after a successful -# doxygen run you need to manually remove any form_*.png images from the HTML -# output directory to force them to be regenerated. -# Minimum value: 8, maximum value: 50, default value: 10. -# This tag requires that the tag GENERATE_HTML is set to YES. - -FORMULA_FONTSIZE = 10 - -# Use the FORMULA_TRANSPARENT tag to determine whether or not the images -# generated for formulas are transparent PNGs. Transparent PNGs are not -# supported properly for IE 6.0, but are supported on all modern browsers. -# -# Note that when changing this option you need to delete any form_*.png files in -# the HTML output directory before the changes have effect. -# The default value is: YES. -# This tag requires that the tag GENERATE_HTML is set to YES.
- -FORMULA_TRANSPARENT = YES - -# Enable the USE_MATHJAX option to render LaTeX formulas using MathJax (see -# http://www.mathjax.org) which uses client side Javascript for the rendering -# instead of using prerendered bitmaps. Use this if you do not have LaTeX -# installed or if you want the formulas to look prettier in the HTML output. When -# enabled you may also need to install MathJax separately and configure the path -# to it using the MATHJAX_RELPATH option. -# The default value is: NO. -# This tag requires that the tag GENERATE_HTML is set to YES. - -USE_MATHJAX = YES - -# When MathJax is enabled you can set the default output format to be used for -# the MathJax output. See the MathJax site (see: -# http://docs.mathjax.org/en/latest/output.html) for more details. -# Possible values are: HTML-CSS (which is slower, but has the best -# compatibility), NativeMML (i.e. MathML) and SVG. -# The default value is: HTML-CSS. -# This tag requires that the tag USE_MATHJAX is set to YES. - -MATHJAX_FORMAT = HTML-CSS - -# When MathJax is enabled you need to specify the location relative to the HTML -# output directory using the MATHJAX_RELPATH option. The destination directory -# should contain the MathJax.js script. For instance, if the mathjax directory -# is located at the same level as the HTML output directory, then -# MATHJAX_RELPATH should be ../mathjax. The default value points to the MathJax -# Content Delivery Network so you can quickly see the result without installing -# MathJax. However, it is strongly recommended to install a local copy of -# MathJax from http://www.mathjax.org before deployment. -# The default value is: http://cdn.mathjax.org/mathjax/latest. -# This tag requires that the tag USE_MATHJAX is set to YES. - -MATHJAX_RELPATH = http://cdn.mathjax.org/mathjax/latest - -# The MATHJAX_EXTENSIONS tag can be used to specify one or more MathJax -# extension names that should be enabled during MathJax rendering.
For example -# MATHJAX_EXTENSIONS = TeX/AMSmath TeX/AMSsymbols -# This tag requires that the tag USE_MATHJAX is set to YES. - -MATHJAX_EXTENSIONS = - -# The MATHJAX_CODEFILE tag can be used to specify a file with javascript pieces -# of code that will be used on startup of the MathJax code. See the MathJax site -# (see: http://docs.mathjax.org/en/latest/output.html) for more details. For an -# example see the documentation. -# This tag requires that the tag USE_MATHJAX is set to YES. - -MATHJAX_CODEFILE = - -# When the SEARCHENGINE tag is enabled doxygen will generate a search box for -# the HTML output. The underlying search engine uses javascript and DHTML and -# should work on any modern browser. Note that when using HTML help -# (GENERATE_HTMLHELP), Qt help (GENERATE_QHP), or docsets (GENERATE_DOCSET) -# there is already a search function so this one should typically be disabled. -# For large projects the javascript based search engine can be slow, then -# enabling SERVER_BASED_SEARCH may provide a better solution. It is possible to -# search using the keyboard; to jump to the search box use <access key> + S -# (what the <access key> is depends on the OS and browser, but it is typically -# <CTRL>, <ALT>/<option>, or both). Inside the search box use the <cursor down -# key> to jump into the search results window, the results can be navigated -# using the <cursor keys>. Press <Enter> to select an item or <escape> to cancel -# the search. The filter options can be selected when the cursor is inside the -# search box by pressing <Shift>+<cursor down>. Also here use the <cursor keys> -# to select a filter and <Enter> or <escape> to activate or cancel the filter -# option. -# The default value is: YES. -# This tag requires that the tag GENERATE_HTML is set to YES. - -SEARCHENGINE = YES - -# When the SERVER_BASED_SEARCH tag is enabled the search engine will be -# implemented using a web server instead of a web client using Javascript. 
There -# are two flavors of web server based searching depending on the EXTERNAL_SEARCH -# setting. When disabled, doxygen will generate a PHP script for searching and -# an index file used by the script. When EXTERNAL_SEARCH is enabled the indexing -# and searching needs to be provided by external tools. See the section -# "External Indexing and Searching" for details. -# The default value is: NO. -# This tag requires that the tag SEARCHENGINE is set to YES. - -SERVER_BASED_SEARCH = NO - -# When EXTERNAL_SEARCH tag is enabled doxygen will no longer generate the PHP -# script for searching. Instead the search results are written to an XML file -# which needs to be processed by an external indexer. Doxygen will invoke an -# external search engine pointed to by the SEARCHENGINE_URL option to obtain the -# search results. -# -# Doxygen ships with an example indexer ( doxyindexer) and search engine -# (doxysearch.cgi) which are based on the open source search engine library -# Xapian (see: http://xapian.org/). -# -# See the section "External Indexing and Searching" for details. -# The default value is: NO. -# This tag requires that the tag SEARCHENGINE is set to YES. - -EXTERNAL_SEARCH = NO - -# The SEARCHENGINE_URL should point to a search engine hosted by a web server -# which will return the search results when EXTERNAL_SEARCH is enabled. -# -# Doxygen ships with an example indexer ( doxyindexer) and search engine -# (doxysearch.cgi) which are based on the open source search engine library -# Xapian (see: http://xapian.org/). See the section "External Indexing and -# Searching" for details. -# This tag requires that the tag SEARCHENGINE is set to YES. - -SEARCHENGINE_URL = - -# When SERVER_BASED_SEARCH and EXTERNAL_SEARCH are both enabled the unindexed -# search data is written to a file for indexing by an external tool. With the -# SEARCHDATA_FILE tag the name of this file can be specified. -# The default file is: searchdata.xml. 
-# This tag requires that the tag SEARCHENGINE is set to YES. - -SEARCHDATA_FILE = searchdata.xml - -# When SERVER_BASED_SEARCH and EXTERNAL_SEARCH are both enabled the -# EXTERNAL_SEARCH_ID tag can be used as an identifier for the project. This is -# useful in combination with EXTRA_SEARCH_MAPPINGS to search through multiple -# projects and redirect the results back to the right project. -# This tag requires that the tag SEARCHENGINE is set to YES. - -EXTERNAL_SEARCH_ID = - -# The EXTRA_SEARCH_MAPPINGS tag can be used to enable searching through doxygen -# projects other than the one defined by this configuration file, but that are -# all added to the same external search index. Each project needs to have a -# unique id set via EXTERNAL_SEARCH_ID. The search mapping then maps the id of a -# project to a relative location where the documentation can be found. The format is: -# EXTRA_SEARCH_MAPPINGS = tagname1=loc1 tagname2=loc2 ... -# This tag requires that the tag SEARCHENGINE is set to YES. - -EXTRA_SEARCH_MAPPINGS = - -#--------------------------------------------------------------------------- -# Configuration options related to the LaTeX output -#--------------------------------------------------------------------------- - -# If the GENERATE_LATEX tag is set to YES doxygen will generate LaTeX output. -# The default value is: YES. - -GENERATE_LATEX = NO - -# The LATEX_OUTPUT tag is used to specify where the LaTeX docs will be put. If a -# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of -# it. -# The default directory is: latex. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_OUTPUT = latex - -# The LATEX_CMD_NAME tag can be used to specify the LaTeX command name to be -# invoked. -# -# Note that when enabling USE_PDFLATEX this option is only used for generating -# bitmaps for formulas in the HTML output, but not in the Makefile that is -# written to the output directory. -# The default file is: latex.
-# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_CMD_NAME = latex - -# The MAKEINDEX_CMD_NAME tag can be used to specify the command name to generate -# index for LaTeX. -# The default file is: makeindex. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -MAKEINDEX_CMD_NAME = makeindex - -# If the COMPACT_LATEX tag is set to YES doxygen generates more compact LaTeX -# documents. This may be useful for small projects and may help to save some -# trees in general. -# The default value is: NO. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -COMPACT_LATEX = NO - -# The PAPER_TYPE tag can be used to set the paper type that is used by the -# printer. -# Possible values are: a4 (210 x 297 mm), letter (8.5 x 11 inches), legal (8.5 x -# 14 inches) and executive (7.25 x 10.5 inches). -# The default value is: a4. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -PAPER_TYPE = a4 - -# The EXTRA_PACKAGES tag can be used to specify one or more LaTeX package names -# that should be included in the LaTeX output. To get the times font for -# instance you can specify -# EXTRA_PACKAGES=times -# If left blank no extra packages will be included. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -EXTRA_PACKAGES = - -# The LATEX_HEADER tag can be used to specify a personal LaTeX header for the -# generated LaTeX document. The header should contain everything until the first -# chapter. If it is left blank doxygen will generate a standard header. See -# section "Doxygen usage" for information on how to let doxygen write the -# default header to a separate file. -# -# Note: Only use a user-defined header if you know what you are doing! The -# following commands have a special meaning inside the header: $title, -# $datetime, $date, $doxygenversion, $projectname, $projectnumber, -# $projectbrief, $projectlogo. 
Doxygen will replace $title with the empty string; -# for the replacement values of the other commands the user is referred to -# HTML_HEADER. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_HEADER = - -# The LATEX_FOOTER tag can be used to specify a personal LaTeX footer for the -# generated LaTeX document. The footer should contain everything after the last -# chapter. If it is left blank doxygen will generate a standard footer. See -# LATEX_HEADER for more information on how to generate a default footer and what -# special commands can be used inside the footer. -# -# Note: Only use a user-defined footer if you know what you are doing! -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_FOOTER = - -# The LATEX_EXTRA_FILES tag can be used to specify one or more extra images or -# other source files which should be copied to the LATEX_OUTPUT output -# directory. Note that the files will be copied as-is; there are no commands or -# markers available. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_EXTRA_FILES = - -# If the PDF_HYPERLINKS tag is set to YES, the LaTeX that is generated is -# prepared for conversion to PDF (using ps2pdf or pdflatex). The PDF file will -# contain links (just like the HTML output) instead of page references. This -# makes the output suitable for online browsing using a PDF viewer. -# The default value is: YES. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -PDF_HYPERLINKS = YES - -# If the USE_PDFLATEX tag is set to YES, doxygen will use pdflatex to generate -# the PDF file directly from the LaTeX files. Set this option to YES to get a -# higher quality PDF documentation. -# The default value is: YES. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -USE_PDFLATEX = YES - -# If the LATEX_BATCHMODE tag is set to YES, doxygen will add the \batchmode -# command to the generated LaTeX files.
This will instruct LaTeX to keep running -# if errors occur, instead of asking the user for help. This option is also used -# when generating formulas in HTML. -# The default value is: NO. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_BATCHMODE = NO - -# If the LATEX_HIDE_INDICES tag is set to YES then doxygen will not include the -# index chapters (such as File Index, Compound Index, etc.) in the output. -# The default value is: NO. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_HIDE_INDICES = NO - -# If the LATEX_SOURCE_CODE tag is set to YES then doxygen will include source -# code with syntax highlighting in the LaTeX output. -# -# Note that which sources are shown also depends on other settings such as -# SOURCE_BROWSER. -# The default value is: NO. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_SOURCE_CODE = NO - -# The LATEX_BIB_STYLE tag can be used to specify the style to use for the -# bibliography, e.g. plainnat, or ieeetr. See -# http://en.wikipedia.org/wiki/BibTeX and \cite for more info. -# The default value is: plain. -# This tag requires that the tag GENERATE_LATEX is set to YES. - -LATEX_BIB_STYLE = plain - -#--------------------------------------------------------------------------- -# Configuration options related to the RTF output -#--------------------------------------------------------------------------- - -# If the GENERATE_RTF tag is set to YES doxygen will generate RTF output. The -# RTF output is optimized for Word 97 and may not look too pretty with other RTF -# readers/editors. -# The default value is: NO. - -GENERATE_RTF = NO - -# The RTF_OUTPUT tag is used to specify where the RTF docs will be put. If a -# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of -# it. -# The default directory is: rtf. -# This tag requires that the tag GENERATE_RTF is set to YES. 
- -RTF_OUTPUT = rtf - -# If the COMPACT_RTF tag is set to YES doxygen generates more compact RTF -# documents. This may be useful for small projects and may help to save some -# trees in general. -# The default value is: NO. -# This tag requires that the tag GENERATE_RTF is set to YES. - -COMPACT_RTF = NO - -# If the RTF_HYPERLINKS tag is set to YES, the RTF that is generated will -# contain hyperlink fields. The RTF file will contain links (just like the HTML -# output) instead of page references. This makes the output suitable for online -# browsing using Word or some other Word compatible readers that support those -# fields. -# -# Note: WordPad (write) and others do not support links. -# The default value is: NO. -# This tag requires that the tag GENERATE_RTF is set to YES. - -RTF_HYPERLINKS = NO - -# Load stylesheet definitions from file. Syntax is similar to doxygen's config -# file, i.e. a series of assignments. You only have to provide replacements, -# missing definitions are set to their default value. -# -# See also section "Doxygen usage" for information on how to generate the -# default style sheet that doxygen normally uses. -# This tag requires that the tag GENERATE_RTF is set to YES. - -RTF_STYLESHEET_FILE = - -# Set optional variables used in the generation of an RTF document. Syntax is -# similar to doxygen's config file. A template extensions file can be generated -# using doxygen -e rtf extensionFile. -# This tag requires that the tag GENERATE_RTF is set to YES. - -RTF_EXTENSIONS_FILE = - -#--------------------------------------------------------------------------- -# Configuration options related to the man page output -#--------------------------------------------------------------------------- - -# If the GENERATE_MAN tag is set to YES doxygen will generate man pages for -# classes and files. -# The default value is: NO. - -GENERATE_MAN = NO - -# The MAN_OUTPUT tag is used to specify where the man pages will be put. 
If a -# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of -# it. A directory man3 will be created inside the directory specified by -# MAN_OUTPUT. -# The default directory is: man. -# This tag requires that the tag GENERATE_MAN is set to YES. - -MAN_OUTPUT = man - -# The MAN_EXTENSION tag determines the extension that is added to the generated -# man pages. In case the manual section does not start with a number, the number -# 3 is prepended. The dot (.) at the beginning of the MAN_EXTENSION tag is -# optional. -# The default value is: .3. -# This tag requires that the tag GENERATE_MAN is set to YES. - -MAN_EXTENSION = .3 - -# The MAN_SUBDIR tag determines the name of the directory created within -# MAN_OUTPUT in which the man pages are placed. It defaults to man followed by -# MAN_EXTENSION with the initial . removed. -# This tag requires that the tag GENERATE_MAN is set to YES. - -MAN_SUBDIR = - -# If the MAN_LINKS tag is set to YES and doxygen generates man output, then it -# will generate one additional man file for each entity documented in the real -# man page(s). These additional files only source the real man page, but without -# them the man command would be unable to find the correct page. -# The default value is: NO. -# This tag requires that the tag GENERATE_MAN is set to YES. - -MAN_LINKS = NO - -#--------------------------------------------------------------------------- -# Configuration options related to the XML output -#--------------------------------------------------------------------------- - -# If the GENERATE_XML tag is set to YES doxygen will generate an XML file that -# captures the structure of the code including all documentation. -# The default value is: NO. - -GENERATE_XML = NO - -# The XML_OUTPUT tag is used to specify where the XML pages will be put. If a -# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of -# it. -# The default directory is: xml.
-# This tag requires that the tag GENERATE_XML is set to YES. - -XML_OUTPUT = xml - -# If the XML_PROGRAMLISTING tag is set to YES doxygen will dump the program -# listings (including syntax highlighting and cross-referencing information) to -# the XML output. Note that enabling this will significantly increase the size -# of the XML output. -# The default value is: YES. -# This tag requires that the tag GENERATE_XML is set to YES. - -XML_PROGRAMLISTING = YES - -#--------------------------------------------------------------------------- -# Configuration options related to the DOCBOOK output -#--------------------------------------------------------------------------- - -# If the GENERATE_DOCBOOK tag is set to YES doxygen will generate Docbook files -# that can be used to generate PDF. -# The default value is: NO. - -GENERATE_DOCBOOK = NO - -# The DOCBOOK_OUTPUT tag is used to specify where the Docbook pages will be put. -# If a relative path is entered the value of OUTPUT_DIRECTORY will be put in -# front of it. -# The default directory is: docbook. -# This tag requires that the tag GENERATE_DOCBOOK is set to YES. - -DOCBOOK_OUTPUT = docbook - -# If the DOCBOOK_PROGRAMLISTING tag is set to YES doxygen will include the -# program listings (including syntax highlighting and cross-referencing -# information) to the DOCBOOK output. Note that enabling this will significantly -# increase the size of the DOCBOOK output. -# The default value is: NO. -# This tag requires that the tag GENERATE_DOCBOOK is set to YES. - -DOCBOOK_PROGRAMLISTING = NO - -#--------------------------------------------------------------------------- -# Configuration options for the AutoGen Definitions output -#--------------------------------------------------------------------------- - -# If the GENERATE_AUTOGEN_DEF tag is set to YES doxygen will generate an AutoGen -# Definitions (see http://autogen.sf.net) file that captures the structure of -# the code including all documentation. 
Note that this feature is still -# experimental and incomplete at the moment. -# The default value is: NO. - -GENERATE_AUTOGEN_DEF = NO - -#--------------------------------------------------------------------------- -# Configuration options related to the Perl module output -#--------------------------------------------------------------------------- - -# If the GENERATE_PERLMOD tag is set to YES doxygen will generate a Perl module -# file that captures the structure of the code including all documentation. -# -# Note that this feature is still experimental and incomplete at the moment. -# The default value is: NO. - -GENERATE_PERLMOD = NO - -# If the PERLMOD_LATEX tag is set to YES doxygen will generate the necessary -# Makefile rules, Perl scripts and LaTeX code to be able to generate PDF and DVI -# output from the Perl module output. -# The default value is: NO. -# This tag requires that the tag GENERATE_PERLMOD is set to YES. - -PERLMOD_LATEX = NO - -# If the PERLMOD_PRETTY tag is set to YES the Perl module output will be nicely -# formatted so it can be parsed by a human reader. This is useful if you want to -# understand what is going on. On the other hand, if this tag is set to NO the -# size of the Perl module output will be much smaller and Perl will parse it -# just the same. -# The default value is: YES. -# This tag requires that the tag GENERATE_PERLMOD is set to YES. - -PERLMOD_PRETTY = YES - -# The names of the make variables in the generated doxyrules.make file are -# prefixed with the string contained in PERLMOD_MAKEVAR_PREFIX. This is useful -# so different doxyrules.make files included by the same Makefile don't -# overwrite each other's variables. -# This tag requires that the tag GENERATE_PERLMOD is set to YES. 
- -PERLMOD_MAKEVAR_PREFIX = - -#--------------------------------------------------------------------------- -# Configuration options related to the preprocessor -#--------------------------------------------------------------------------- - -# If the ENABLE_PREPROCESSING tag is set to YES doxygen will evaluate all -# C-preprocessor directives found in the sources and include files. -# The default value is: YES. - -ENABLE_PREPROCESSING = YES - -# If the MACRO_EXPANSION tag is set to YES doxygen will expand all macro names -# in the source code. If set to NO only conditional compilation will be -# performed. Macro expansion can be done in a controlled way by setting -# EXPAND_ONLY_PREDEF to YES. -# The default value is: NO. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES. - -MACRO_EXPANSION = NO - -# If the EXPAND_ONLY_PREDEF and MACRO_EXPANSION tags are both set to YES then -# the macro expansion is limited to the macros specified with the PREDEFINED and -# EXPAND_AS_DEFINED tags. -# The default value is: NO. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES. - -EXPAND_ONLY_PREDEF = NO - -# If the SEARCH_INCLUDES tag is set to YES the include files in the -# INCLUDE_PATH will be searched if a #include is found. -# The default value is: YES. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES. - -SEARCH_INCLUDES = YES - -# The INCLUDE_PATH tag can be used to specify one or more directories that -# contain include files that are not input files but should be processed by the -# preprocessor. -# This tag requires that the tag SEARCH_INCLUDES is set to YES. - -INCLUDE_PATH = - -# You can use the INCLUDE_FILE_PATTERNS tag to specify one or more wildcard -# patterns (like *.h and *.hpp) to filter out the header-files in the -# directories. If left blank, the patterns specified with FILE_PATTERNS will be -# used. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
- -INCLUDE_FILE_PATTERNS = - -# The PREDEFINED tag can be used to specify one or more macro names that are -# defined before the preprocessor is started (similar to the -D option of e.g. -# gcc). The argument of the tag is a list of macros of the form: name or -# name=definition (no spaces). If the definition and the "=" are omitted, "=1" -# is assumed. To prevent a macro definition from being undefined via #undef or -# recursively expanded use the := operator instead of the = operator. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES. - -PREDEFINED = - -# If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this -# tag can be used to specify a list of macro names that should be expanded. The -# macro definition that is found in the sources will be used. Use the PREDEFINED -# tag if you want to use a different macro definition that overrules the -# definition found in the source code. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES. - -EXPAND_AS_DEFINED = - -# If the SKIP_FUNCTION_MACROS tag is set to YES then doxygen's preprocessor will -# remove all references to function-like macros that are alone on a line, have -# an all uppercase name, and do not end with a semicolon. Such function macros -# are typically used for boiler-plate code, and will confuse the parser if not -# removed. -# The default value is: YES. -# This tag requires that the tag ENABLE_PREPROCESSING is set to YES. - -SKIP_FUNCTION_MACROS = YES - -#--------------------------------------------------------------------------- -# Configuration options related to external references -#--------------------------------------------------------------------------- - -# The TAGFILES tag can be used to specify one or more tag files. For each tag -# file the location of the external documentation should be added. The format of -# a tag file without this location is as follows: -# TAGFILES = file1 file2 ... 
-# Adding location for the tag files is done as follows: -# TAGFILES = file1=loc1 "file2 = loc2" ... -# where loc1 and loc2 can be relative or absolute paths or URLs. See the -# section "Linking to external documentation" for more information about the use -# of tag files. -# Note: Each tag file must have a unique name (where the name does NOT include -# the path). If a tag file is not located in the directory in which doxygen is -# run, you must also specify the path to the tagfile here. - -TAGFILES = - -# When a file name is specified after GENERATE_TAGFILE, doxygen will create a -# tag file that is based on the input files it reads. See section "Linking to -# external documentation" for more information about the usage of tag files. - -GENERATE_TAGFILE = - -# If the ALLEXTERNALS tag is set to YES all external class will be listed in the -# class index. If set to NO only the inherited external classes will be listed. -# The default value is: NO. - -ALLEXTERNALS = NO - -# If the EXTERNAL_GROUPS tag is set to YES all external groups will be listed in -# the modules index. If set to NO, only the current project's groups will be -# listed. -# The default value is: YES. - -EXTERNAL_GROUPS = YES - -# If the EXTERNAL_PAGES tag is set to YES all external pages will be listed in -# the related pages index. If set to NO, only the current project's pages will -# be listed. -# The default value is: YES. - -EXTERNAL_PAGES = YES - -# The PERL_PATH should be the absolute path and name of the perl script -# interpreter (i.e. the result of 'which perl'). -# The default file (with absolute path) is: /usr/bin/perl. 
- -PERL_PATH = /usr/bin/perl - -#--------------------------------------------------------------------------- -# Configuration options related to the dot tool -#--------------------------------------------------------------------------- - -# If the CLASS_DIAGRAMS tag is set to YES doxygen will generate a class diagram -# (in HTML and LaTeX) for classes with base or super classes. Setting the tag to -# NO turns the diagrams off. Note that this option also works with HAVE_DOT -# disabled, but it is recommended to install and use dot, since it yields more -# powerful graphs. -# The default value is: YES. - -CLASS_DIAGRAMS = YES - -# You can define message sequence charts within doxygen comments using the \msc -# command. Doxygen will then run the mscgen tool (see: -# http://www.mcternan.me.uk/mscgen/)) to produce the chart and insert it in the -# documentation. The MSCGEN_PATH tag allows you to specify the directory where -# the mscgen tool resides. If left empty the tool is assumed to be found in the -# default search path. - -MSCGEN_PATH = - -# You can include diagrams made with dia in doxygen documentation. Doxygen will -# then run dia to produce the diagram and insert it in the documentation. The -# DIA_PATH tag allows you to specify the directory where the dia binary resides. -# If left empty dia is assumed to be found in the default search path. - -DIA_PATH = - -# If set to YES, the inheritance and collaboration graphs will hide inheritance -# and usage relations if the target is undocumented or is not a class. -# The default value is: YES. - -HIDE_UNDOC_RELATIONS = NO - -# If you set the HAVE_DOT tag to YES then doxygen will assume the dot tool is -# available from the path. This tool is part of Graphviz (see: -# http://www.graphviz.org/), a graph visualization toolkit from AT&T and Lucent -# Bell Labs. The other options in this section have no effect if this option is -# set to NO -# The default value is: YES. 
- -HAVE_DOT = NO - -# The DOT_NUM_THREADS specifies the number of dot invocations doxygen is allowed -# to run in parallel. When set to 0 doxygen will base this on the number of -# processors available in the system. You can set it explicitly to a value -# larger than 0 to get control over the balance between CPU load and processing -# speed. -# Minimum value: 0, maximum value: 32, default value: 0. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_NUM_THREADS = 0 - -# When you want a differently looking font in the dot files that doxygen -# generates you can specify the font name using DOT_FONTNAME. You need to make -# sure dot is able to find the font, which can be done by putting it in a -# standard location or by setting the DOTFONTPATH environment variable or by -# setting DOT_FONTPATH to the directory containing the font. -# The default value is: Helvetica. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_FONTNAME = - -# The DOT_FONTSIZE tag can be used to set the size (in points) of the font of -# dot graphs. -# Minimum value: 4, maximum value: 24, default value: 10. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_FONTSIZE = 10 - -# By default doxygen will tell dot to use the default font as specified with -# DOT_FONTNAME. If you specify a different font using DOT_FONTNAME you can set -# the path where dot can find it using this tag. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_FONTPATH = - -# If the CLASS_GRAPH tag is set to YES then doxygen will generate a graph for -# each documented class showing the direct and indirect inheritance relations. -# Setting this tag to YES will force the CLASS_DIAGRAMS tag to NO. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. 
- -CLASS_GRAPH = YES - -# If the COLLABORATION_GRAPH tag is set to YES then doxygen will generate a -# graph for each documented class showing the direct and indirect implementation -# dependencies (inheritance, containment, and class references variables) of the -# class with other documented classes. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -COLLABORATION_GRAPH = YES - -# If the GROUP_GRAPHS tag is set to YES then doxygen will generate a graph for -# groups, showing the direct groups dependencies. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -GROUP_GRAPHS = YES - -# If the UML_LOOK tag is set to YES doxygen will generate inheritance and -# collaboration diagrams in a style similar to the OMG's Unified Modeling -# Language. -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. - -UML_LOOK = NO - -# If the UML_LOOK tag is enabled, the fields and methods are shown inside the -# class node. If there are many fields or methods and many nodes the graph may -# become too big to be useful. The UML_LIMIT_NUM_FIELDS threshold limits the -# number of items for each type to make the size more manageable. Set this to 0 -# for no limit. Note that the threshold may be exceeded by 50% before the limit -# is enforced. So when you set the threshold to 10, up to 15 fields may appear, -# but if the number exceeds 15, the total amount of fields shown is limited to -# 10. -# Minimum value: 0, maximum value: 100, default value: 10. -# This tag requires that the tag HAVE_DOT is set to YES. - -UML_LIMIT_NUM_FIELDS = 10 - -# If the TEMPLATE_RELATIONS tag is set to YES then the inheritance and -# collaboration graphs will show the relations between templates and their -# instances. -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. 
- -TEMPLATE_RELATIONS = NO - -# If the INCLUDE_GRAPH, ENABLE_PREPROCESSING and SEARCH_INCLUDES tags are set to -# YES then doxygen will generate a graph for each documented file showing the -# direct and indirect include dependencies of the file with other documented -# files. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -INCLUDE_GRAPH = YES - -# If the INCLUDED_BY_GRAPH, ENABLE_PREPROCESSING and SEARCH_INCLUDES tags are -# set to YES then doxygen will generate a graph for each documented file showing -# the direct and indirect include dependencies of the file with other documented -# files. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -INCLUDED_BY_GRAPH = YES - -# If the CALL_GRAPH tag is set to YES then doxygen will generate a call -# dependency graph for every global function or class method. -# -# Note that enabling this option will significantly increase the time of a run. -# So in most cases it will be better to enable call graphs for selected -# functions only using the \callgraph command. -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. - -CALL_GRAPH = NO - -# If the CALLER_GRAPH tag is set to YES then doxygen will generate a caller -# dependency graph for every global function or class method. -# -# Note that enabling this option will significantly increase the time of a run. -# So in most cases it will be better to enable caller graphs for selected -# functions only using the \callergraph command. -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. - -CALLER_GRAPH = NO - -# If the GRAPHICAL_HIERARCHY tag is set to YES then doxygen will graphical -# hierarchy of all classes instead of a textual one. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. 
- -GRAPHICAL_HIERARCHY = YES - -# If the DIRECTORY_GRAPH tag is set to YES then doxygen will show the -# dependencies a directory has on other directories in a graphical way. The -# dependency relations are determined by the #include relations between the -# files in the directories. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -DIRECTORY_GRAPH = YES - -# The DOT_IMAGE_FORMAT tag can be used to set the image format of the images -# generated by dot. -# Note: If you choose svg you need to set HTML_FILE_EXTENSION to xhtml in order -# to make the SVG files visible in IE 9+ (other browsers do not have this -# requirement). -# Possible values are: png, png:cairo, png:cairo:cairo, png:cairo:gd, png:gd, -# png:gd:gd, jpg, jpg:cairo, jpg:cairo:gd, jpg:gd, jpg:gd:gd, gif, gif:cairo, -# gif:cairo:gd, gif:gd, gif:gd:gd and svg. -# The default value is: png. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_IMAGE_FORMAT = png - -# If DOT_IMAGE_FORMAT is set to svg, then this option can be set to YES to -# enable generation of interactive SVG images that allow zooming and panning. -# -# Note that this requires a modern browser other than Internet Explorer. Tested -# and working are Firefox, Chrome, Safari, and Opera. -# Note: For IE 9+ you need to set HTML_FILE_EXTENSION to xhtml in order to make -# the SVG files visible. Older versions of IE do not have SVG support. -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. - -INTERACTIVE_SVG = NO - -# The DOT_PATH tag can be used to specify the path where the dot tool can be -# found. If left blank, it is assumed the dot tool can be found in the path. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_PATH = - -# The DOTFILE_DIRS tag can be used to specify one or more directories that -# contain dot files that are included in the documentation (see the \dotfile -# command). 
-# This tag requires that the tag HAVE_DOT is set to YES. - -DOTFILE_DIRS = - -# The MSCFILE_DIRS tag can be used to specify one or more directories that -# contain msc files that are included in the documentation (see the \mscfile -# command). - -MSCFILE_DIRS = - -# The DIAFILE_DIRS tag can be used to specify one or more directories that -# contain dia files that are included in the documentation (see the \diafile -# command). - -DIAFILE_DIRS = - -# When using plantuml, the PLANTUML_JAR_PATH tag should be used to specify the -# path where java can find the plantuml.jar file. If left blank, it is assumed -# PlantUML is not used or called during a preprocessing step. Doxygen will -# generate a warning when it encounters a \startuml command in this case and -# will not generate output for the diagram. -# This tag requires that the tag HAVE_DOT is set to YES. - -PLANTUML_JAR_PATH = - -# The DOT_GRAPH_MAX_NODES tag can be used to set the maximum number of nodes -# that will be shown in the graph. If the number of nodes in a graph becomes -# larger than this value, doxygen will truncate the graph, which is visualized -# by representing a node as a red box. Note that doxygen if the number of direct -# children of the root node in a graph is already larger than -# DOT_GRAPH_MAX_NODES then the graph will not be shown at all. Also note that -# the size of a graph can be further restricted by MAX_DOT_GRAPH_DEPTH. -# Minimum value: 0, maximum value: 10000, default value: 50. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_GRAPH_MAX_NODES = 50 - -# The MAX_DOT_GRAPH_DEPTH tag can be used to set the maximum depth of the graphs -# generated by dot. A depth value of 3 means that only nodes reachable from the -# root by following a path via at most 3 edges will be shown. Nodes that lay -# further from the root node will be omitted. Note that setting this option to 1 -# or 2 may greatly reduce the computation time needed for large code bases. 
Also -# note that the size of a graph can be further restricted by -# DOT_GRAPH_MAX_NODES. Using a depth of 0 means no depth restriction. -# Minimum value: 0, maximum value: 1000, default value: 0. -# This tag requires that the tag HAVE_DOT is set to YES. - -MAX_DOT_GRAPH_DEPTH = 0 - -# Set the DOT_TRANSPARENT tag to YES to generate images with a transparent -# background. This is disabled by default, because dot on Windows does not seem -# to support this out of the box. -# -# Warning: Depending on the platform used, enabling this option may lead to -# badly anti-aliased labels on the edges of a graph (i.e. they become hard to -# read). -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_TRANSPARENT = NO - -# Set the DOT_MULTI_TARGETS tag to YES allow dot to generate multiple output -# files in one run (i.e. multiple -o and -T options on the command line). This -# makes dot run faster, but since only newer versions of dot (>1.8.10) support -# this, this feature is disabled by default. -# The default value is: NO. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_MULTI_TARGETS = NO - -# If the GENERATE_LEGEND tag is set to YES doxygen will generate a legend page -# explaining the meaning of the various boxes and arrows in the dot generated -# graphs. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -GENERATE_LEGEND = YES - -# If the DOT_CLEANUP tag is set to YES doxygen will remove the intermediate dot -# files that are used to generate the various graphs. -# The default value is: YES. -# This tag requires that the tag HAVE_DOT is set to YES. - -DOT_CLEANUP = YES diff --git a/doc/index.rst b/doc/index.rst index 70b9df8..5581f73 100644 --- a/doc/index.rst +++ b/doc/index.rst @@ -1,16 +1,51 @@ -aubio documentation -=================== +Welcome +======= -aubio is a collection of algorithms and tools to label music and sounds. 
It -listens to audio signals and attempts to detect events. For instance, when a -drum is hit, at which frequency is a note, or at what tempo is a rhythmic -melody. +aubio is a collection of algorithms and tools to label and transform music and +sounds. It scans or `listens` to audio signals and attempts to detect musical +events. For instance, when a drum is hit, at which frequency is a note, or at +what tempo is a rhythmic melody. -Its features include segmenting a sound file before each of its attacks, +aubio features include segmenting a sound file before each of its attacks, performing pitch detection, tapping the beat and producing midi streams from live audio. -aubio provide several algorithms and routines, including: +Quick links +=========== + +* :ref:`python` +* :ref:`manpages` +* :ref:`develop` +* :ref:`building` + +.. only:: devel + + .. include:: statuslinks.rst + +Project pages +============= + +* `Project homepage`_: https://aubio.org +* `aubio on github`_: https://github.com/aubio/aubio +* `aubio on pypi`_: https://pypi.python.org/pypi/aubio +* `Doxygen documentation`_: https://aubio.org/doc/latest/ +* `Mailing lists`_: https://lists.aubio.org + +.. _Project homepage: https://aubio.org +.. _aubio on github: https://github.com/aubio/aubio +.. _aubio on pypi: https://pypi.python.org/pypi/aubio +.. _Doxygen documentation: https://aubio.org/doc/latest/ +.. 
_Mailing lists: https://lists.aubio.org/ + +* `Travis Continuous integration page <https://travis-ci.org/aubio/aubio>`_ +* `Appveyor Continuous integration page <https://ci.appveyor.com/project/piem/aubio>`_ +* `Landscape python code validation <https://landscape.io/github/aubio/aubio/master>`_ +* `ReadTheDocs documentation <https://aubio.readthedocs.io/en/latest/>`_ + +Features +======== + +aubio provides several algorithms and routines, including: - several onset detection methods - different pitch detection methods @@ -21,93 +56,21 @@ aubio provide several algorithms and routines, including: - digital filters (low pass, high pass, and more) - spectral filtering - transient/steady-state separation -- sound file and audio devices read and write access +- sound file read and write access - various mathematics utilities for music applications The name aubio comes from *audio* with a typo: some errors are likely to be found in the results. -Python module -------------- - -A python module to access the library functions is also provided. Please see -the file ``python/README`` for more information on how to use it. - -Examples tools --------------- - -A few simple command line tools are included along with the library: - - - ``aubioonset`` outputs the time stamp of detected note onsets - - ``aubiopitch`` attempts to identify a fundamental frequency, or pitch, for - each frame of the input sound - - ``aubiomfcc`` computes Mel-frequency Cepstrum Coefficients - - ``aubiotrack`` outputs the time stamp of detected beats - - ``aubionotes`` emits midi-like notes, with an onset, a pitch, and a duration - - ``aubioquiet`` extracts quiet and loud regions - -Additionally, the python module comes with the following script: - - - ``aubiocut`` slices sound files at onset or beat timestamps - -C API basics ------------- - -The library is written in C and is optimised for speed and portability. - -The C API is designed in the following way: - -.. 
 code-block:: c
-
-   aubio_something_t * new_aubio_something(void * args);
-   audio_something_do(aubio_something_t * t, void * args);
-   smpl_t aubio_something_get_a_parameter(aubio_something_t * t);
-   uint_t aubio_something_set_a_parameter(aubio_something_t * t, smpl_t a_parameter);
-   void del_aubio_something(aubio_something_t * t);
-
-For performance and real-time operation, no memory allocation or freeing take
-place in the ``_do`` methods. Instead, memory allocation should always take place
-in the ``new_`` methods, whereas free operations are done in the ``del_`` methods.
-
-.. code-block:: bash
-
-   ./waf configure
-   ./waf build
-   sudo ./waf install
-
-aubio compiles on Linux, Mac OS X, Cygwin, and iPhone.
-
-Documentation
--------------
-
-- Manual pages: http://aubio.org/documentation
-- API documentation: http://aubio.org/doc/latest/
-
-Contribute
-----------
-
-- Issue Tracker: https://github.com/piem/aubio/issues
-- Source Code: https://github.com/piem/aubio
-
-Contact info
-------------
-
-The home page of this project can be found at: http://aubio.org/
-
-Questions, comments, suggestions, and contributions are welcome. Use the
-mailing list: <aubio-user@aubio.org>.
-
-To subscribe to the list, use the mailman form:
-http://lists.aubio.org/listinfo/aubio-user/
-
-Alternatively, feel free to contact directly the author.
-
-
-Contents
---------
+Content
+=======
 
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 2
 
    installing
    python_module
+   python
+   cli
+   develop
+   about
diff --git a/doc/installing.rst b/doc/installing.rst
index 6629e38..2c10412 100644
--- a/doc/installing.rst
+++ b/doc/installing.rst
@@ -1,65 +1,79 @@
-.. highlight:: bash
-
 Installing aubio
 ================
 
-A number of distributions already include aubio. Check your favorite package
-management system, or have a look at the `download page
-<http://aubio.org/download>`_.
+aubio runs on Linux, Windows, macOS, iOS, Android, and probably a few other
+operating systems. 
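The C API convention documented in the removed index section above — allocate in `new_aubio_<name>`, process in `aubio_<name>_do` without allocating, free in `del_aubio_<name>` — can be illustrated with a self-contained toy object. This is a sketch following the naming convention only; `aubio_something_t` and its gain parameter are hypothetical and not part of the real aubio API:

```c
#include <stdlib.h>

/* Toy object following aubio's new_ / _do / del_ convention.
   Illustrative only: not a real aubio type. */
typedef struct {
  float gain; /* a single parameter, fixed at creation time */
} aubio_something_t;

/* all memory allocation happens in the new_ constructor */
aubio_something_t *new_aubio_something(float gain) {
  aubio_something_t *o = malloc(sizeof(aubio_something_t));
  if (o) o->gain = gain;
  return o;
}

/* _do processes a buffer in place and never allocates,
   which is what makes it safe to call from a real-time audio loop */
void aubio_something_do(aubio_something_t *o, float *buf, unsigned int n) {
  for (unsigned int i = 0; i < n; i++) buf[i] *= o->gain;
}

float aubio_something_get_gain(const aubio_something_t *o) { return o->gain; }

/* del_ releases everything allocated by new_ */
void del_aubio_something(aubio_something_t *o) { free(o); }
```

A caller pairs each `new_aubio_something` with exactly one `del_aubio_something`, and calls `aubio_something_do` once per buffer inside the processing loop.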
-aubio uses `waf <https://waf.io/>`_ to configure, compile, and test the source. -A copy of ``waf`` is included along aubio, so all you need is a ``terminal`` -and a recent ``python`` installed. +Aubio is available as a C library and as a python module. -Source code +Cheat sheet ----------- -Check out the `download page <http://aubio.org/download>`_ for more options: -http://aubio.org/download. - -The latest stable release can be found at http://aubio.org/pub/:: +- :ref:`get aubio latest source code <building>`:: - $ curl -O http://aubio.org/pub/aubio-0.4.1.tar.bz2 - $ tar xf aubio-0.4.1.tar.bz2 - $ cd aubio-0.4.1 + # official repo + git clone https://git.aubio.org/aubio/aubio + # mirror + git clone https://github.com/aubio/aubio + # latest release + wget https://aubio.org/pub/aubio-<version>.tar.gz -The latest develop branch can be obtained with:: - $ git clone git://git.aubio.org/git/aubio/ aubio-devel - $ cd aubio-devel - $ git fetch origin develop:develop - $ git checkout develop +- :ref:`build aubio from source <building>`:: -Compiling ---------- + # 1. simple + cd aubio + make -To compile the C library, examples programs, and tests, run:: + # 2. step by step + ./scripts/get_waf.sh + ./waf configure + ./waf build + sudo ./waf install - $ ./waf configure +- :ref:`install python-aubio from source <python-install>`:: -Check out the available options using ``./waf configure --help | less``. Once -you are done with configuration, you can start building:: + # from git + pip install git+https://git.aubio.org/aubio/aubio/ + # mirror + pip install git+https://github.com/aubio/aubio/ + # from latest release + pip install https://aubio.org/pub/aubio-latest.tar.bz2 + # from pypi + pip install aubio + # from source directory + cd aubio + pip install -v . 
- $ ./waf build +- :ref:`install python-aubio from a pre-compiled binary <python-install>`:: -To install the freshly built C library and tools, simply run the following -command:: + # conda [osx, linux, win] + conda install -c conda-forge aubio + # .deb (debian, ubuntu) [linux] + sudo apt-get install python3-aubio python-aubio aubio-tools + # brew [osx] + brew install aubio --with-python - $ sudo ./waf install +- :ref:`get a pre-compiled version of libaubio <download>`:: -Cleaning --------- + # .deb (linux) WARNING: old version + sudo apt-get install aubio-tools -If you wish to uninstall the files installed by the ``install`` command, use -``uninstall``:: + # python module + ./setup.py install + # using pip + pip install . - $ sudo ./waf uninstall +- :ref:`check the list of optional dependencies <requirements>`:: -To clean the source directory, use the ``clean`` command:: + # debian / ubuntu + dpkg -l libavcodec-dev libavutil-dev libavformat-dev \ + libswresample-dev libavresample-dev \ + libsamplerate-dev libsndfile-dev \ + txt2man doxygen - $ ./waf clean +.. include:: download.rst -To also forget the options previously passed to the last ``./waf configure`` -invocation, use the ``distclean`` command:: +.. include:: building.rst - $ ./waf distclean +.. include:: requirements.rst diff --git a/doc/py_analysis.rst b/doc/py_analysis.rst new file mode 100644 index 0000000..47d7ab1 --- /dev/null +++ b/doc/py_analysis.rst @@ -0,0 +1,15 @@ +.. currentmodule:: aubio +.. default-domain:: py + +Analysis +-------- + +.. members of generated classes are not shown + +.. autoclass:: onset + +.. autoclass:: pitch + +.. autoclass:: tempo + +.. autoclass:: notes diff --git a/doc/py_datatypes.rst b/doc/py_datatypes.rst new file mode 100644 index 0000000..21fad69 --- /dev/null +++ b/doc/py_datatypes.rst @@ -0,0 +1,43 @@ +.. default-domain:: py +.. 
currentmodule:: aubio + +Data-types +---------- + +This section contains the documentation for :data:`float_type`, +:class:`fvec`, and :class:`cvec`. + +.. defined in rst only + +.. data:: float_type + + A string constant describing the floating-point representation used in + :class:`fvec`, :class:`cvec`, and elsewhere in this module. + + Defaults to `"float32"`. + + If `aubio` was built specifically with the option `--enable-double`, this + string will be defined to `"float64"`. See :ref:`py-doubleprecision` in + :ref:`python-install` for more details on building aubio in double + precision mode. + + .. rubric:: Examples + + >>> aubio.float_type + 'float32' + >>> numpy.zeros(10).dtype + 'float64' + >>> aubio.fvec(10).dtype + 'float32' + >>> np.arange(10, dtype=aubio.float_type).dtype + 'float32' + +.. defined in `python/lib/aubio/__init__.py` + +.. autoclass:: fvec + :members: + +.. defined in `python/ext/py-cvec.h` + +.. autoclass:: cvec + :members: diff --git a/doc/py_examples.rst b/doc/py_examples.rst new file mode 100644 index 0000000..cd20901 --- /dev/null +++ b/doc/py_examples.rst @@ -0,0 +1,42 @@ +.. default-domain:: py +.. currentmodule:: aubio + +Examples +-------- + +Below is a short selection of examples using the aubio module. + +Read a sound file +................. + +Here is a simple script, :download:`demo_source_simple.py +<../python/demos/demo_source_simple.py>` that reads all the samples from a +media file using :class:`source`: + +.. literalinclude:: ../python/demos/demo_source_simple.py + :language: python + +Filter a sound file +................... 
+ +Here is another example, :download:`demo_filter.py +<../python/demos/demo_filter.py>`, which applies a filter to a sound file +and writes the filtered signal in another file: + +* read audio samples from a file with :class:`source` + +* filter them using an `A-weighting <https://en.wikipedia.org/wiki/A-weighting>`_ + filter using :class:`digital_filter` + +* write the filtered samples to a new file with :class:`sink`. + +.. literalinclude:: ../python/demos/demo_filter.py + :language: python + +More examples +............. + +For more examples showing how to use other components of the module, see +the `python demos folder`_. + +.. _python demos folder: https://github.com/aubio/aubio/blob/master/python/demos diff --git a/doc/py_io.rst b/doc/py_io.rst new file mode 100644 index 0000000..6be251f --- /dev/null +++ b/doc/py_io.rst @@ -0,0 +1,118 @@ +.. currentmodule:: aubio +.. default-domain:: py + +Input/Output +------------ + +This section contains the documentation for two classes: +:class:`source`, to read audio samples from files, and :class:`sink`, +to write audio samples to disk. + +.. defined in `python/ext` + +.. + Note: __call__ docstrings of objects defined in C must be written + specifically in RST, since there is no known way to add them to + their C implementation. + +.. + TODO: remove special-members documentation + +.. defined in py-source.c + +.. autoclass:: source + :members: + :special-members: __enter__ + :no-special-members: + + .. function:: __call__() + + Read at most `hop_size` new samples from self, return them in + a tuple with the number of samples actually read. + + The returned tuple contains: + + - a vector of shape `(hop_size,)`, filled with the `read` next + samples available, zero-padded if `read < hop_size` + - `read`, an integer indicating the number of samples read + + If opened with more than one channel, the frames will be + down-mixed to produce the new samples. + + :returns: A tuple of one array of samples and one integer. 
+
+        :rtype: (array, int)
+
+        .. seealso:: :meth:`__next__`
+
+        .. rubric:: Example
+
+        >>> src = aubio.source('stereo.wav')
+        >>> while True:
+        ...     samples, read = src()
+        ...     if read < src.hop_size:
+        ...         break
+
+    .. function:: __next__()
+
+        Read at most `hop_size` new frames from self, return them in
+        an array.
+
+        If source was opened with one channel, next(self) returns
+        an array of shape `(read,)`, where `read` is the actual
+        number of frames read (`0 <= read <= hop_size`).
+
+        If `source` was opened with more than one channel, the
+        returned arrays will be of shape `(channels, read)`, where
+        `read` is the actual number of frames read (`0 <= read <=
+        hop_size`).
+
+        :return: An array of frames.
+        :rtype: array
+
+        .. seealso:: :meth:`__call__`
+
+        .. rubric:: Example
+
+        >>> for frames in aubio.source('song.flac'):
+        ...     print(frames.shape)
+
+    .. function:: __iter__()
+
+        Implement iter(self).
+
+        .. seealso:: :meth:`__next__`
+
+    .. function:: __enter__()
+
+        Implement context manager interface. The file will be opened
+        upon entering the context. See `with` statement.
+
+        .. rubric:: Example
+
+        >>> with aubio.source('loop.ogg') as src:
+        ...     src.uri, src.samplerate, src.channels
+
+    .. function:: __exit__()
+
+        Implement context manager interface. The file will be closed
+        before exiting the context. See `with` statement.
+
+        .. seealso:: :meth:`__enter__`
+
+.. py-sink.c
+   TODO: remove special-members documentation
+
+.. autoclass:: aubio.sink
+   :members:
+
+   .. function:: __call__(vec, length)
+
+      Write `length` samples from `vec`.
+
+      :param array vec: input vector to write from
+      :param int length: number of samples to write
+      :example:
+
+      >>> with aubio.sink('foo.wav') as snk:
+      ...     snk(aubio.fvec(1025), 1025)
+
diff --git a/doc/py_spectral.rst b/doc/py_spectral.rst
new file mode 100644
index 0000000..1697041
--- /dev/null
+++ b/doc/py_spectral.rst
@@ -0,0 +1,34 @@
+.. currentmodule:: aubio
+.. 
default-domain:: py + +.. members of generated classes are not yet documented + +Spectral features +----------------- + +This section contains the documentation for: + +- :class:`dct` +- :class:`fft` +- :class:`filterbank` +- :class:`mfcc` +- :class:`pvoc` +- :class:`specdesc` +- :class:`tss` + +.. autoclass:: dct + +.. autoclass:: fft + :members: + +.. autoclass:: filterbank + :members: + +.. autoclass:: mfcc + +.. autoclass:: pvoc + :members: + +.. autoclass:: specdesc + +.. autoclass:: tss diff --git a/doc/py_synth.rst b/doc/py_synth.rst new file mode 100644 index 0000000..d7afb24 --- /dev/null +++ b/doc/py_synth.rst @@ -0,0 +1,9 @@ +.. currentmodule:: aubio +.. default-domain:: py + +Synthesis +--------- + +.. autoclass:: sampler + +.. autoclass:: wavetable diff --git a/doc/py_temporal.rst b/doc/py_temporal.rst new file mode 100644 index 0000000..9a0bb39 --- /dev/null +++ b/doc/py_temporal.rst @@ -0,0 +1,8 @@ +.. currentmodule:: aubio +.. default-domain:: py + +Digital filters +--------------- + +.. autoclass:: digital_filter + :members: diff --git a/doc/py_utils.rst b/doc/py_utils.rst new file mode 100644 index 0000000..94debf8 --- /dev/null +++ b/doc/py_utils.rst @@ -0,0 +1,84 @@ +.. default-domain:: py +.. currentmodule:: aubio + +Utilities +--------- + +This section documents various helper functions included in the aubio library. + +Note name conversion +.................... + +.. midiconv.py + +.. autofunction:: note2midi + +.. autofunction:: midi2note + +.. autofunction:: freq2note + +.. autofunction:: note2freq + +Frequency conversion +.................... + +.. python/ext/ufuncs.c + +.. autofunction:: freqtomidi + +.. autofunction:: miditofreq + +.. python/ext/py-musicutils.h + +.. autofunction:: meltohz + +.. autofunction:: hztomel + +.. python/ext/aubiomodule.c + +.. autofunction:: bintomidi +.. autofunction:: miditobin +.. autofunction:: bintofreq +.. autofunction:: freqtobin + +Audio file slicing +.................. + +.. slicing.py + +.. 
autofunction:: slice_source_at_stamps + +Windowing +......... + +.. python/ext/py-musicutils.h + +.. autofunction:: window + +Audio level detection +..................... + +.. python/ext/py-musicutils.h + +.. autofunction:: level_lin +.. autofunction:: db_spl +.. autofunction:: silence_detection +.. autofunction:: level_detection + +Vector utilities +................ + +.. python/ext/aubiomodule.c + +.. autofunction:: alpha_norm +.. autofunction:: zero_crossing_rate +.. autofunction:: min_removal + +.. python/ext/py-musicutils.h + +.. autofunction:: shift +.. autofunction:: ishift + +.. python/ext/ufuncs.c + +.. autofunction:: unwrap2pi diff --git a/doc/python.rst b/doc/python.rst new file mode 100644 index 0000000..cc91244 --- /dev/null +++ b/doc/python.rst @@ -0,0 +1,59 @@ +.. make sure our default-domain is python here +.. default-domain:: py + +.. set current module +.. currentmodule:: aubio + +.. + we follow numpy type docstrings, see: + https://numpydoc.readthedocs.io/en/latest/format.html#docstring-standard +.. + note: we do not import aubio's docstring, which will be displayed from an + interpreter. + +.. .. automodule:: aubio + + +.. _python: + +Python documentation +==================== + +This module provides a number of classes and functions for the analysis of +music and audio signals. + +Contents +-------- + +.. toctree:: + :maxdepth: 1 + + py_datatypes + py_io + py_temporal + py_spectral + py_analysis + py_synth + py_utils + py_examples + +Introduction +------------ + +This document provides a reference guide. For documentation on how to +install aubio, see :ref:`python-install`. + +Examples included in this guide and within the code are written assuming +both `aubio` and `numpy`_ have been imported: + +.. code-block:: python + + >>> import aubio + >>> import numpy as np + +`Changed in 0.4.8` : Prior to this version, almost no documentation was +provided with the python module. 
This version adds documentation for some +classes, including :class:`fvec`, :class:`cvec`, :class:`source`, and +:class:`sink`. + +.. _numpy: https://www.numpy.org diff --git a/doc/python_module.rst b/doc/python_module.rst index eab5cd4..cd04f18 100644 --- a/doc/python_module.rst +++ b/doc/python_module.rst @@ -1,33 +1,103 @@ -aubio Python module -=================== +.. _python-install: -Building the module ------------------- +Installing aubio for Python +=========================== -From ``aubio`` source directory, run the following: +aubio is available as a package for Python 2.7 and Python 3. The aubio +extension is written in C using the `Python/C`_ and the `Numpy/C`_ APIs. -.. code-block:: bash +.. _Python/C: https://docs.python.org/c-api/index.html +.. _Numpy/C: https://docs.scipy.org/doc/numpy/reference/c-api.html - $ cd python - $ ./setup.py build - $ sudo ./setup.py install +For general documentation on how to install Python packages, see `Installing +Packages`_. -Using the module +Installing aubio with pip +------------------------- + +aubio can be installed from `PyPI`_ using ``pip``: + +.. code-block:: console + + $ pip install aubio + +See also `Installing from PyPI`_ for general documentation. + +.. note:: + + aubio is currently a `source only`_ package, so you will need a compiler to + install it from `PyPI`_. See also `Installing aubio with conda`_ for + pre-compiled binaries. + +.. _PyPI: https://pypi.python.org/pypi/aubio +.. _Installing Packages: https://packaging.python.org/tutorials/installing-packages/ +.. _Installing from PyPI: https://packaging.python.org/tutorials/installing-packages/#installing-from-pypi +.. _source only: https://packaging.python.org/tutorials/installing-packages/#source-distributions-vs-wheels + +Installing aubio with conda +--------------------------- + +`Conda packages`_ are available through the `conda-forge`_ channel for Linux, +macOS, and Windows: + +.. 
code-block:: console + + $ conda config --add channels conda-forge + $ conda install -c conda-forge aubio + +.. _Conda packages: https://anaconda.org/conda-forge/aubio +.. _conda-forge: https://conda-forge.org/ + +.. _py-doubleprecision: + +Double precision ---------------- -To use the python module, simply import aubio: +This module can be compiled in double-precision mode, in which case the +default type for floating-point samples will be 64-bit. The default is +single precision mode (32-bit, recommended). + +To build the aubio module with double precision, use the option +`--enable-double` of the `build_ext` subcommand: + +.. code:: bash + + $ ./setup.py clean + $ ./setup.py build_ext --enable-double + $ pip install -v . + +**Note**: If linking against `libaubio`, make sure the library was also +compiled in :ref:`doubleprecision` mode. + + +Checking your installation +-------------------------- + +Once the python module is installed, its version can be checked with: + +.. code-block:: console + + $ python -c "import aubio; print(aubio.version, aubio.float_type)" + +The command line `aubio` is also installed: + +.. code-block:: console + + $ aubio -h + -.. code-block:: python +Python tests +------------ - #! /usr/bin/env python - import aubio +A number of Python tests are provided in the `python/tests`_ folder. To run +them, install `pytest`_ and run it from the aubio source directory: - s = aubio.source(sys.argv[1], 0, 256) - while True: - samples, read = s() - print samples - if read < 256: break +.. code-block:: console -Check out the `python demos for aubio -<https://github.com/piem/aubio/blob/develop/python/demos/>`_ for more examples. + $ pip install pytest + $ git clone https://git.aubio.org/aubio/aubio + $ cd aubio + $ pytest +.. _python/tests: https://github.com/aubio/aubio/blob/master/python/tests +.. 
_pytest: https://pytest.org diff --git a/doc/requirements.rst b/doc/requirements.rst new file mode 100644 index 0000000..fcad2d1 --- /dev/null +++ b/doc/requirements.rst @@ -0,0 +1,373 @@ +.. _requirements: + +Build options +============= + +If built without any external dependencies, aubio can be somewhat useful, for +instance to read, process, and write simple wav files. + +To support more media input formats and add more features to aubio, you can use +one or all of the following `external libraries`_. + +You may also want to know more about the `other options`_ and the `platform +notes`_. + +The configure script will automatically check for these extra libraries. To make sure +a library or feature is used, pass the corresponding `--enable-<feature>` option to waf. To disable +a feature, use `--disable-<feature>`. + +To find out more about the build commands, use the `--verbose` option. + +External libraries +------------------ + +External libraries are checked for using ``pkg-config``. Set the +``PKG_CONFIG_PATH`` environment variable if you have them installed in an +unusual location. + + +.. note:: + + If ``pkg-config`` is not found in ``PATH``, the configure step will + succeed, but none of the external libraries will be used. + +Media libraries +--------------- + +libav +..... + + `libav.org <https://libav.org/>`_, open source audio and video processing + tools. + +If all of the following libraries are found, they will be used to compile +``aubio_source_avcodec``, so that ``aubio_source`` will be able to decode audio +from all formats supported by `libav +<https://libav.org/documentation/general.html#Audio-Codecs>`_. + +* libavcodec +* libavformat +* libavutil +* libavresample + +To enable this option, configure with ``--enable-avcodec``. The build will then +fail if the required libraries are not found. To disable this option, +configure with ``--disable-avcodec``. + + +libsndfile +.......... 
+ + `libsndfile <http://www.mega-nerd.com/libsndfile/>`_, a C library for reading + and writing sampled sound files. + +With libsndfile built in, ``aubio_source_sndfile`` will be built in and used by +``aubio_source``. + +To enable this option, configure with ``--enable-sndfile``. The build will then +fail if the required library is not found. To disable this option, configure +with ``--disable-sndfile``. + +libsamplerate +............. + + `libsamplerate <http://www.mega-nerd.com/SRC/>`_, a sample rate converter for + audio. + +With libsamplerate built in, ``aubio_source_sndfile`` will support resampling, +and ``aubio_resample`` will be fully functional. + +To enable this option, configure with ``--enable-samplerate``. The build will +then fail if the required library is not found. To disable this option, +configure with ``--disable-samplerate``. + +Optimisation libraries +---------------------- + +libfftw3 +........ + + `FFTW <http://fftw.org/>`_, a C subroutine library for computing the discrete Fourier + transform. + +With libfftw3 built in, ``aubio_fft`` will use `FFTW`_ to +compute the Fast Fourier Transform (FFT), allowing aubio to compute FFTs on lengths +that are not a power of 2. + +To enable this option, configure with ``--enable-fftw3``. The build will +then fail if the required library is not found. To disable this option, +configure with ``--disable-fftw3``. + +blas +.... + +On macOS/iOS, `blas +<https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms>`_ routines are made +available through the Accelerate framework. + +On Linux, they can be enabled with ``--enable-blas``. On Debian (etch), +`atlas`_, `openblas`_, and `libblas`_ have been successfully tested. + +When enabled, ``waf`` will check for the current blas configuration by running +``pkg-config --libs blas``. Depending on the library path returned by +``pkg-config``, different headers will be searched for. + +.. 
note:: + + On Debian systems, `multiple versions of BLAS and LAPACK + <https://wiki.debian.org/DebianScience/LinearAlgebraLibraries>`_ can be + installed. To configure which libblas is being used: + + .. code-block:: console + + $ sudo update-alternatives --config libblas.so + +.. + Expected pkg-config output for each alternative: + /usr/lib/atlas-base/atlas/libblas.so + -L/usr/lib/atlas-base/atlas -lblas + /usr/lib/openblas-base/libblas.so + -L/usr/lib/openblas-base -lblas + /usr/lib/libblas/libblas.so + -lblas + +atlas +..... + +`ATLAS BLAS APIs <http://math-atlas.sourceforge.net/>`_ will be used if the path +returned by ``pkg-config --libs blas`` contains ``atlas``. + +.. + ``<atlas/cblas.h>`` will be included. + +Example: + +.. code-block:: console + + $ pkg-config --libs blas + -L/usr/lib/atlas-base/atlas -lblas + $ ./waf configure --enable-atlas + [...] + Checking for 'blas' : yes + Checking for header atlas/cblas.h : yes + +openblas +........ + +`OpenBlas libraries <https://www.openblas.net/>`_ will be used when the output +of ``pkg-config --libs blas`` contains ``openblas``. + +.. + ``<openblas/cblas.h>`` will be included. + +Example: + +.. code-block:: console + + $ pkg-config --libs blas + -L/usr/lib/openblas-base -lblas + $ ./waf configure --enable-blas + [...] + Checking for 'blas' : yes + Checking for header openblas/cblas.h : yes + +libblas +....... + +`Netlib's libblas (LAPACK) <https://www.netlib.org/lapack/>`_ will be used if +no specific library path is specified by ``pkg-config``. + +.. + ``<cblas.h>`` will be included. + +Example: + +.. code-block:: console + + $ pkg-config --libs blas + -lblas + $ ./waf configure --enable-blas + [...] + Checking for 'blas' : yes + Checking for header cblas.h : yes + + +Platform notes +-------------- + +On all platforms, you will need to have installed: + + - a compiler (gcc, clang, msvc, ...) + - python (any version >= 2.7, including 3.x) + - a terminal to run command lines in + +Linux +..... 
+ +The following `External libraries`_ will be used if found: `libav`_, +`libsamplerate`_, `libsndfile`_, `libfftw3`_. + +macOS +..... + +The following system frameworks will be used on Mac OS X systems: + + - `Accelerate <https://developer.apple.com/reference/accelerate>`_ to compute + FFTs and other vectorized operations optimally. + + - `CoreAudio <https://developer.apple.com/reference/coreaudio>`_ and + `AudioToolbox <https://developer.apple.com/reference/audiotoolbox>`_ to + decode audio from files and network streams. + +.. note:: + + To build a fat binary for both ``i386`` and ``x86_64``, use ``./waf configure + --enable-fat``. + +The following `External libraries`_ will also be checked: `libav`_, +`libsamplerate`_, `libsndfile`_, `libfftw3`_. + +To build a fat binary on Darwin-like platforms (macOS, tvOS, appleOS, ...), +configure with ``--enable-fat``. + +Windows +....... + +To use a specific version of the compiler, use ``--msvc_version``. To build for a +specific architecture, use ``--msvc_target``. For instance, to build aubio +for ``x86`` using ``msvc 12.0``, use: + +.. code:: bash + + waf configure --msvc_version='msvc 12.0' --msvc_target='x86' + + +The following `External libraries`_ will be used if found: `libav`_, +`libsamplerate`_, `libsndfile`_, `libfftw3`_. + +iOS +... + +The following system frameworks will be used on iOS and the iOS Simulator: + + - `Accelerate <https://developer.apple.com/reference/accelerate>`_ to compute + FFTs and other vectorized operations optimally. + + - `CoreAudio <https://developer.apple.com/reference/coreaudio>`_ and + `AudioToolbox <https://developer.apple.com/reference/audiotoolbox>`_ to + decode audio from files and network streams. + +To build aubio for iOS, configure with ``--with-target-platform=ios``. For the +iOS Simulator, use ``--with-target-platform=iosimulator`` instead. + +By default, aubio is built with the following flags on iOS: + +.. 
code:: bash + + CFLAGS="-fembed-bitcode -arch arm64 -arch armv7 -arch armv7s -miphoneos-version-min=6.1" + +and on iOS Simulator: + +.. code:: + + CFLAGS="-arch i386 -arch x86_64 -mios-simulator-version-min=6.1" + +Set ``CFLAGS`` and ``LINKFLAGS`` to change these default values, or edit +``wscript`` directly. + +Other options +------------- + +Some additional options can be passed to the configure step. For the complete +list of options, run: + +.. code:: bash + + $ ./waf --help + +Here is an example of a custom command: + +.. code:: bash + + $ ./waf --verbose configure build install \ + --enable-avcodec --enable-wavread --disable-wavwrite \ + --enable-sndfile --enable-samplerate --enable-docs \ + --destdir $PWD/build/destdir --testcmd="echo %s" \ + --prefix=/opt --libdir=/opt/lib/multiarch \ + --manpagesdir=/opt/share/man \ + uninstall clean distclean dist distcheck + +.. _doubleprecision: + +Double precision +................ + +The datatype used to store real numbers in aubio is named `smpl_t`. By default, +`smpl_t` is defined as `float`, a `single-precision format +<https://en.wikipedia.org/wiki/Single-precision_floating-point_format>`_ +(32-bit). Some algorithms require a floating point representation with a +higher precision, for instance to prevent arithmetic underflow in recursive +filters. In aubio, these special samples are named `lsmp_t` and defined as +`double` by default (64-bit). + +Sometimes it may be useful to compile aubio in `double-precision`, for instance +to reproduce numerical results obtained with 64-bit routines. In this case, +`smpl_t` will be defined as `double`. + +The following table shows how `smpl_t` and `lsmp_t` are defined in single- and +double-precision modes: + +.. list-table:: Single and double-precision modes + :align: center + + * - + - single + - double + * - `smpl_t` + - ``float`` + - ``double`` + * - `lsmp_t` + - ``double`` + - ``long double`` + +To compile aubio in double precision mode, configure with ``--enable-double``. 
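The trade-off between the two modes can be illustrated without aubio at all. The short NumPy sketch below (illustrative only; the variable names are made up for this example) shows why some recursive routines use the wider `lsmp_t`: an increment smaller than the 32-bit machine epsilon is silently lost by a `float` accumulator, while a `double` accumulator keeps it.

.. code-block:: python

    import numpy as np

    # Why lsmp_t (double) exists alongside smpl_t (float): adding a value
    # below float32's machine epsilon (~1.2e-7) to 1.0 changes nothing in
    # 32-bit arithmetic, but is retained in 64-bit arithmetic.
    eps = 1e-8

    lost_in_single = np.float32(1.0) + np.float32(eps) == np.float32(1.0)
    kept_in_double = np.float64(1.0) + np.float64(eps) != np.float64(1.0)

    print(lost_in_single, kept_in_double)  # True True

After a double-precision build, ``python -c "import aubio; print(aubio.float_type)"`` should report the 64-bit sample type.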
+ +To compile in single-precision mode (default), use ``--disable-double`` (or +simply neither of these two options). + +Disabling the tests +................... + +In some cases, for instance when cross-compiling, unit tests should not be run. +Option ``--notests`` can be used for this purpose. The tests will not be +executed, but the binaries will be compiled, ensuring that linking against +libaubio works as expected. + +.. note:: + + The ``--notests`` option should be passed to both ``build`` and ``install`` + targets, otherwise waf will try to run them. + +Edit wscript +............ + +Many of the options are gathered in the file `wscript`, a good starting point +when looking for additional options. + +.. _build_docs: + +Building the docs +----------------- + +If the following command line tools are found, the documentation will be +built: + + - `doxygen <http://doxygen.org>`_ to build the :ref:`doxygen-documentation`. + - `txt2man <https://github.com/mvertes/txt2man>`_ to build the :ref:`manpages`. + - `sphinx <http://sphinx-doc.org>`_ to build this document. + +These tools are searched for in the current ``PATH`` environment variable. +By default, the documentation is built only if the tools are found. + +To disable the documentation, configure with ``--disable-docs``. To build with +the documentation, configure with ``--enable-docs``. diff --git a/doc/statuslinks.rst b/doc/statuslinks.rst new file mode 100644 index 0000000..5be7ffb --- /dev/null +++ b/doc/statuslinks.rst @@ -0,0 +1,24 @@ +Current status +============== + +.. image:: https://travis-ci.org/aubio/aubio.svg?branch=master + :target: https://travis-ci.org/aubio/aubio + :alt: Travis build status + +.. image:: https://ci.appveyor.com/api/projects/status/f3lhy3a57rkgn5yi?svg=true + :target: https://ci.appveyor.com/project/piem/aubio/ + :alt: Appveyor build status + +.. 
image:: https://landscape.io/github/aubio/aubio/master/landscape.svg?style=flat + :target: https://landscape.io/github/aubio/aubio/master + :alt: Landscape code health + +.. image:: https://readthedocs.org/projects/aubio/badge/?version=latest + :target: https://aubio.readthedocs.io/en/latest/?badge=latest + :alt: Documentation status + +.. image:: https://img.shields.io/github/commits-since/aubio/aubio/latest.svg + :target: https://github.com/aubio/aubio + :alt: Commits since last release + + diff --git a/doc/web.cfg b/doc/web.cfg index 7497775..97951c7 100644 --- a/doc/web.cfg +++ b/doc/web.cfg @@ -1,4 +1,4 @@ -# Doxyfile 1.8.8 +# Doxyfile 1.8.13 # This file describes the settings to be used by the documentation system # doxygen (www.doxygen.org) for a project. @@ -38,7 +38,7 @@ PROJECT_NAME = aubio # could be handy for archiving the generated documentation or if some version # control system is used. -PROJECT_NUMBER = "0.4.2~alpha" +PROJECT_NUMBER = "latest" # Using the PROJECT_BRIEF tag one can provide an optional one line description # for a project that appears at the top of each page and should give viewer a @@ -46,10 +46,10 @@ PROJECT_NUMBER = "0.4.2~alpha" PROJECT_BRIEF = -# With the PROJECT_LOGO tag one can specify an logo or icon that is included in -# the documentation. The maximum height of the logo should not exceed 55 pixels -# and the maximum width should not exceed 200 pixels. Doxygen will copy the logo -# to the output directory. +# With the PROJECT_LOGO tag one can specify a logo or an icon that is included +# in the documentation. The maximum height of the logo should not exceed 55 +# pixels and the maximum width should not exceed 200 pixels. Doxygen will copy +# the logo to the output directory. 
PROJECT_LOGO = @@ -60,7 +60,7 @@ PROJECT_LOGO = OUTPUT_DIRECTORY = web -# If the CREATE_SUBDIRS tag is set to YES, then doxygen will create 4096 sub- +# If the CREATE_SUBDIRS tag is set to YES then doxygen will create 4096 sub- # directories (in 2 levels) under the output directory of each output format and # will distribute the generated files over these directories. Enabling this # option can be useful when feeding doxygen a huge amount of source files, where @@ -93,14 +93,14 @@ ALLOW_UNICODE_NAMES = NO OUTPUT_LANGUAGE = English -# If the BRIEF_MEMBER_DESC tag is set to YES doxygen will include brief member +# If the BRIEF_MEMBER_DESC tag is set to YES, doxygen will include brief member # descriptions after the members that are listed in the file and class # documentation (similar to Javadoc). Set to NO to disable this. # The default value is: YES. BRIEF_MEMBER_DESC = YES -# If the REPEAT_BRIEF tag is set to YES doxygen will prepend the brief +# If the REPEAT_BRIEF tag is set to YES, doxygen will prepend the brief # description of a member or function before the detailed description # # Note: If both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the @@ -135,7 +135,7 @@ ALWAYS_DETAILED_SEC = NO INLINE_INHERITED_MEMB = NO -# If the FULL_PATH_NAMES tag is set to YES doxygen will prepend the full path +# If the FULL_PATH_NAMES tag is set to YES, doxygen will prepend the full path # before files name in the file list and in the header files. If set to NO the # shortest path that makes the file name unique will be used # The default value is: YES. @@ -205,9 +205,9 @@ MULTILINE_CPP_IS_BRIEF = NO INHERIT_DOCS = YES -# If the SEPARATE_MEMBER_PAGES tag is set to YES, then doxygen will produce a -# new page for each member. If set to NO, the documentation of a member will be -# part of the file/class/namespace that contains it. +# If the SEPARATE_MEMBER_PAGES tag is set to YES then doxygen will produce a new +# page for each member. 
If set to NO, the documentation of a member will be part +# of the file/class/namespace that contains it. # The default value is: NO. SEPARATE_MEMBER_PAGES = NO @@ -276,7 +276,7 @@ OPTIMIZE_OUTPUT_VHDL = NO # instance to make doxygen treat .inc files as Fortran files (default is PHP), # and .f files as C (default is Fortran), use: inc=Fortran f=C. # -# Note For files without extension you can use no_extension as a placeholder. +# Note: For files without extension you can use no_extension as a placeholder. # # Note that for custom extensions you also need to set FILE_PATTERNS otherwise # the files are not read by doxygen. @@ -293,10 +293,19 @@ EXTENSION_MAPPING = MARKDOWN_SUPPORT = YES +# When the TOC_INCLUDE_HEADINGS tag is set to a non-zero value, all headings up +# to that level are automatically included in the table of contents, even if +# they do not have an id attribute. +# Note: This feature currently applies only to Markdown headings. +# Minimum value: 0, maximum value: 99, default value: 0. +# This tag requires that the tag MARKDOWN_SUPPORT is set to YES. + +TOC_INCLUDE_HEADINGS = 0 + # When enabled doxygen tries to link words that correspond to documented # classes, or namespaces to their corresponding documentation. Such a link can -# be prevented in individual cases by by putting a % sign in front of the word -# or globally by setting AUTOLINK_SUPPORT to NO. +# be prevented in individual cases by putting a % sign in front of the word or +# globally by setting AUTOLINK_SUPPORT to NO. # The default value is: YES. AUTOLINK_SUPPORT = YES @@ -336,13 +345,20 @@ SIP_SUPPORT = NO IDL_PROPERTY_SUPPORT = YES # If member grouping is used in the documentation and the DISTRIBUTE_GROUP_DOC -# tag is set to YES, then doxygen will reuse the documentation of the first +# tag is set to YES then doxygen will reuse the documentation of the first # member in the group (if any) for the other members of the group. 
By default # all members of a group must be documented explicitly. # The default value is: NO. DISTRIBUTE_GROUP_DOC = NO +# If one adds a struct or class to a group and this option is enabled, then also +# any nested class or struct is added to the same group. By default this option +# is disabled and one has to add nested compounds explicitly via \ingroup. +# The default value is: NO. + +GROUP_NESTED_COMPOUNDS = NO + # Set the SUBGROUPING tag to YES to allow class member groups of the same type # (for instance a group of public functions) to be put as a subgroup of that # type (e.g. under the Public Functions section). Set it to NO to prevent @@ -401,7 +417,7 @@ LOOKUP_CACHE_SIZE = 0 # Build related configuration options #--------------------------------------------------------------------------- -# If the EXTRACT_ALL tag is set to YES doxygen will assume all entities in +# If the EXTRACT_ALL tag is set to YES, doxygen will assume all entities in # documentation are documented, even if no documentation was available. Private # class members and static file members will be hidden unless the # EXTRACT_PRIVATE respectively EXTRACT_STATIC tags are set to YES. @@ -411,35 +427,35 @@ LOOKUP_CACHE_SIZE = 0 EXTRACT_ALL = NO -# If the EXTRACT_PRIVATE tag is set to YES all private members of a class will +# If the EXTRACT_PRIVATE tag is set to YES, all private members of a class will # be included in the documentation. # The default value is: NO. EXTRACT_PRIVATE = NO -# If the EXTRACT_PACKAGE tag is set to YES all members with package or internal +# If the EXTRACT_PACKAGE tag is set to YES, all members with package or internal # scope will be included in the documentation. # The default value is: NO. EXTRACT_PACKAGE = NO -# If the EXTRACT_STATIC tag is set to YES all static members of a file will be +# If the EXTRACT_STATIC tag is set to YES, all static members of a file will be # included in the documentation. # The default value is: NO. 
EXTRACT_STATIC = NO -# If the EXTRACT_LOCAL_CLASSES tag is set to YES classes (and structs) defined -# locally in source files will be included in the documentation. If set to NO +# If the EXTRACT_LOCAL_CLASSES tag is set to YES, classes (and structs) defined +# locally in source files will be included in the documentation. If set to NO, # only classes defined in header files are included. Does not have any effect # for Java sources. # The default value is: YES. EXTRACT_LOCAL_CLASSES = YES -# This flag is only useful for Objective-C code. When set to YES local methods, +# This flag is only useful for Objective-C code. If set to YES, local methods, # which are defined in the implementation section but not in the interface are -# included in the documentation. If set to NO only methods in the interface are +# included in the documentation. If set to NO, only methods in the interface are # included. # The default value is: NO. @@ -464,21 +480,21 @@ HIDE_UNDOC_MEMBERS = NO # If the HIDE_UNDOC_CLASSES tag is set to YES, doxygen will hide all # undocumented classes that are normally visible in the class hierarchy. If set -# to NO these classes will be included in the various overviews. This option has -# no effect if EXTRACT_ALL is enabled. +# to NO, these classes will be included in the various overviews. This option +# has no effect if EXTRACT_ALL is enabled. # The default value is: NO. HIDE_UNDOC_CLASSES = NO # If the HIDE_FRIEND_COMPOUNDS tag is set to YES, doxygen will hide all friend -# (class|struct|union) declarations. If set to NO these declarations will be +# (class|struct|union) declarations. If set to NO, these declarations will be # included in the documentation. # The default value is: NO. HIDE_FRIEND_COMPOUNDS = NO # If the HIDE_IN_BODY_DOCS tag is set to YES, doxygen will hide any -# documentation blocks found inside the body of a function. If set to NO these +# documentation blocks found inside the body of a function. 
If set to NO, these # blocks will be appended to the function's detailed documentation block. # The default value is: NO. @@ -492,7 +508,7 @@ HIDE_IN_BODY_DOCS = NO INTERNAL_DOCS = NO # If the CASE_SENSE_NAMES tag is set to NO then doxygen will only generate file -# names in lower-case letters. If set to YES upper-case letters are also +# names in lower-case letters. If set to YES, upper-case letters are also # allowed. This is useful if you have classes or files whose names only differ # in case and if your file system supports case sensitive file names. Windows # and Mac users are advised to set this option to NO. @@ -501,12 +517,19 @@ INTERNAL_DOCS = NO CASE_SENSE_NAMES = NO # If the HIDE_SCOPE_NAMES tag is set to NO then doxygen will show members with -# their full class and namespace scopes in the documentation. If set to YES the +# their full class and namespace scopes in the documentation. If set to YES, the # scope will be hidden. # The default value is: NO. HIDE_SCOPE_NAMES = NO +# If the HIDE_COMPOUND_REFERENCE tag is set to NO (default) then doxygen will +# append additional text to a page's title, such as Class Reference. If set to +# YES the compound reference will be hidden. +# The default value is: NO. + +HIDE_COMPOUND_REFERENCE= NO + # If the SHOW_INCLUDE_FILES tag is set to YES then doxygen will put a list of # the files that are included by a file in the documentation of that file. # The default value is: YES. @@ -534,14 +557,14 @@ INLINE_INFO = YES # If the SORT_MEMBER_DOCS tag is set to YES then doxygen will sort the # (detailed) documentation of file and class members alphabetically by member -# name. If set to NO the members will appear in declaration order. +# name. If set to NO, the members will appear in declaration order. # The default value is: YES. SORT_MEMBER_DOCS = YES # If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the brief # descriptions of file, namespace and class members alphabetically by member -# name. 
If set to NO the members will appear in declaration order. Note that +# name. If set to NO, the members will appear in declaration order. Note that # this will also influence the order of the classes in the class list. # The default value is: NO. @@ -586,27 +609,25 @@ SORT_BY_SCOPE_NAME = NO STRICT_PROTO_MATCHING = NO -# The GENERATE_TODOLIST tag can be used to enable ( YES) or disable ( NO) the -# todo list. This list is created by putting \todo commands in the -# documentation. +# The GENERATE_TODOLIST tag can be used to enable (YES) or disable (NO) the todo +# list. This list is created by putting \todo commands in the documentation. # The default value is: YES. GENERATE_TODOLIST = YES -# The GENERATE_TESTLIST tag can be used to enable ( YES) or disable ( NO) the -# test list. This list is created by putting \test commands in the -# documentation. +# The GENERATE_TESTLIST tag can be used to enable (YES) or disable (NO) the test +# list. This list is created by putting \test commands in the documentation. # The default value is: YES. GENERATE_TESTLIST = YES -# The GENERATE_BUGLIST tag can be used to enable ( YES) or disable ( NO) the bug +# The GENERATE_BUGLIST tag can be used to enable (YES) or disable (NO) the bug # list. This list is created by putting \bug commands in the documentation. # The default value is: YES. GENERATE_BUGLIST = YES -# The GENERATE_DEPRECATEDLIST tag can be used to enable ( YES) or disable ( NO) +# The GENERATE_DEPRECATEDLIST tag can be used to enable (YES) or disable (NO) # the deprecated list. This list is created by putting \deprecated commands in # the documentation. # The default value is: YES. @@ -631,8 +652,8 @@ ENABLED_SECTIONS = MAX_INITIALIZER_LINES = 30 # Set the SHOW_USED_FILES tag to NO to disable the list of files generated at -# the bottom of the documentation of classes and structs. If set to YES the list -# will mention the files that were used to generate the documentation. 
+# the bottom of the documentation of classes and structs. If set to YES, the +# list will mention the files that were used to generate the documentation. # The default value is: YES. SHOW_USED_FILES = YES @@ -696,7 +717,7 @@ CITE_BIB_FILES = QUIET = NO # The WARNINGS tag can be used to turn on/off the warning messages that are -# generated to standard error ( stderr) by doxygen. If WARNINGS is set to YES +# generated to standard error (stderr) by doxygen. If WARNINGS is set to YES # this implies that the warnings are on. # # Tip: Turn warnings on while writing the documentation. @@ -704,7 +725,7 @@ QUIET = NO WARNINGS = YES -# If the WARN_IF_UNDOCUMENTED tag is set to YES, then doxygen will generate +# If the WARN_IF_UNDOCUMENTED tag is set to YES then doxygen will generate # warnings for undocumented members. If EXTRACT_ALL is set to YES then this flag # will automatically be disabled. # The default value is: YES. @@ -721,12 +742,18 @@ WARN_IF_DOC_ERROR = YES # This WARN_NO_PARAMDOC option can be enabled to get warnings for functions that # are documented, but have no documentation for their parameters or return -# value. If set to NO doxygen will only warn about wrong or incomplete parameter -# documentation, but not about the absence of documentation. +# value. If set to NO, doxygen will only warn about wrong or incomplete +# parameter documentation, but not about the absence of documentation. # The default value is: NO. WARN_NO_PARAMDOC = NO +# If the WARN_AS_ERROR tag is set to YES then doxygen will immediately stop when +# a warning is encountered. +# The default value is: NO. + +WARN_AS_ERROR = NO + # The WARN_FORMAT tag determines the format of the warning messages that doxygen # can produce. 
The string should contain the $file, $line, and $text tags, which # will be replaced by the file and line number from which the warning originated @@ -750,7 +777,7 @@ WARN_LOGFILE = # The INPUT tag is used to specify the files and/or directories that contain # documented source files. You may enter file names like myfile.cpp or # directories like /usr/src/myproject. Separate the files or directories with -# spaces. +# spaces. See also FILE_PATTERNS and EXTENSION_MAPPING # Note: If this tag is empty the current directory is searched. INPUT = ../src @@ -766,12 +793,17 @@ INPUT_ENCODING = UTF-8 # If the value of the INPUT tag contains directories, you can use the # FILE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp and -# *.h) to filter out the source-files in the directories. If left blank the -# following patterns are tested:*.c, *.cc, *.cxx, *.cpp, *.c++, *.java, *.ii, -# *.ixx, *.ipp, *.i++, *.inl, *.idl, *.ddl, *.odl, *.h, *.hh, *.hxx, *.hpp, -# *.h++, *.cs, *.d, *.php, *.php4, *.php5, *.phtml, *.inc, *.m, *.markdown, -# *.md, *.mm, *.dox, *.py, *.f90, *.f, *.for, *.tcl, *.vhd, *.vhdl, *.ucf, -# *.qsf, *.as and *.js. +# *.h) to filter out the source-files in the directories. +# +# Note that for custom extensions or not directly supported extensions you also +# need to set EXTENSION_MAPPING for the extension otherwise the files are not +# read by doxygen. +# +# If left blank the following patterns are tested:*.c, *.cc, *.cxx, *.cpp, +# *.c++, *.java, *.ii, *.ixx, *.ipp, *.i++, *.inl, *.idl, *.ddl, *.odl, *.h, +# *.hh, *.hxx, *.hpp, *.h++, *.cs, *.d, *.php, *.php4, *.php5, *.phtml, *.inc, +# *.m, *.markdown, *.md, *.mm, *.dox, *.py, *.pyw, *.f90, *.f95, *.f03, *.f08, +# *.f, *.for, *.tcl, *.vhd, *.vhdl, *.ucf and *.qsf. 
FILE_PATTERNS = *.h @@ -790,6 +822,7 @@ RECURSIVE = YES EXCLUDE = ../src/aubio_priv.h \ ../src/mathutils.h \ + ../src/io/ioutils.h \ ../src/io/audio_unit.h \ ../src/io/source_sndfile.h \ ../src/io/source_apple_audio.h \ @@ -802,6 +835,7 @@ EXCLUDE = ../src/aubio_priv.h \ ../src/pitch/pitchmcomb.h \ ../src/pitch/pitchyin.h \ ../src/pitch/pitchyinfft.h \ + ../src/pitch/pitchyinfast.h \ ../src/pitch/pitchschmitt.h \ ../src/pitch/pitchfcomb.h \ ../src/pitch/pitchspecacf.h \ @@ -878,6 +912,10 @@ IMAGE_PATH = # Note that the filter must not add or remove lines; it is applied before the # code is scanned, but not when the output code is generated. If lines are added # or removed, the anchors will not be placed correctly. +# +# Note that for custom extensions or not directly supported extensions you also +# need to set EXTENSION_MAPPING for the extension otherwise the files are not +# properly processed by doxygen. INPUT_FILTER = @@ -887,11 +925,15 @@ INPUT_FILTER = # (like *.cpp=my_cpp_filter). See INPUT_FILTER for further information on how # filters are used. If the FILTER_PATTERNS tag is empty or if none of the # patterns match the file name, INPUT_FILTER is applied. +# +# Note that for custom extensions or not directly supported extensions you also +# need to set EXTENSION_MAPPING for the extension otherwise the files are not +# properly processed by doxygen. FILTER_PATTERNS = # If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using -# INPUT_FILTER ) will also be used to filter the input files that are used for +# INPUT_FILTER) will also be used to filter the input files that are used for # producing the source files to browse (i.e. when SOURCE_BROWSER is set to YES). # The default value is: NO. 
@@ -951,7 +993,7 @@ REFERENCED_BY_RELATION = YES REFERENCES_RELATION = YES # If the REFERENCES_LINK_SOURCE tag is set to YES and SOURCE_BROWSER tag is set -# to YES, then the hyperlinks from functions in REFERENCES_RELATION and +# to YES then the hyperlinks from functions in REFERENCES_RELATION and # REFERENCED_BY_RELATION lists will link to the source code. Otherwise they will # link to the documentation. # The default value is: YES. @@ -998,13 +1040,13 @@ USE_HTAGS = NO VERBATIM_HEADERS = YES -# If the CLANG_ASSISTED_PARSING tag is set to YES, then doxygen will use the +# If the CLANG_ASSISTED_PARSING tag is set to YES then doxygen will use the # clang parser (see: http://clang.llvm.org/) for more accurate parsing at the # cost of reduced performance. This can be particularly helpful with template # rich C++ code for which doxygen's built-in parser lacks the necessary type # information. # Note: The availability of this option depends on whether or not doxygen was -# compiled with the --with-libclang option. +# generated with the -Duse-libclang=ON option for CMake. # The default value is: NO. CLANG_ASSISTED_PARSING = NO @@ -1047,7 +1089,7 @@ IGNORE_PREFIX = # Configuration options related to the HTML output #--------------------------------------------------------------------------- -# If the GENERATE_HTML tag is set to YES doxygen will generate HTML output +# If the GENERATE_HTML tag is set to YES, doxygen will generate HTML output # The default value is: YES. GENERATE_HTML = YES @@ -1113,10 +1155,10 @@ HTML_STYLESHEET = # cascading style sheets that are included after the standard style sheets # created by doxygen. Using this option one can overrule certain style aspects. # This is preferred over using HTML_STYLESHEET since it does not replace the -# standard style sheet and is therefor more robust against future updates. +# standard style sheet and is therefore more robust against future updates. 
# Doxygen will copy the style sheet files to the output directory. -# Note: The order of the extra stylesheet files is of importance (e.g. the last -# stylesheet in the list overrules the setting of the previous ones in the +# Note: The order of the extra style sheet files is of importance (e.g. the last +# style sheet in the list overrules the setting of the previous ones in the # list). For an example see the documentation. # This tag requires that the tag GENERATE_HTML is set to YES. @@ -1133,7 +1175,7 @@ HTML_EXTRA_STYLESHEET = HTML_EXTRA_FILES = # The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen -# will adjust the colors in the stylesheet and background images according to +# will adjust the colors in the style sheet and background images according to # this color. Hue is specified as an angle on a colorwheel, see # http://en.wikipedia.org/wiki/Hue for more information. For instance the value # 0 represents red, 60 is yellow, 120 is green, 180 is cyan, 240 is blue, 300 @@ -1164,8 +1206,9 @@ HTML_COLORSTYLE_GAMMA = 80 # If the HTML_TIMESTAMP tag is set to YES then the footer of each generated HTML # page will contain the date and time when the page was generated. Setting this -# to NO can help when comparing the output of multiple runs. -# The default value is: YES. +# to YES can help to show when doxygen was last run and thus if the +# documentation is up to date. +# The default value is: NO. # This tag requires that the tag GENERATE_HTML is set to YES. HTML_TIMESTAMP = NO @@ -1261,28 +1304,28 @@ GENERATE_HTMLHELP = NO CHM_FILE = # The HHC_LOCATION tag can be used to specify the location (absolute path -# including file name) of the HTML help compiler ( hhc.exe). If non-empty +# including file name) of the HTML help compiler (hhc.exe). If non-empty, # doxygen will try to run the HTML help compiler on the generated index.hhp. # The file has to be specified with full path. 
# This tag requires that the tag GENERATE_HTMLHELP is set to YES. HHC_LOCATION = -# The GENERATE_CHI flag controls if a separate .chi index file is generated ( -# YES) or that it should be included in the master .chm file ( NO). +# The GENERATE_CHI flag controls if a separate .chi index file is generated +# (YES) or that it should be included in the master .chm file (NO). # The default value is: NO. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. GENERATE_CHI = NO -# The CHM_INDEX_ENCODING is used to encode HtmlHelp index ( hhk), content ( hhc) +# The CHM_INDEX_ENCODING is used to encode HtmlHelp index (hhk), content (hhc) # and project file content. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. CHM_INDEX_ENCODING = -# The BINARY_TOC flag controls whether a binary table of contents is generated ( -# YES) or a normal table of contents ( NO) in the .chm file. Furthermore it +# The BINARY_TOC flag controls whether a binary table of contents is generated +# (YES) or a normal table of contents (NO) in the .chm file. Furthermore it # enables the Previous and Next buttons. # The default value is: NO. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. @@ -1396,7 +1439,7 @@ DISABLE_INDEX = NO # index structure (just like the one that is generated for HTML Help). For this # to work a browser that supports JavaScript, DHTML, CSS and frames is required # (i.e. any modern browser). Windows users are probably better off using the -# HTML help feature. Via custom stylesheets (see HTML_EXTRA_STYLESHEET) one can +# HTML help feature. Via custom style sheets (see HTML_EXTRA_STYLESHEET) one can # further fine-tune the look of the index. As an example, the default style # sheet generated by doxygen has an example that shows how to put an image at # the root of the tree instead of the PROJECT_NAME. 
Since the tree basically has @@ -1424,7 +1467,7 @@ ENUM_VALUES_PER_LINE = 4 TREEVIEW_WIDTH = 250 -# When the EXT_LINKS_IN_WINDOW option is set to YES doxygen will open links to +# If the EXT_LINKS_IN_WINDOW option is set to YES, doxygen will open links to # external symbols imported via tag files in a separate window. # The default value is: NO. # This tag requires that the tag GENERATE_HTML is set to YES. @@ -1453,7 +1496,7 @@ FORMULA_TRANSPARENT = YES # Enable the USE_MATHJAX option to render LaTeX formulas using MathJax (see # http://www.mathjax.org) which uses client side Javascript for the rendering -# instead of using prerendered bitmaps. Use this if you do not have LaTeX +# instead of using pre-rendered bitmaps. Use this if you do not have LaTeX # installed or if you want to formulas look prettier in the HTML output. When # enabled you may also need to install MathJax separately and configure the path # to it using the MATHJAX_RELPATH option. @@ -1539,7 +1582,7 @@ SERVER_BASED_SEARCH = NO # external search engine pointed to by the SEARCHENGINE_URL option to obtain the # search results. # -# Doxygen ships with an example indexer ( doxyindexer) and search engine +# Doxygen ships with an example indexer (doxyindexer) and search engine # (doxysearch.cgi) which are based on the open source search engine library # Xapian (see: http://xapian.org/). # @@ -1552,7 +1595,7 @@ EXTERNAL_SEARCH = NO # The SEARCHENGINE_URL should point to a search engine hosted by a web server # which will return the search results when EXTERNAL_SEARCH is enabled. # -# Doxygen ships with an example indexer ( doxyindexer) and search engine +# Doxygen ships with an example indexer (doxyindexer) and search engine # (doxysearch.cgi) which are based on the open source search engine library # Xapian (see: http://xapian.org/). See the section "External Indexing and # Searching" for details. 
@@ -1590,7 +1633,7 @@ EXTRA_SEARCH_MAPPINGS = # Configuration options related to the LaTeX output #--------------------------------------------------------------------------- -# If the GENERATE_LATEX tag is set to YES doxygen will generate LaTeX output. +# If the GENERATE_LATEX tag is set to YES, doxygen will generate LaTeX output. # The default value is: YES. GENERATE_LATEX = NO @@ -1621,7 +1664,7 @@ LATEX_CMD_NAME = latex MAKEINDEX_CMD_NAME = makeindex -# If the COMPACT_LATEX tag is set to YES doxygen generates more compact LaTeX +# If the COMPACT_LATEX tag is set to YES, doxygen generates more compact LaTeX # documents. This may be useful for small projects and may help to save some # trees in general. # The default value is: NO. @@ -1639,9 +1682,12 @@ COMPACT_LATEX = NO PAPER_TYPE = a4 # The EXTRA_PACKAGES tag can be used to specify one or more LaTeX package names -# that should be included in the LaTeX output. To get the times font for -# instance you can specify -# EXTRA_PACKAGES=times +# that should be included in the LaTeX output. The package can be specified just +# by its name or with the correct syntax as to be used with the LaTeX +# \usepackage command. To get the times font for instance you can specify : +# EXTRA_PACKAGES=times or EXTRA_PACKAGES={times} +# To use the option intlimits with the amsmath package you can specify: +# EXTRA_PACKAGES=[intlimits]{amsmath} # If left blank no extra packages will be included. # This tag requires that the tag GENERATE_LATEX is set to YES. @@ -1656,9 +1702,9 @@ EXTRA_PACKAGES = # Note: Only use a user-defined header if you know what you are doing! The # following commands have a special meaning inside the header: $title, # $datetime, $date, $doxygenversion, $projectname, $projectnumber, -# $projectbrief, $projectlogo. Doxygen will replace $title with the empy string, -# for the replacement values of the other commands the user is refered to -# HTML_HEADER. +# $projectbrief, $projectlogo. 
Doxygen will replace $title with the empty +# string, for the replacement values of the other commands the user is referred +# to HTML_HEADER. # This tag requires that the tag GENERATE_LATEX is set to YES. LATEX_HEADER = @@ -1674,6 +1720,17 @@ LATEX_HEADER = LATEX_FOOTER = +# The LATEX_EXTRA_STYLESHEET tag can be used to specify additional user-defined +# LaTeX style sheets that are included after the standard style sheets created +# by doxygen. Using this option one can overrule certain style aspects. Doxygen +# will copy the style sheet files to the output directory. +# Note: The order of the extra style sheet files is of importance (e.g. the last +# style sheet in the list overrules the setting of the previous ones in the +# list). +# This tag requires that the tag GENERATE_LATEX is set to YES. + +LATEX_EXTRA_STYLESHEET = + # The LATEX_EXTRA_FILES tag can be used to specify one or more extra images or # other source files which should be copied to the LATEX_OUTPUT output # directory. Note that the files will be copied as-is; there are no commands or @@ -1692,7 +1749,7 @@ LATEX_EXTRA_FILES = PDF_HYPERLINKS = YES # If the USE_PDFLATEX tag is set to YES, doxygen will use pdflatex to generate -# the PDF file directly from the LaTeX files. Set this option to YES to get a +# the PDF file directly from the LaTeX files. Set this option to YES, to get a # higher quality PDF documentation. # The default value is: YES. # This tag requires that the tag GENERATE_LATEX is set to YES. @@ -1733,11 +1790,19 @@ LATEX_SOURCE_CODE = NO LATEX_BIB_STYLE = plain +# If the LATEX_TIMESTAMP tag is set to YES then the footer of each generated +# page will contain the date and time when the page was generated. Setting this +# to NO can help when comparing the output of multiple runs. +# The default value is: NO. +# This tag requires that the tag GENERATE_LATEX is set to YES. 
+ +LATEX_TIMESTAMP = NO + #--------------------------------------------------------------------------- # Configuration options related to the RTF output #--------------------------------------------------------------------------- -# If the GENERATE_RTF tag is set to YES doxygen will generate RTF output. The +# If the GENERATE_RTF tag is set to YES, doxygen will generate RTF output. The # RTF output is optimized for Word 97 and may not look too pretty with other RTF # readers/editors. # The default value is: NO. @@ -1752,7 +1817,7 @@ GENERATE_RTF = NO RTF_OUTPUT = rtf -# If the COMPACT_RTF tag is set to YES doxygen generates more compact RTF +# If the COMPACT_RTF tag is set to YES, doxygen generates more compact RTF # documents. This may be useful for small projects and may help to save some # trees in general. # The default value is: NO. @@ -1789,11 +1854,21 @@ RTF_STYLESHEET_FILE = RTF_EXTENSIONS_FILE = +# If the RTF_SOURCE_CODE tag is set to YES then doxygen will include source code +# with syntax highlighting in the RTF output. +# +# Note that which sources are shown also depends on other settings such as +# SOURCE_BROWSER. +# The default value is: NO. +# This tag requires that the tag GENERATE_RTF is set to YES. + +RTF_SOURCE_CODE = NO + #--------------------------------------------------------------------------- # Configuration options related to the man page output #--------------------------------------------------------------------------- -# If the GENERATE_MAN tag is set to YES doxygen will generate man pages for +# If the GENERATE_MAN tag is set to YES, doxygen will generate man pages for # classes and files. # The default value is: NO. 
@@ -1837,7 +1912,7 @@ MAN_LINKS = NO # Configuration options related to the XML output #--------------------------------------------------------------------------- -# If the GENERATE_XML tag is set to YES doxygen will generate an XML file that +# If the GENERATE_XML tag is set to YES, doxygen will generate an XML file that # captures the structure of the code including all documentation. # The default value is: NO. @@ -1851,7 +1926,7 @@ GENERATE_XML = NO XML_OUTPUT = xml -# If the XML_PROGRAMLISTING tag is set to YES doxygen will dump the program +# If the XML_PROGRAMLISTING tag is set to YES, doxygen will dump the program # listings (including syntax highlighting and cross-referencing information) to # the XML output. Note that enabling this will significantly increase the size # of the XML output. @@ -1864,7 +1939,7 @@ XML_PROGRAMLISTING = YES # Configuration options related to the DOCBOOK output #--------------------------------------------------------------------------- -# If the GENERATE_DOCBOOK tag is set to YES doxygen will generate Docbook files +# If the GENERATE_DOCBOOK tag is set to YES, doxygen will generate Docbook files # that can be used to generate PDF. # The default value is: NO. @@ -1878,7 +1953,7 @@ GENERATE_DOCBOOK = NO DOCBOOK_OUTPUT = docbook -# If the DOCBOOK_PROGRAMLISTING tag is set to YES doxygen will include the +# If the DOCBOOK_PROGRAMLISTING tag is set to YES, doxygen will include the # program listings (including syntax highlighting and cross-referencing # information) to the DOCBOOK output. Note that enabling this will significantly # increase the size of the DOCBOOK output. 
@@ -1891,10 +1966,10 @@ DOCBOOK_PROGRAMLISTING = NO # Configuration options for the AutoGen Definitions output #--------------------------------------------------------------------------- -# If the GENERATE_AUTOGEN_DEF tag is set to YES doxygen will generate an AutoGen -# Definitions (see http://autogen.sf.net) file that captures the structure of -# the code including all documentation. Note that this feature is still -# experimental and incomplete at the moment. +# If the GENERATE_AUTOGEN_DEF tag is set to YES, doxygen will generate an +# AutoGen Definitions (see http://autogen.sf.net) file that captures the +# structure of the code including all documentation. Note that this feature is +# still experimental and incomplete at the moment. # The default value is: NO. GENERATE_AUTOGEN_DEF = NO @@ -1903,7 +1978,7 @@ GENERATE_AUTOGEN_DEF = NO # Configuration options related to the Perl module output #--------------------------------------------------------------------------- -# If the GENERATE_PERLMOD tag is set to YES doxygen will generate a Perl module +# If the GENERATE_PERLMOD tag is set to YES, doxygen will generate a Perl module # file that captures the structure of the code including all documentation. # # Note that this feature is still experimental and incomplete at the moment. @@ -1911,7 +1986,7 @@ GENERATE_AUTOGEN_DEF = NO GENERATE_PERLMOD = NO -# If the PERLMOD_LATEX tag is set to YES doxygen will generate the necessary +# If the PERLMOD_LATEX tag is set to YES, doxygen will generate the necessary # Makefile rules, Perl scripts and LaTeX code to be able to generate PDF and DVI # output from the Perl module output. # The default value is: NO. @@ -1919,9 +1994,9 @@ GENERATE_PERLMOD = NO PERLMOD_LATEX = NO -# If the PERLMOD_PRETTY tag is set to YES the Perl module output will be nicely +# If the PERLMOD_PRETTY tag is set to YES, the Perl module output will be nicely # formatted so it can be parsed by a human reader. 
This is useful if you want to -# understand what is going on. On the other hand, if this tag is set to NO the +# understand what is going on. On the other hand, if this tag is set to NO, the # size of the Perl module output will be much smaller and Perl will parse it # just the same. # The default value is: YES. @@ -1941,14 +2016,14 @@ PERLMOD_MAKEVAR_PREFIX = # Configuration options related to the preprocessor #--------------------------------------------------------------------------- -# If the ENABLE_PREPROCESSING tag is set to YES doxygen will evaluate all +# If the ENABLE_PREPROCESSING tag is set to YES, doxygen will evaluate all # C-preprocessor directives found in the sources and include files. # The default value is: YES. ENABLE_PREPROCESSING = YES -# If the MACRO_EXPANSION tag is set to YES doxygen will expand all macro names -# in the source code. If set to NO only conditional compilation will be +# If the MACRO_EXPANSION tag is set to YES, doxygen will expand all macro names +# in the source code. If set to NO, only conditional compilation will be # performed. Macro expansion can be done in a controlled way by setting # EXPAND_ONLY_PREDEF to YES. # The default value is: NO. @@ -1964,7 +2039,7 @@ MACRO_EXPANSION = NO EXPAND_ONLY_PREDEF = NO -# If the SEARCH_INCLUDES tag is set to YES the includes files in the +# If the SEARCH_INCLUDES tag is set to YES, the include files in the # INCLUDE_PATH will be searched if a #include is found. # The default value is: YES. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. @@ -2040,20 +2115,21 @@ TAGFILES = GENERATE_TAGFILE = -# If the ALLEXTERNALS tag is set to YES all external class will be listed in the -# class index. If set to NO only the inherited external classes will be listed. +# If the ALLEXTERNALS tag is set to YES, all external class will be listed in +# the class index. If set to NO, only the inherited external classes will be +# listed. # The default value is: NO. 
ALLEXTERNALS = NO -# If the EXTERNAL_GROUPS tag is set to YES all external groups will be listed in -# the modules index. If set to NO, only the current project's groups will be +# If the EXTERNAL_GROUPS tag is set to YES, all external groups will be listed +# in the modules index. If set to NO, only the current project's groups will be # listed. # The default value is: YES. EXTERNAL_GROUPS = YES -# If the EXTERNAL_PAGES tag is set to YES all external pages will be listed in +# If the EXTERNAL_PAGES tag is set to YES, all external pages will be listed in # the related pages index. If set to NO, only the current project's pages will # be listed. # The default value is: YES. @@ -2070,7 +2146,7 @@ PERL_PATH = /usr/bin/perl # Configuration options related to the dot tool #--------------------------------------------------------------------------- -# If the CLASS_DIAGRAMS tag is set to YES doxygen will generate a class diagram +# If the CLASS_DIAGRAMS tag is set to YES, doxygen will generate a class diagram # (in HTML and LaTeX) for classes with base or super classes. Setting the tag to # NO turns the diagrams off. Note that this option also works with HAVE_DOT # disabled, but it is recommended to install and use dot, since it yields more @@ -2095,7 +2171,7 @@ MSCGEN_PATH = DIA_PATH = -# If set to YES, the inheritance and collaboration graphs will hide inheritance +# If set to YES the inheritance and collaboration graphs will hide inheritance # and usage relations if the target is undocumented or is not a class. # The default value is: YES. @@ -2168,7 +2244,7 @@ COLLABORATION_GRAPH = YES GROUP_GRAPHS = YES -# If the UML_LOOK tag is set to YES doxygen will generate inheritance and +# If the UML_LOOK tag is set to YES, doxygen will generate inheritance and # collaboration diagrams in a style similar to the OMG's Unified Modeling # Language. # The default value is: NO. 
@@ -2220,7 +2296,8 @@ INCLUDED_BY_GRAPH = YES # # Note that enabling this option will significantly increase the time of a run. # So in most cases it will be better to enable call graphs for selected -# functions only using the \callgraph command. +# functions only using the \callgraph command. Disabling a call graph can be +# accomplished by means of the command \hidecallgraph. # The default value is: NO. # This tag requires that the tag HAVE_DOT is set to YES. @@ -2231,7 +2308,8 @@ CALL_GRAPH = NO # # Note that enabling this option will significantly increase the time of a run. # So in most cases it will be better to enable caller graphs for selected -# functions only using the \callergraph command. +# functions only using the \callergraph command. Disabling a caller graph can be +# accomplished by means of the command \hidecallergraph. # The default value is: NO. # This tag requires that the tag HAVE_DOT is set to YES. @@ -2254,13 +2332,17 @@ GRAPHICAL_HIERARCHY = YES DIRECTORY_GRAPH = YES # The DOT_IMAGE_FORMAT tag can be used to set the image format of the images -# generated by dot. +# generated by dot. For an explanation of the image formats see the section +# output formats in the documentation of the dot tool (Graphviz (see: +# http://www.graphviz.org/)). # Note: If you choose svg you need to set HTML_FILE_EXTENSION to xhtml in order # to make the SVG files visible in IE 9+ (other browsers do not have this # requirement). # Possible values are: png, png:cairo, png:cairo:cairo, png:cairo:gd, png:gd, # png:gd:gd, jpg, jpg:cairo, jpg:cairo:gd, jpg:gd, jpg:gd:gd, gif, gif:cairo, -# gif:cairo:gd, gif:gd, gif:gd:gd and svg. +# gif:cairo:gd, gif:gd, gif:gd:gd, svg, png:gd, png:gd:gd, png:cairo, +# png:cairo:gd, png:cairo:cairo, png:cairo:gdiplus, png:gdiplus and +# png:gdiplus:gdiplus. # The default value is: png. # This tag requires that the tag HAVE_DOT is set to YES. 
@@ -2308,10 +2390,19 @@ DIAFILE_DIRS = # PlantUML is not used or called during a preprocessing step. Doxygen will # generate a warning when it encounters a \startuml command in this case and # will not generate output for the diagram. -# This tag requires that the tag HAVE_DOT is set to YES. PLANTUML_JAR_PATH = +# When using plantuml, the PLANTUML_CFG_FILE tag can be used to specify a +# configuration file for plantuml. + +PLANTUML_CFG_FILE = + +# When using plantuml, the specified paths are searched for files specified by +# the !include statement in a plantuml block. + +PLANTUML_INCLUDE_PATH = + # The DOT_GRAPH_MAX_NODES tag can be used to set the maximum number of nodes # that will be shown in the graph. If the number of nodes in a graph becomes # larger than this value, doxygen will truncate the graph, which is visualized @@ -2348,7 +2439,7 @@ MAX_DOT_GRAPH_DEPTH = 0 DOT_TRANSPARENT = NO -# Set the DOT_MULTI_TARGETS tag to YES allow dot to generate multiple output +# Set the DOT_MULTI_TARGETS tag to YES to allow dot to generate multiple output # files in one run (i.e. multiple -o and -T options on the command line). This # makes dot run faster, but since only newer versions of dot (>1.8.10) support # this, this feature is disabled by default. @@ -2365,7 +2456,7 @@ DOT_MULTI_TARGETS = NO GENERATE_LEGEND = YES -# If the DOT_CLEANUP tag is set to YES doxygen will remove the intermediate dot +# If the DOT_CLEANUP tag is set to YES, doxygen will remove the intermediate dot # files that are used to generate the various graphs. # The default value is: YES. # This tag requires that the tag HAVE_DOT is set to YES. diff --git a/doc/xcode_frameworks.rst b/doc/xcode_frameworks.rst new file mode 100644 index 0000000..af06c6c --- /dev/null +++ b/doc/xcode_frameworks.rst @@ -0,0 +1,72 @@ +Frameworks for Xcode +-------------------- + +`Binary frameworks`_ are available and ready to use in your XCode project, for +`iOS`_ and `macOS`_. + +#. 
Download and extract the corresponding ``framework.zip`` file from the `Download`_ page
+
+#. Select **Build Phases** in your project setting and unfold **Link Binary with Libraries**
+
+#. Add *AudioToolbox* and *Accelerate* system frameworks (or make sure they are listed)
+
+#. Add ``aubio.framework`` from the unzipped ``framework.zip``
+
+#. Include the aubio header in your code:
+
+   * in C/C++:
+
+   .. code-block:: c
+
+       #include <aubio/aubio.h>
+
+   * in Obj-C:
+
+   .. code-block:: obj-c
+
+       #import <aubio/aubio.h>
+
+   * in Swift:
+
+   .. code-block:: swift
+
+       import aubio
+
+Using aubio from Swift
+----------------------
+
+Once you have downloaded and installed :ref:`aubio.framework
+<xcode-frameworks-label>`, you should be able to use aubio from C, Obj-C, and
+Swift source files.
+
+Here is a short example showing how to read a sound file in Swift:
+
+  .. code-block:: swift
+
+    import aubio
+
+    let path = Bundle.main.path(forResource: "example", ofType: "mp4")
+    if (path != nil) {
+        let hop_size : uint_t = 512
+        let a = new_fvec(hop_size)
+        let b = new_aubio_source(path, 0, hop_size)
+        var read: uint_t = 0
+        var total_frames : uint_t = 0
+        while (true) {
+            aubio_source_do(b, a, &read)
+            total_frames += read
+            if (read < hop_size) { break }
+        }
+        print("read", total_frames, "frames at", aubio_source_get_samplerate(b), "Hz")
+        del_aubio_source(b)
+        del_fvec(a)
+    } else {
+        print("could not find file")
+    }
+
+.. _Binary frameworks: https://aubio.org/download
+.. _iOS: https://aubio.org/download#ios
+..
_macOS: https://aubio.org/download#osx diff --git a/examples/aubiomfcc.c b/examples/aubiomfcc.c index d8bb910..f333d63 100644 --- a/examples/aubiomfcc.c +++ b/examples/aubiomfcc.c @@ -48,6 +48,7 @@ void process_print (void) } int main(int argc, char **argv) { + int ret = 0; // change some default params buffer_size = 512; hop_size = 256; @@ -62,15 +63,19 @@ int main(int argc, char **argv) { fftgrain = new_cvec (buffer_size); mfcc = new_aubio_mfcc(buffer_size, n_filters, n_coefs, samplerate); mfcc_out = new_fvec(n_coefs); + if (pv == NULL || fftgrain == NULL || mfcc == NULL || mfcc_out == NULL) { + ret = 1; + goto beach; + } - examples_common_process((aubio_process_func_t)process_block, process_print); + examples_common_process(process_block, process_print); del_aubio_pvoc (pv); del_cvec (fftgrain); del_aubio_mfcc(mfcc); del_fvec(mfcc_out); +beach: examples_common_del(); - return 0; + return ret; } - diff --git a/examples/aubionotes.c b/examples/aubionotes.c index 7541098..0969774 100644 --- a/examples/aubionotes.c +++ b/examples/aubionotes.c @@ -21,6 +21,8 @@ #include "utils.h" #define PROG_HAS_PITCH 1 #define PROG_HAS_ONSET 1 +#define PROG_HAS_NOTES 1 +#define PROG_HAS_SILENCE 1 #define PROG_HAS_JACK 1 // TODO add PROG_HAS_OUTPUT #include "parse_args.h" @@ -33,12 +35,12 @@ void process_block (fvec_t *ibuf, fvec_t *obuf) aubio_notes_do (notes, ibuf, obuf); // did we get a note off? if (obuf->data[2] != 0) { - lastmidi = aubio_freqtomidi (obuf->data[2]) + .5; + lastmidi = obuf->data[2]; send_noteon(lastmidi, 0); } // did we get a note on? 
if (obuf->data[0] != 0) { - lastmidi = aubio_freqtomidi (obuf->data[0]) + .5; + lastmidi = obuf->data[0]; send_noteon(lastmidi, obuf->data[1]); } } @@ -49,6 +51,8 @@ void process_print (void) } int main(int argc, char **argv) { + int ret = 0; + examples_common_init(argc,argv); verbmsg ("using source: %s at %dHz\n", source_uri, samplerate); @@ -64,15 +68,38 @@ int main(int argc, char **argv) { verbmsg ("tolerance: %f\n", pitch_tolerance); notes = new_aubio_notes ("default", buffer_size, hop_size, samplerate); + if (notes == NULL) { ret = 1; goto beach; } + + if (onset_minioi != 0.) { + aubio_notes_set_minioi_ms(notes, onset_minioi); + } + if (onset_threshold != 0.) { + errmsg ("warning: onset threshold not supported yet\n"); + //aubio_onset_set_threshold(aubio_notes_get_aubio_onset(o), onset_threshold); + } + if (silence_threshold != -90.) { + if (aubio_notes_set_silence (notes, silence_threshold) != 0) { + errmsg ("failed setting notes silence threshold to %.2f\n", + silence_threshold); + } + } + if (release_drop != 10.) 
{ + if (aubio_notes_set_release_drop (notes, release_drop) != 0) { + errmsg ("failed setting notes release drop to %.2f\n", + release_drop); + } + } - examples_common_process((aubio_process_func_t)process_block, process_print); + examples_common_process(process_block, process_print); - // send a last note off - send_noteon (lastmidi, 0); + // send a last note off if required + if (lastmidi) { + send_noteon (lastmidi, 0); + } del_aubio_notes (notes); +beach: examples_common_del(); - return 0; + return ret; } - diff --git a/examples/aubioonset.c b/examples/aubioonset.c index 69e7bc9..ca3496b 100644 --- a/examples/aubioonset.c +++ b/examples/aubioonset.c @@ -21,6 +21,7 @@ #include "utils.h" #define PROG_HAS_ONSET 1 #define PROG_HAS_OUTPUT 1 +#define PROG_HAS_SILENCE 1 #define PROG_HAS_JACK 1 #include "parse_args.h" @@ -42,10 +43,11 @@ void process_block(fvec_t *ibuf, fvec_t *obuf) } else { aubio_wavetable_stop ( wavetable ); } - if (mix_input) + if (mix_input) { aubio_wavetable_do (wavetable, ibuf, obuf); - else + } else { aubio_wavetable_do (wavetable, obuf, obuf); + } } void process_print (void) @@ -57,20 +59,26 @@ void process_print (void) } int main(int argc, char **argv) { + int ret = 0; examples_common_init(argc,argv); - verbmsg ("using source: %s at %dHz\n", source_uri, samplerate); - verbmsg ("onset method: %s, ", onset_method); - verbmsg ("buffer_size: %d, ", buffer_size); - verbmsg ("hop_size: %d, ", hop_size); - verbmsg ("silence: %f, ", silence_threshold); - verbmsg ("threshold: %f\n", onset_threshold); - o = new_aubio_onset (onset_method, buffer_size, hop_size, samplerate); + if (o == NULL) { ret = 1; goto beach; } if (onset_threshold != 0.) aubio_onset_set_threshold (o, onset_threshold); if (silence_threshold != -90.) aubio_onset_set_silence (o, silence_threshold); + if (onset_minioi != 0.) 
+ aubio_onset_set_minioi_s (o, onset_minioi); + + verbmsg ("using source: %s at %dHz\n", source_uri, samplerate); + verbmsg ("onset method: %s, ", onset_method); + verbmsg ("buffer_size: %d, ", buffer_size); + verbmsg ("hop_size: %d, ", hop_size); + verbmsg ("silence: %f, ", aubio_onset_get_silence(o)); + verbmsg ("threshold: %f, ", aubio_onset_get_threshold(o)); + verbmsg ("awhitening: %f, ", aubio_onset_get_awhitening(o)); + verbmsg ("compression: %f\n", aubio_onset_get_compression(o)); onset = new_fvec (1); @@ -78,15 +86,18 @@ int main(int argc, char **argv) { aubio_wavetable_set_freq ( wavetable, 2450.); //aubio_sampler_load (sampler, "/archives/sounds/woodblock.aiff"); - examples_common_process((aubio_process_func_t)process_block, process_print); + examples_common_process(process_block, process_print); // send a last note off - send_noteon (miditap_note, 0); + if (usejack) { + send_noteon (miditap_note, 0); + } del_aubio_onset (o); del_aubio_wavetable (wavetable); del_fvec (onset); +beach: examples_common_del(); - return 0; + return ret; } diff --git a/examples/aubiopitch.c b/examples/aubiopitch.c index bdda950..3a3d37e 100644 --- a/examples/aubiopitch.c +++ b/examples/aubiopitch.c @@ -21,6 +21,7 @@ #include "utils.h" #define PROG_HAS_PITCH 1 #define PROG_HAS_OUTPUT 1 +#define PROG_HAS_SILENCE 1 #define PROG_HAS_JACK 1 #include "parse_args.h" @@ -51,6 +52,7 @@ void process_print (void) } int main(int argc, char **argv) { + int ret = 0; buffer_size = 2048; @@ -64,6 +66,7 @@ int main(int argc, char **argv) { verbmsg ("tolerance: %f\n", pitch_tolerance); o = new_aubio_pitch (pitch_method, buffer_size, hop_size, samplerate); + if (o == NULL) { ret = 1; goto beach; } if (pitch_tolerance != 0.) aubio_pitch_set_tolerance (o, pitch_tolerance); if (silence_threshold != -90.) 
@@ -76,13 +79,13 @@ int main(int argc, char **argv) { wavetable = new_aubio_wavetable (samplerate, hop_size); aubio_wavetable_play ( wavetable ); - examples_common_process((aubio_process_func_t)process_block,process_print); + examples_common_process(process_block, process_print); del_aubio_pitch (o); del_aubio_wavetable (wavetable); del_fvec (pitch); +beach: examples_common_del(); - return 0; + return ret; } - diff --git a/examples/aubioquiet.c b/examples/aubioquiet.c index bbe158b..1d7c6d8 100644 --- a/examples/aubioquiet.c +++ b/examples/aubioquiet.c @@ -19,6 +19,7 @@ */ #include "utils.h" +#define PROG_HAS_SILENCE 1 #include "parse_args.h" sint_t wassilence = 1, issilence; @@ -54,7 +55,7 @@ int main(int argc, char **argv) { verbmsg ("using source: %s at %dHz\n", source_uri, samplerate); verbmsg ("buffer_size: %d, ", buffer_size); verbmsg ("hop_size: %d\n", hop_size); - examples_common_process((aubio_process_func_t)process_block,process_print); + examples_common_process(process_block, process_print); examples_common_del(); return 0; } diff --git a/examples/aubiotrack.c b/examples/aubiotrack.c index dcffbff..32e8b62 100644 --- a/examples/aubiotrack.c +++ b/examples/aubiotrack.c @@ -21,6 +21,7 @@ #include "utils.h" #define PROG_HAS_TEMPO 1 #define PROG_HAS_ONSET 1 +#define PROG_HAS_SILENCE 1 #define PROG_HAS_OUTPUT 1 #define PROG_HAS_JACK 1 #include "parse_args.h" @@ -45,10 +46,11 @@ void process_block(fvec_t * ibuf, fvec_t *obuf) { } else { aubio_wavetable_stop ( wavetable ); } - if (mix_input) + if (mix_input) { aubio_wavetable_do (wavetable, ibuf, obuf); - else + } else { aubio_wavetable_do (wavetable, obuf, obuf); + } } void process_print (void) { @@ -59,6 +61,7 @@ void process_print (void) { } int main(int argc, char **argv) { + int ret = 0; // override general settings from utils.c buffer_size = 1024; hop_size = 512; @@ -74,24 +77,28 @@ int main(int argc, char **argv) { tempo_out = new_fvec(2); tempo = new_aubio_tempo(tempo_method, buffer_size, hop_size, 
samplerate); + if (tempo == NULL) { ret = 1; goto beach; } // set silence threshold very low to output beats even during silence // aubio_tempo_set_silence(tempo, -1000.); if (onset_threshold != 0.) aubio_tempo_set_threshold (tempo, onset_threshold); + if (onset_minioi != 0.) errmsg ("warning: minioi not supported yet\n"); wavetable = new_aubio_wavetable (samplerate, hop_size); aubio_wavetable_set_freq ( wavetable, 2450.); //aubio_sampler_load (sampler, "/archives/sounds/woodblock.aiff"); - examples_common_process((aubio_process_func_t)process_block,process_print); + examples_common_process(process_block, process_print); // send a last note off - send_noteon (miditap_note, 0); + if (usejack) { + send_noteon (miditap_note, 0); + } del_aubio_tempo(tempo); del_aubio_wavetable (wavetable); del_fvec(tempo_out); +beach: examples_common_del(); - return 0; + return ret; } - diff --git a/examples/parse_args.h b/examples/parse_args.h index 58423a2..f8c33d2 100644 --- a/examples/parse_args.h +++ b/examples/parse_args.h @@ -36,6 +36,7 @@ extern uint_t hop_size; // onset stuff extern char_t * onset_method; extern smpl_t onset_threshold; +extern smpl_t onset_minioi; // pitch stuff extern char_t * pitch_method; extern char_t * pitch_unit; @@ -46,6 +47,7 @@ extern uint_t time_format; extern char_t * tempo_method; // more general stuff extern smpl_t silence_threshold; +extern smpl_t release_drop; extern uint_t mix_input; // midi tap extern smpl_t miditap_note; @@ -63,8 +65,8 @@ int parse_args (int argc, char **argv); // internal stuff extern int blocks; -extern fvec_t *ibuf; -extern fvec_t *obuf; +extern fvec_t *input_buffer; +extern fvec_t *output_buffer; const char *prog_name; @@ -91,17 +93,25 @@ void usage (FILE * stream, int exit_code) " default=hfc\n" " -t --onset-threshold set onset detection threshold\n" " a value between 0.1 (more detections) and 1 (less); default=0.3\n" + " -M --minioi set minimum inter-onset interval\n" + " a value in seconds; default=0.012\n" #endif /* 
PROG_HAS_ONSET */ #ifdef PROG_HAS_PITCH " -p --pitch select pitch detection algorithm\n" - " <default|yinfft|yin|mcomb|fcomb|schmitt>; default=yinfft\n" + " <default|yinfft|yinfast|yin|mcomb|fcomb|schmitt>; default=yinfft\n" " -u --pitch-unit select pitch output unit\n" " <default|freq|hertz|Hz|midi|cent|bin>; default=freq\n" " -l --pitch-tolerance select pitch tolerance\n" " (yin, yinfft only) a value between 0.1 and 0.7; default=0.3\n" #endif /* PROG_HAS_PITCH */ +#ifdef PROG_HAS_SILENCE " -s --silence select silence threshold\n" " a value in dB, for instance -70, or -100; default=-90\n" +#endif /* PROG_HAS_SILENCE */ +#ifdef PROG_HAS_NOTES + " -d --release-drop select release drop threshold\n" + " a positive value in dB; default=10\n" +#endif " -T --time-format select time values output format\n" " (samples, ms, seconds) default=seconds\n" #ifdef PROG_HAS_OUTPUT @@ -109,10 +119,14 @@ void usage (FILE * stream, int exit_code) " input signal will be added to output synthesis\n" " -f --force-overwrite overwrite output file if needed\n" " do not fail if output file already exists\n" -#endif -#ifdef PROG_HAS_JACK +#endif /* PROG_HAS_OUTPUT */ +#if defined(PROG_HAS_JACK) && defined(HAVE_JACK) " -j --jack use Jack\n" -#endif +#if defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) + " -N --miditap-note MIDI note; default=69.\n" + " -V --miditap-velo MIDI velocity; default=65.\n" +#endif /* defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) */ +#endif /* defined(PROG_HAS_JACK) && defined(HAVE_JACK) */ " -v --verbose be verbose\n" " -h --help display this message\n" ); @@ -131,18 +145,30 @@ parse_args (int argc, char **argv) "i:r:B:H:" #ifdef PROG_HAS_JACK "j" +#if defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) + "N:V:" +#endif /* defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) */ #endif /* PROG_HAS_JACK */ #ifdef PROG_HAS_OUTPUT "o:" #endif /* PROG_HAS_OUTPUT */ #ifdef PROG_HAS_ONSET - "O:t:" + "O:t:M:" #endif /* PROG_HAS_ONSET */ #ifdef PROG_HAS_PITCH 
"p:u:l:" #endif /* PROG_HAS_PITCH */ "T:" - "s:mf"; +#ifdef PROG_HAS_SILENCE + "s:" +#endif /* PROG_HAS_SILENCE */ +#ifdef PROG_HAS_NOTES + "d:" +#endif /* PROG_HAS_NOTES */ +#ifdef PROG_HAS_OUTPUT + "mf" +#endif /* PROG_HAS_OUTPUT */ + ; int next_option; struct option long_options[] = { {"help", 0, NULL, 'h'}, @@ -153,6 +179,10 @@ parse_args (int argc, char **argv) {"hopsize", 1, NULL, 'H'}, #ifdef PROG_HAS_JACK {"jack", 0, NULL, 'j'}, +#if defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) + {"miditap-note", 1, NULL, 'N'}, + {"miditap-velo", 1, NULL, 'V'}, +#endif /* PROG_HAS_ONSET !PROG_HAS_PITCH */ #endif /* PROG_HAS_JACK */ #ifdef PROG_HAS_OUTPUT {"output", 1, NULL, 'o'}, @@ -160,24 +190,32 @@ #ifdef PROG_HAS_ONSET {"onset", 1, NULL, 'O'}, {"onset-threshold", 1, NULL, 't'}, + {"onset-minioi", 1, NULL, 'M'}, #endif /* PROG_HAS_ONSET */ #ifdef PROG_HAS_PITCH {"pitch", 1, NULL, 'p'}, {"pitch-unit", 1, NULL, 'u'}, {"pitch-tolerance", 1, NULL, 'l'}, #endif /* PROG_HAS_PITCH */ +#ifdef PROG_HAS_SILENCE {"silence", 1, NULL, 's'}, +#endif /* PROG_HAS_SILENCE */ +#ifdef PROG_HAS_NOTES + {"release-drop", 1, NULL, 'd'}, +#endif /* PROG_HAS_NOTES */ {"time-format", 1, NULL, 'T'}, +#ifdef PROG_HAS_OUTPUT {"mix-input", 0, NULL, 'm'}, {"force-overwrite", 0, NULL, 'f'}, +#endif /* PROG_HAS_OUTPUT */ {NULL, 0, NULL, 0} }; #endif /* HAVE_GETOPT_H */ - prog_name = argv[0]; + // better safe than sorry if (argc < 1) { usage (stderr, 1); - return -1; } + prog_name = argv[0]; #ifdef HAVE_GETOPT_H do { next_option = getopt_long (argc, argv, options, long_options, NULL); @@ -191,6 +229,12 @@ case 'j': usejack = 1; break; + case 'N': + miditap_note = (smpl_t) atoi (optarg); + break; + case 'V': + miditap_velo = (smpl_t) atoi (optarg); + break; case 'i': source_uri = optarg; break; @@ -215,6 +259,9 @@ case 't': /* threshold value for onset */ onset_threshold = (smpl_t) atof
(optarg); break; + case 'M': /* minimum inter-onset-interval */ + onset_minioi = (smpl_t) atof (optarg); + break; case 'p': pitch_method = optarg; break; @@ -238,6 +285,9 @@ parse_args (int argc, char **argv) case 's': /* silence threshold */ silence_threshold = (smpl_t) atof (optarg); break; + case 'd': /* release-drop threshold */ + release_drop = (smpl_t) atof (optarg); + break; case 'm': /* mix_input flag */ mix_input = 1; break; @@ -277,7 +327,8 @@ parse_args (int argc, char **argv) usejack = 1; #else errmsg("Error: no arguments given (and no available audio input)\n"); - usage ( stderr, 1 ); + errmsg(" consider recompiling with jack support (--enable-jack)\n"); + exit ( 1 ); #endif /* HAVE_JACK */ #else errmsg("Error: no arguments given\n"); diff --git a/examples/utils.c b/examples/utils.c index dc27386..c8e44a3 100644 --- a/examples/utils.c +++ b/examples/utils.c @@ -43,6 +43,7 @@ uint_t hop_size = 256; // onset stuff char_t * onset_method = "default"; smpl_t onset_threshold = 0.0; // will be set if != 0. +smpl_t onset_minioi = 0.0; // will be set if != 0. 
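The option handlers above store values such as `onset_minioi` and `release_drop`, and the example programs only forward them to the detector when they differ from their sentinel defaults (`0.`, `-90.`, `10.`). A minimal standalone sketch of that pattern in Python (the names and values here are illustrative, not part of aubio's API):

```python
# Sketch of the sentinel-default pattern used in examples/utils.c:
# each option starts at a sentinel value and is only applied when the
# command line set it to something else.
SENTINELS = {"onset_threshold": 0.0, "silence_threshold": -90.0, "release_drop": 10.0}

def collect_overrides(user_values):
    """Return only the options whose value differs from the sentinel default."""
    overrides = {}
    for name, sentinel in SENTINELS.items():
        value = user_values.get(name, sentinel)
        if value != sentinel:
            overrides[name] = value
    return overrides

print(collect_overrides({"onset_threshold": 0.5}))  # → {'onset_threshold': 0.5}
```

One consequence of this scheme, visible in the hunks above, is that explicitly passing the default value (e.g. `-s -90`) is indistinguishable from not passing the option at all.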
// pitch stuff char_t * pitch_unit = "default"; char_t * pitch_method = "default"; @@ -53,6 +54,7 @@ uint_t time_format = 0; // for "seconds", 1 for "ms", 2 for "samples" char_t * tempo_method = "default"; // more general stuff smpl_t silence_threshold = -90.; +smpl_t release_drop = 10.; uint_t mix_input = 0; uint_t force_overwrite = 0; @@ -61,8 +63,8 @@ uint_t force_overwrite = 0; // internal memory stuff aubio_source_t *this_source = NULL; aubio_sink_t *this_sink = NULL; -fvec_t *ibuf; -fvec_t *obuf; +fvec_t *input_buffer; +fvec_t *output_buffer; smpl_t miditap_note = 69.; smpl_t miditap_velo = 65.; @@ -74,7 +76,12 @@ extern void usage (FILE * stream, int exit_code); extern int parse_args (int argc, char **argv); #if HAVE_JACK +#define MAX_MIDI_EVENTS 128 +#define MAX_MIDI_EVENT_SIZE 3 aubio_jack_t *jack_setup; +jack_midi_event_t ev; +jack_midi_data_t midi_data[MAX_MIDI_EVENTS * MAX_MIDI_EVENT_SIZE]; +size_t midi_event_count = 0; #endif /* HAVE_JACK */ void examples_common_init (int argc, char **argv); @@ -119,15 +126,15 @@ void examples_common_init (int argc, char **argv) source_uri = "jack"; #endif /* HAVE_JACK */ } - ibuf = new_fvec (hop_size); - obuf = new_fvec (hop_size); + input_buffer = new_fvec (hop_size); + output_buffer = new_fvec (hop_size); } void examples_common_del (void) { - del_fvec (ibuf); - del_fvec (obuf); + del_fvec (input_buffer); + del_fvec (output_buffer); aubio_cleanup (); fflush(stderr); fflush(stdout); @@ -141,6 +148,8 @@ void examples_common_process (aubio_process_func_t process_func, if (usejack) { #ifdef HAVE_JACK + ev.size = MAX_MIDI_EVENT_SIZE; + ev.time = 0; // send it now debug ("Jack activation ...\n"); aubio_jack_activate (jack_setup, process_func); debug ("Processing (Ctrl+C to quit) ...\n"); @@ -157,14 +166,14 @@ void examples_common_process (aubio_process_func_t process_func, blocks = 0; do { - aubio_source_do (this_source, ibuf, &read); - process_func (ibuf, obuf); + aubio_source_do (this_source, input_buffer, &read); + 
process_func (input_buffer, output_buffer); // print to console if verbose or no output given if (verbose || sink_uri == NULL) { print(); } if (this_sink) { - aubio_sink_do (this_sink, obuf, hop_size); + aubio_sink_do (this_sink, output_buffer, hop_size); } blocks++; total_read += read; @@ -175,7 +184,8 @@ void examples_common_process (aubio_process_func_t process_func, total_read, blocks, hop_size, source_uri, samplerate); del_aubio_source (this_source); - del_aubio_sink (this_sink); + if (this_sink) + del_aubio_sink (this_sink); } } @@ -184,11 +194,11 @@ void send_noteon (smpl_t pitch, smpl_t velo) { #ifdef HAVE_JACK - jack_midi_event_t ev; - ev.size = 3; - ev.buffer = malloc (3 * sizeof (jack_midi_data_t)); // FIXME - ev.time = 0; if (usejack) { + ev.buffer = midi_data + midi_event_count++ * MAX_MIDI_EVENT_SIZE; + if (midi_event_count >= MAX_MIDI_EVENTS) { + midi_event_count = 0; + } ev.buffer[2] = velo; ev.buffer[1] = pitch; if (velo == 0) { diff --git a/examples/utils.h b/examples/utils.h index e911bef..5cc9ef9 100644 --- a/examples/utils.h +++ b/examples/utils.h @@ -66,7 +66,7 @@ typedef void (aubio_print_func_t) (void); void send_noteon (smpl_t pitch, smpl_t velo); /** common process function */ -typedef int (*aubio_process_func_t) (fvec_t * input, fvec_t * output); +typedef void (*aubio_process_func_t) (fvec_t * input, fvec_t * output); void process_block (fvec_t *ibuf, fvec_t *obuf); void process_print (void); diff --git a/nose2.cfg b/nose2.cfg deleted file mode 100644 index d1be6d8..0000000 --- a/nose2.cfg +++ /dev/null @@ -1,6 +0,0 @@ -[unittest] -start-dir = python/tests/ -plugins = nose2.plugins.mp - -[multiprocess] -always-on = false diff --git a/python/README.md b/python/README.md index 7ee0fa4..0367ada 100644 --- a/python/README.md +++ b/python/README.md @@ -1,100 +1,109 @@ -Python aubio module -=================== +aubio +===== -This module wraps the aubio library for Python using the numpy module. 
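The loop in `examples_common_process()` above reads one hop of samples from the source, runs the process function, and writes the result to the sink, stopping when a short read signals the end of the stream. A rough standalone sketch of that control flow, with Python lists standing in for aubio buffers (all names illustrative):

```python
def process_all(read_block, process_func, write_block, hop_size):
    """Read/process/write loop mirroring examples_common_process():
    stop when the source returns fewer than hop_size samples."""
    total_read = blocks = 0
    while True:
        samples, read = read_block()      # one hop of audio, plus count read
        out = process_func(samples)       # per-block analysis/synthesis
        write_block(out)                  # hand the result to the sink
        blocks += 1
        total_read += read
        if read < hop_size:               # short read marks end of stream
            break
    return total_read, blocks

# feed two full blocks and one short (final) block
source = iter([([0.0] * 4, 4), ([0.0] * 4, 4), ([0.0] * 2, 2)])
sink = []
print(process_all(lambda: next(source), lambda s: s, sink.append, 4))  # → (10, 3)
```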
+aubio is a collection of tools for music and audio analysis. -Using the Python aubio module ------------------------------ +This package integrates the aubio library with [NumPy] to provide a set of +efficient tools to process and analyse audio signals, including: -After installing python-aubio, you will be able to import the aubio module: +- read audio from any media file, including videos and remote streams +- high quality phase vocoder, spectral filterbanks, and linear filters +- Mel-Frequency Cepstrum Coefficients and standard spectral descriptors +- detection of note attacks (onset) +- pitch tracking (fundamental frequency estimation) +- beat detection and tempo tracking - $ python - [...] - >>> import aubio - >>> help(aubio.miditofreq) +aubio works with both Python 2 and Python 3. -Finding some inspiration ------------------------- +Links +----- -Some examples are available in the `python/demos` directory. These scripts are -small programs written in python and using python-aubio. +- [module documentation][doc_python] +- [installation instructions][doc_python_install] +- [aubio manual][manual] +- [aubio homepage][homepage] +- [issue tracker][bugtracker] -For instance, `demo_source.py` reads a media file. +Demos +----- - $ ./python/demos/demo_source.py /path/to/sound/sample.wav +Some examples are available in the [`python/demos` folder][demos_dir]. Each +script is a command line program which accepts one or more arguments. -and `demo_timestretch_online.py` stretches the original file into a new one: +**Notes**: installing additional modules is required to run some of the demos. - $ ./python/demo/demo_timestretch_online.py loop.wav stretched_loop.wav 0.92` +### Analysis -Note: you might need to install additional modules to run some of the demos. -Some demos use [matplotlib](http://matplotlib.org/) to draw plots, others use -[PySoundCard](https://github.com/bastibe/PySoundCard) to play and record -sounds. 
+- `demo_source.py` uses aubio to read audio samples from media files +- `demo_onset_plot.py` detects attacks in a sound file and plots the results + using [matplotlib] +- `demo_pitch.py` looks for fundamental frequency in a sound file and plots the + results using [matplotlib] +- `demo_spectrogram.py`, `demo_specdesc.py`, `demo_mfcc.py` for spectral + analysis. -Testing the Python module -------------------------- +### Real-time -To run the all the python tests, use the script: +- `demo_pyaudio.py` and `demo_tapthebeat.py` use [pyaudio] +- `demo_pysoundcard_play.py`, `demo_pysoundcard.py` use [PySoundCard] +- `demo_alsa.py` uses [pyalsaaudio] - $ ./python/tests/run_all_tests +### Others -Each test script can also be called one at a time. For instance: +- `demo_timestretch.py` can change the duration of an input file and write the + new sound to disk, +- `demo_wav2midi.py` detects the notes in a file and uses [mido] to write the + results into a MIDI file - $ ./python/tests/test_note2midi.py -v +### Example -Install in a virtualenv ------------------------ +Use `demo_timestretch_online.py` to slow down `loop.wav`, write the results in +`stretched_loop.wav`: -You should be able to install python-aubio directly from the top source -directory of aubio. + $ python demo_timestretch_online.py loop.wav stretched_loop.wav 0.92 -First, create a virtualenv to hold the required python module: - - $ virtualenv pyaubio - $ source pyaubio/bin/activate - -Now install and build the python extension using: - - $ pip install . - -Install requirements --------------------- - -Before compiling this module, you must have compiled libaubio. - -A simple way to do this is with pip: - - $ pip install -r requirements.txt - -For more information about how this module works, please refer to the [Python/C -API Reference Manual] (http://docs.python.org/c-api/index.html) and the -[Numpy/C API Reference](http://docs.scipy.org/doc/numpy/reference/c-api.html). 
- -Compiling python aubio ----------------------- - -To build the aubio Python module, run the following command from the top source -directory of aubio: - - $ ./setup.py build - -Note: if libaubio was previously built using waf, the script will use it. -Otherwise, the entire library will be built inside the python extension. - -To find out more about `setup.py` options: - - $ ./setup.py --help - -Installing +Built with ---------- -To install the Python module: - - $ ./setup.py install - -Alternatively, you may want to use the Python module without installing it by -setting your PYTHONPATH, for instance as follows: - - $ export PYTHONPATH=$PYTHONPATH:$PWD/`ls -rtd build/lib.* | head -1`:$PWD/tests - +The core of aubio is written in C for portability and speed. In addition to +[NumPy], aubio can be optionally built to use one or more of the following +libraries: + +- media file reading: + + - [ffmpeg] / [avcodec] to decode and read audio from almost any format, + - [libsndfile] to read audio from uncompressed sound files, + - [libsamplerate] to re-sample audio signals, + - [CoreAudio] to read all media formats supported by macOS, iOS, and tvOS. + +- hardware acceleration: + + - [Atlas] and [Blas], for accelerated vector and matrix computations, + - [fftw3], to compute fast Fourier Transforms of any size, + - [Accelerate] for accelerated FFT and matrix computations (macOS/iOS), + - [Intel IPP], accelerated vector computation and FFT implementation. 
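The feature list above mentions pitch tracking (fundamental frequency estimation). As a rough illustration of the general idea only — a naive autocorrelation estimator in plain NumPy, not the yinfft algorithm aubio actually implements:

```python
import numpy as np

def estimate_pitch_autocorr(samples, samplerate, fmin=50.0, fmax=2000.0):
    """Return the frequency of the strongest autocorrelation peak
    between fmin and fmax (a toy f0 estimator, illustration only)."""
    # non-negative lags of the autocorrelation
    corr = np.correlate(samples, samples, mode="full")[len(samples) - 1:]
    lag_min = int(samplerate / fmax)   # shortest period considered
    lag_max = int(samplerate / fmin)   # longest period considered
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return samplerate / best_lag

sr = 44100
t = np.arange(2048) / sr
samples = np.sin(2 * np.pi * 440.0 * t)
print(estimate_pitch_autocorr(samples, sr))  # close to 440 Hz
```

Real-world material (inharmonicity, noise, octave errors) is exactly why dedicated methods such as yin and yinfft exist.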
+ +[ffmpeg]: https://ffmpeg.org +[avcodec]: https://libav.org +[libsndfile]: http://www.mega-nerd.com/libsndfile/ +[libsamplerate]: http://www.mega-nerd.com/SRC/ +[CoreAudio]: https://developer.apple.com/reference/coreaudio +[Atlas]: http://math-atlas.sourceforge.net/ +[Blas]: https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms +[fftw3]: http://fftw.org +[Accelerate]: https://developer.apple.com/reference/accelerate +[Intel IPP]: https://software.intel.com/en-us/intel-ipp + +[demos_dir]:https://github.com/aubio/aubio/tree/master/python/demos +[pyaudio]:https://people.csail.mit.edu/hubert/pyaudio/ +[PySoundCard]:https://github.com/bastibe/PySoundCard +[pyalsaaudio]:https://larsimmisch.github.io/pyalsaaudio/ +[mido]:https://mido.readthedocs.io + +[manual]: https://aubio.org/manual/latest/ +[doc_python]: https://aubio.org/manual/latest/python.html +[doc_python_install]: https://aubio.org/manual/latest/python_module.html +[homepage]: https://aubio.org +[NumPy]: https://www.numpy.org +[bugtracker]: https://github.com/aubio/aubio/issues +[matplotlib]:https://matplotlib.org/ diff --git a/python/__init__.py b/python/__init__.py deleted file mode 100644 index e69de29..0000000 --- a/python/__init__.py +++ /dev/null diff --git a/python/demos/demo_alsa.py b/python/demos/demo_alsa.py new file mode 100755 index 0000000..cd58a33 --- /dev/null +++ b/python/demos/demo_alsa.py @@ -0,0 +1,45 @@ +#! /usr/bin/env python + +import alsaaudio +import numpy as np +import aubio + +# constants +samplerate = 44100 +win_s = 2048 +hop_s = win_s // 2 +framesize = hop_s + +# set up audio input +recorder = alsaaudio.PCM(type=alsaaudio.PCM_CAPTURE) +recorder.setperiodsize(framesize) +recorder.setrate(samplerate) +recorder.setformat(alsaaudio.PCM_FORMAT_FLOAT_LE) +recorder.setchannels(1) + +# create aubio pitch detection (first argument is method, "default" is +# "yinfft", can also be "yin", "mcomb", "fcomb", "schmitt"). 
+pitcher = aubio.pitch("default", win_s, hop_s, samplerate) +# set output unit (can be 'midi', 'cent', 'Hz', ...) +pitcher.set_unit("Hz") +# ignore frames under this level (dB) +pitcher.set_silence(-40) + +print("Starting to listen, press Ctrl+C to stop") + +# main loop +while True: + try: + # read data from audio input + _, data = recorder.read() + # convert data to aubio float samples + samples = np.fromstring(data, dtype=aubio.float_type) + # pitch of current frame + freq = pitcher(samples)[0] + # compute energy of current block + energy = np.sum(samples**2)/len(samples) + # do something with the results + print("{:10.4f} {:10.4f}".format(freq,energy)) + except KeyboardInterrupt: + print("Ctrl+C pressed, exiting") + break diff --git a/python/demos/demo_bench_yin.py b/python/demos/demo_bench_yin.py new file mode 100755 index 0000000..0d03c05 --- /dev/null +++ b/python/demos/demo_bench_yin.py @@ -0,0 +1,50 @@ +#! /usr/bin/env python + +import numpy as np +from aubio import pitch +import pylab as plt + +buf_size = 2048 * 1 +hop_size = buf_size // 4 + +samplerate = 44100 +minfreq = 40 +maxfreq = 6000 + +def sinewave(freq, duration, samplerate = samplerate): + """ generate a sinewave """ + length = hop_size + while length < duration * samplerate: + length += hop_size + return np.sin( 2. 
* np.pi * np.arange(length) * freq / samplerate ).astype("float32") + +def get_stats_for_pitch_method(method, freqs, samplerate = samplerate): + """ for a given pitch method and a list of frequencies, generate a sinewave + and get mean deviation """ + means = np.zeros(len(freqs)) + medians = np.zeros(len(freqs)) + for freq, fn in zip(freqs, range(len(freqs))): + s = sinewave(freq, .50).reshape(-1, hop_size) + #s = (sinewave(freq, .50) + .0*sinewave(freq/2., .50)).reshape(-1, hop_size) + p = pitch(method, buf_size, hop_size, samplerate = samplerate) + candidates = np.zeros(len(s)) + #samples = np.zeros(buf_size) + for frame, i in zip(s, range(len(s))): + candidates[i] = p(frame)[0] + # skip first few candidates + candidates = candidates[4:] + means[fn] = np.mean(candidates[candidates != 0] - freq) + medians[fn] = np.median(candidates[candidates != 0] - freq) + print (freq, means[fn], medians[fn]) + return means, medians + +if __name__ == '__main__': + freqs = np.arange(minfreq, maxfreq, 1.) + modes = ["yin", "yinfft"] + for mode in modes: + means, medians = get_stats_for_pitch_method(mode, freqs) + plt.figure() + plt.plot(freqs, means, 'g-') + plt.plot(freqs, medians, 'r--') + #plt.savefig(mode + '_deviations_test.png', dpi=300) + plt.show() diff --git a/python/demos/demo_bpm_extract.py b/python/demos/demo_bpm_extract.py index ba7fbad..b04ebea 100755 --- a/python/demos/demo_bpm_extract.py +++ b/python/demos/demo_bpm_extract.py @@ -3,26 +3,33 @@ from aubio import source, tempo from numpy import median, diff -def get_file_bpm(path, params = None): +def get_file_bpm(path, params=None): """ Calculate the beats per minute (bpm) of a given file. 
path: path to the file param: dictionary of parameters """ if params is None: params = {} - try: - win_s = params['win_s'] - samplerate = params['samplerate'] - hop_s = params['hop_s'] - except KeyError: - """ - # super fast - samplerate, win_s, hop_s = 4000, 128, 64 - # fast - samplerate, win_s, hop_s = 8000, 512, 128 - """ - # default: - samplerate, win_s, hop_s = 44100, 1024, 512 + # default: + samplerate, win_s, hop_s = 44100, 1024, 512 + if 'mode' in params: + if params.mode in ['super-fast']: + # super fast + samplerate, win_s, hop_s = 4000, 128, 64 + elif params.mode in ['fast']: + # fast + samplerate, win_s, hop_s = 8000, 512, 128 + elif params.mode in ['default']: + pass + else: + raise ValueError("unknown mode {:s}".format(params.mode)) + # manual settings + if 'samplerate' in params: + samplerate = params.samplerate + if 'win_s' in params: + win_s = params.win_s + if 'hop_s' in params: + hop_s = params.hop_s s = source(path, samplerate, hop_s) samplerate = s.samplerate @@ -44,19 +51,29 @@ def get_file_bpm(path, params = None): if read < hop_s: break - # Convert to periods and to bpm - if len(beats) > 1: - if len(beats) < 4: - print("few beats found in {:s}".format(path)) - bpms = 60./diff(beats) - b = median(bpms) - else: - b = 0 - print("not enough beats found in {:s}".format(path)) - return b + def beats_to_bpm(beats, path): + # if enough beats are found, convert to periods then to bpm + if len(beats) > 1: + if len(beats) < 4: + print("few beats found in {:s}".format(path)) + bpms = 60./diff(beats) + return median(bpms) + else: + print("not enough beats found in {:s}".format(path)) + return 0 + + return beats_to_bpm(beats, path) if __name__ == '__main__': - import sys - for f in sys.argv[1:]: - bpm = get_file_bpm(f) + import argparse + parser = argparse.ArgumentParser() + parser.add_argument('-m', '--mode', + help="mode [default|fast|super-fast]", + dest="mode", default='default') + parser.add_argument('sources', + nargs='+', + help="input_files") + 
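The `beats_to_bpm` helper introduced in the hunk above reduces the list of beat timestamps to a single tempo value: the median of 60 divided by each inter-beat period, with 0 returned when too few beats were found. The same logic in isolation, using plain NumPy:

```python
import numpy as np

def beats_to_bpm(beats):
    """Median of 60/period over successive beat timestamps (in seconds);
    returns 0 when there are not enough beats, as in demo_bpm_extract.py."""
    if len(beats) > 1:
        bpms = 60.0 / np.diff(beats)   # instantaneous bpm per beat interval
        return float(np.median(bpms))
    return 0.0

print(beats_to_bpm([0.0, 0.5, 1.0, 1.5]))  # → 120.0
print(beats_to_bpm([0.0]))                 # → 0.0
```

Using the median rather than the mean makes the estimate robust to a few spurious or missed beats.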
args = parser.parse_args() + for f in args.sources: + bpm = get_file_bpm(f, params = args) print("{:6s} {:s}".format("{:2f}".format(bpm), f)) diff --git a/python/demos/demo_filter.py b/python/demos/demo_filter.py index 10226ce..ab3df6d 100755 --- a/python/demos/demo_filter.py +++ b/python/demos/demo_filter.py @@ -1,36 +1,53 @@ #! /usr/bin/env python +import sys +import os.path +import aubio -def apply_filter(path): - from aubio import source, sink, digital_filter - from os.path import basename, splitext +def apply_filter(path, target): # open input file, get its samplerate - s = source(path) + s = aubio.source(path) samplerate = s.samplerate # create an A-weighting filter - f = digital_filter(7) + f = aubio.digital_filter(7) f.set_a_weighting(samplerate) - # alternatively, apply another filter # create output file - o = sink("filtered_" + splitext(basename(path))[0] + ".wav", samplerate) + o = aubio.sink(target, samplerate) total_frames = 0 while True: + # read from source samples, read = s() + # filter samples filtered_samples = f(samples) + # write to sink o(filtered_samples, read) + # count frames read total_frames += read - if read < s.hop_size: break + # end of file reached + if read < s.hop_size: + break + # print some info duration = total_frames / float(samplerate) - print ("read {:s}".format(s.uri)) - print ("applied A-weighting filtered ({:d} Hz)".format(samplerate)) - print ("wrote {:s} ({:.2f} s)".format(o.uri, duration)) + input_str = "input: {:s} ({:.2f} s, {:d} Hz)" + output_str = "output: {:s}, A-weighting filtered ({:d} frames total)" + print(input_str.format(s.uri, duration, samplerate)) + print(output_str.format(o.uri, total_frames)) if __name__ == '__main__': - import sys - for f in sys.argv[1:]: - apply_filter(f) + usage = "{:s} <input_file> [output_file]".format(sys.argv[0]) + if not 1 < len(sys.argv) < 4: + print(usage) + sys.exit(1) + if len(sys.argv) < 3: + input_path = sys.argv[1] + basename = 
os.path.splitext(os.path.basename(input_path))[0] + ".wav" + output_path = "filtered_" + basename + else: + input_path, output_path = sys.argv[1:] + # run function + apply_filter(input_path, output_path) diff --git a/python/demos/demo_filterbank.py b/python/demos/demo_filterbank.py index 54aff57..ec29d11 100755 --- a/python/demos/demo_filterbank.py +++ b/python/demos/demo_filterbank.py @@ -1,30 +1,44 @@ #! /usr/bin/env python -from aubio import filterbank, fvec -from pylab import loglog, show, xlim, ylim, xlabel, ylabel, title -from numpy import vstack, arange +"""Create a filterbank from a list of frequencies. -win_s = 2048 +This demo uses `aubio.filterbank.set_triangle_bands` to build a set of +triangular filters from a list of frequencies. + +The filterbank coefficients are then modified before being displayed.""" + +import aubio +import numpy as np +import matplotlib.pyplot as plt + +# sampling rate and size of the fft samplerate = 48000 +win_s = 2048 +# define a list of custom frequencies freq_list = [60, 80, 200, 400, 800, 1600, 3200, 6400, 12800, 24000] +# number of filters to create n_filters = len(freq_list) - 2 -f = filterbank(n_filters, win_s) -freqs = fvec(freq_list) +# create a new filterbank +f = aubio.filterbank(n_filters, win_s) +freqs = aubio.fvec(freq_list) f.set_triangle_bands(freqs, samplerate) +# get the coefficients from the filterbank coeffs = f.get_coeffs() -coeffs[4] *= 5. - +# apply a gain to the fifth band +coeffs[4] *= 6. 
+# load the modified coeffs into the filterbank f.set_coeffs(coeffs) -times = vstack([arange(win_s // 2 + 1) * samplerate / win_s] * n_filters) -title('Bank of filters built using a simple list of boundaries\nThe middle band has been amplified by 2.') -loglog(times.T, f.get_coeffs().T, '.-') -xlim([50, samplerate/2]) -ylim([1.0e-6, 2.0e-2]) -xlabel('log frequency (Hz)') -ylabel('log amplitude') - -show() +# display the band gains in a loglog plot +freqs = np.vstack([np.arange(win_s // 2 + 1) * samplerate / win_s] * n_filters) +plt.title('filterbank built from a list of frequencies\n' + 'The 5th band has been amplified by a factor 6.') +plt.loglog(freqs.T, f.get_coeffs().T, '.-') +plt.xlim([50, samplerate/2]) +plt.ylim([1.0e-6, 2.0e-2]) +plt.xlabel('log frequency (Hz)') +plt.ylabel('log amplitude') +plt.show() diff --git a/python/demos/demo_mfcc.py b/python/demos/demo_mfcc.py index dfbd7ed..5a33d15 100755 --- a/python/demos/demo_mfcc.py +++ b/python/demos/demo_mfcc.py @@ -2,20 +2,27 @@ import sys from aubio import source, pvoc, mfcc -from numpy import vstack, zeros +from numpy import vstack, zeros, diff -win_s = 512 # fft size -hop_s = win_s // 4 # hop size n_filters = 40 # must be 40 for mfcc n_coeffs = 13 -samplerate = 44100 if len(sys.argv) < 2: - print("Usage: %s <source_filename>" % sys.argv[0]) + print("Usage: %s <source_filename> [samplerate] [win_s] [hop_s] [mode]" % sys.argv[0]) + print(" where [mode] can be 'delta' or 'ddelta' for first and second derivatives") sys.exit(1) source_filename = sys.argv[1] +if len(sys.argv) > 2: samplerate = int(sys.argv[2]) +else: samplerate = 0 +if len(sys.argv) > 3: win_s = int(sys.argv[3]) +else: win_s = 512 +if len(sys.argv) > 4: hop_s = int(sys.argv[4]) +else: hop_s = win_s // 4 +if len(sys.argv) > 5: mode = sys.argv[5] +else: mode = "default" + samplerate = 0 if len( sys.argv ) > 2: samplerate = int(sys.argv[2]) @@ -48,18 +55,28 @@ get_waveform_plot( source_filename, samplerate, block_size = hop_s, ax = wave) 
wave.xaxis.set_visible(False) wave.yaxis.set_visible(False) +# compute first and second derivatives +if mode in ["delta", "ddelta"]: + mfccs = diff(mfccs, axis = 0) +if mode == "ddelta": + mfccs = diff(mfccs, axis = 0) + all_times = arange(mfccs.shape[0]) * hop_s n_coeffs = mfccs.shape[1] for i in range(n_coeffs): ax = plt.axes ( [0.1, 0.75 - ((i+1) * 0.65 / n_coeffs), 0.8, 0.65 / n_coeffs], sharex = wave ) ax.xaxis.set_visible(False) - ax.yaxis.set_visible(False) + ax.set_yticks([]) + ax.set_ylabel('%d' % i) ax.plot(all_times, mfccs.T[i]) # add time to the last axis -set_xlabels_sample2time( ax, frames_read, samplerate) +set_xlabels_sample2time( ax, frames_read, samplerate) #plt.ylabel('spectral descriptor value') ax.xaxis.set_visible(True) -wave.set_title('MFCC for %s' % source_filename) +title = 'MFCC for %s' % source_filename +if mode == "delta": title = mode + " " + title +elif mode == "ddelta": title = "double-delta" + " " + title +wave.set_title(title) plt.show() diff --git a/python/demos/demo_notes.py b/python/demos/demo_notes.py new file mode 100755 index 0000000..301013a --- /dev/null +++ b/python/demos/demo_notes.py @@ -0,0 +1,37 @@ +#! 
/usr/bin/env python + +import sys +from aubio import source, notes + +if len(sys.argv) < 2: + print("Usage: %s <filename> [samplerate]" % sys.argv[0]) + sys.exit(1) + +filename = sys.argv[1] + +downsample = 1 +samplerate = 44100 // downsample +if len( sys.argv ) > 2: samplerate = int(sys.argv[2]) + +win_s = 512 // downsample # fft size +hop_s = 256 // downsample # hop size + +s = source(filename, samplerate, hop_s) +samplerate = s.samplerate + +tolerance = 0.8 + +notes_o = notes("default", win_s, hop_s, samplerate) + +print("%8s" % "time","[ start","vel","last ]") + +# total number of frames read +total_frames = 0 +while True: + samples, read = s() + new_note = notes_o(samples) + if (new_note[0] != 0): + note_str = ' '.join(["%.2f" % i for i in new_note]) + print("%.6f" % (total_frames/float(samplerate)), new_note) + total_frames += read + if read < hop_s: break diff --git a/python/demos/demo_pitch_sinusoid.py b/python/demos/demo_pitch_sinusoid.py index 629f327..b7ef7b4 100755 --- a/python/demos/demo_pitch_sinusoid.py +++ b/python/demos/demo_pitch_sinusoid.py @@ -37,7 +37,7 @@ freqs[ pointer : pointer + partition ] = 1480 pointer += partition pointer += partition -freqs[ pointer : pointer + partition ] = 400 + 5 * np.random.random(sin_length/8) +freqs[ pointer : pointer + partition ] = 400 + 5 * np.random.random(sin_length//8) a = build_sinusoid(sin_length, freqs, samplerate) diff --git a/python/demos/demo_pyaudio.py b/python/demos/demo_pyaudio.py new file mode 100755 index 0000000..4c14cd5 --- /dev/null +++ b/python/demos/demo_pyaudio.py @@ -0,0 +1,75 @@ +#! /usr/bin/env python + +# Use pyaudio to open the microphone and run aubio.pitch on the stream of +# incoming samples. If a filename is given as the first argument, it will +# record 5 seconds of audio to this location. Otherwise, the script will +# run until Ctrl+C is pressed. 
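Several of these demos report pitch and notes as MIDI note numbers (e.g. `notes` above, and `pitch_o.set_unit("midi")` in the pyaudio demo). Converting between MIDI numbers and Hz is the standard equal-temperament relation with A4 = 69 = 440 Hz; a plain-Python sketch of the formula (aubio ships its own conversion helpers — this only shows the arithmetic):

```python
import math

def miditofreq(midi):
    """MIDI note number to frequency in Hz (A4 = 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((midi - 69) / 12.0)

def freqtomidi(freq):
    """Frequency in Hz to (fractional) MIDI note number."""
    return 69 + 12 * math.log2(freq / 440.0)
```

Fractional results are expected: a pitch tracker returns continuous values, and rounding to the nearest integer gives the tempered note.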
+ +# Examples: +# $ ./python/demos/demo_pyaudio.py +# $ ./python/demos/demo_pyaudio.py /tmp/recording.wav + +import pyaudio +import sys +import numpy as np +import aubio + +# initialise pyaudio +p = pyaudio.PyAudio() + +# open stream +buffer_size = 1024 +pyaudio_format = pyaudio.paFloat32 +n_channels = 1 +samplerate = 44100 +stream = p.open(format=pyaudio_format, + channels=n_channels, + rate=samplerate, + input=True, + frames_per_buffer=buffer_size) + +if len(sys.argv) > 1: + # record 5 seconds + output_filename = sys.argv[1] + record_duration = 5 # exit 1 + outputsink = aubio.sink(sys.argv[1], samplerate) + total_frames = 0 +else: + # run forever + outputsink = None + record_duration = None + +# setup pitch +tolerance = 0.8 +win_s = 4096 # fft size +hop_s = buffer_size # hop size +pitch_o = aubio.pitch("default", win_s, hop_s, samplerate) +pitch_o.set_unit("midi") +pitch_o.set_tolerance(tolerance) + +print("*** starting recording") +while True: + try: + audiobuffer = stream.read(buffer_size) + signal = np.fromstring(audiobuffer, dtype=np.float32) + + pitch = pitch_o(signal)[0] + confidence = pitch_o.get_confidence() + + print("{} / {}".format(pitch,confidence)) + + if outputsink: + outputsink(signal, len(signal)) + + if record_duration: + total_frames += len(signal) + if record_duration * samplerate < total_frames: + break + except KeyboardInterrupt: + print("*** Ctrl+C pressed, exiting") + break + +print("*** done recording") +stream.stop_stream() +stream.close() +p.terminate() diff --git a/python/demos/demo_source_simple.py b/python/demos/demo_source_simple.py new file mode 100755 index 0000000..f0577f8 --- /dev/null +++ b/python/demos/demo_source_simple.py @@ -0,0 +1,20 @@ +#! 
/usr/bin/env python + +"""A simple example using aubio.source.""" + +import sys +import aubio + +samplerate = 0 # use original source samplerate +hop_size = 256 # number of frames to read in one block +src = aubio.source(sys.argv[1], samplerate, hop_size) +total_frames = 0 + +while True: + samples, read = src() # read hop_size new samples from source + total_frames += read # increment total number of frames + if read < hop_size: # end of file reached + break + +fmt_string = "read {:d} frames at {:d}Hz from {:s}" +print(fmt_string.format(total_frames, src.samplerate, src.uri)) diff --git a/python/demos/demo_tapthebeat.py b/python/demos/demo_tapthebeat.py new file mode 100755 index 0000000..0483379 --- /dev/null +++ b/python/demos/demo_tapthebeat.py @@ -0,0 +1,78 @@ +#! /usr/bin/env python + +""" A simple demo using aubio and pyaudio to play beats in real time + +Note you will need to have pyaudio installed: `pip install pyaudio`. + +Examples: + ./demo_tapthebeat.py ~/Music/track1.ogg + +When compiled with ffmpeg/libav, you should be able to open remote streams. For +instance using youtube-dl (`pip install youtube-dl`): + + ./demo_tapthebeat.py `youtube-dl -xg https://youtu.be/zZbM9n9j3_g` + +""" + +import sys +import time +import pyaudio +import aubio +import numpy as np + +win_s = 1024 # fft size +hop_s = win_s // 2 # hop size + +# parse command line arguments +if len(sys.argv) < 2: + print("Usage: %s <filename> [samplerate]" % sys.argv[0]) + sys.exit(1) + +filename = sys.argv[1] + +samplerate = 0 +if len( sys.argv ) > 2: samplerate = int(sys.argv[2]) + +# create aubio source +a_source = aubio.source(filename, samplerate, hop_s) +samplerate = a_source.samplerate + +# create aubio tempo detection +a_tempo = aubio.tempo("default", win_s, hop_s, samplerate) + +# create a simple click sound +click = 0.7 * np.sin(2. * np.pi * np.arange(hop_s) / hop_s * samplerate / 3000.) 
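The read loop in `demo_source_simple.py` above is the canonical way to drain an aubio source: request `hop_size` frames per call and stop on the first short read. The pattern works for any source-like callable; a self-contained sketch using a hypothetical stand-in source (so it runs without an audio file — the stand-in mimics `aubio.source`'s `(samples, read)` return convention):

```python
def make_fake_source(total, hop_size):
    """Return a callable mimicking aubio.source: () -> (samples, read)."""
    state = {"pos": 0}
    def read():
        n = min(hop_size, total - state["pos"])
        state["pos"] += n
        # aubio pads the last block with zeros; read < hop_size marks EOF
        return [0.0] * hop_size, n
    return read

hop_size = 256
src = make_fake_source(1000, hop_size)
total_frames = 0
blocks = 0
while True:
    samples, read = src()
    total_frames += read
    blocks += 1
    if read < hop_size:
        break
```

Note the loop counts `read`, not `hop_size`: the final short block would otherwise inflate the frame total.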
+ +# pyaudio callback +def pyaudio_callback(_in_data, _frame_count, _time_info, _status): + samples, read = a_source() + is_beat = a_tempo(samples) + if is_beat: + samples += click + #print ('tick') # avoid print in audio callback + audiobuf = samples.tobytes() + if read < hop_s: + return (audiobuf, pyaudio.paComplete) + return (audiobuf, pyaudio.paContinue) + +# create pyaudio stream with frames_per_buffer=hop_s and format=paFloat32 +p = pyaudio.PyAudio() +pyaudio_format = pyaudio.paFloat32 +frames_per_buffer = hop_s +n_channels = 1 +stream = p.open(format=pyaudio_format, channels=n_channels, rate=samplerate, + output=True, frames_per_buffer=frames_per_buffer, + stream_callback=pyaudio_callback) + +# start pyaudio stream +stream.start_stream() + +# wait for stream to finish +while stream.is_active(): + time.sleep(0.1) + +# stop pyaudio stream +stream.stop_stream() +stream.close() +# close pyaudio +p.terminate() diff --git a/python/demos/demo_timestretch.py b/python/demos/demo_timestretch.py index 8569a44..2271d01 100755 --- a/python/demos/demo_timestretch.py +++ b/python/demos/demo_timestretch.py @@ -7,17 +7,17 @@ # and synthesis in a second pass. 
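The `87.5 % overlap` noted in `demo_timestretch.py` follows directly from the window and hop sizes, and the `win_s / 8` → `win_s // 8` change above matters because in Python 3 `/` returns a float, which is not a valid hop size. The arithmetic:

```python
win_s = 1024
hop_s = win_s // 8                    # integer division: 128
overlap = 100.0 * (1 - hop_s / win_s) # 87.5 %
```

The same reasoning gives 75 % overlap for the more common `hop_s = win_s // 4`.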
import sys -from aubio import source, sink, pvoc, mfcc, cvec +from aubio import source, sink, pvoc, cvec from aubio import unwrap2pi, float_type import numpy as np win_s = 1024 -hop_s = win_s / 8 # 87.5 % overlap +hop_s = win_s // 8 # 87.5 % overlap warmup = win_s // hop_s - 1 if len(sys.argv) < 3: - print("Usage: %s <source_filename> <output_filename> <rate> [samplerate]".format(sys.argv[0])) + print("Usage: {:s} <source_filename> <output_filename> <rate> [samplerate]".format(sys.argv[0])) print("""Examples: # twice faster {0} track_01.mp3 track_01_faster.wav 2.0 diff --git a/python/demos/demo_timestretch_online.py b/python/demos/demo_timestretch_online.py index e682e8e..df70365 100755 --- a/python/demos/demo_timestretch_online.py +++ b/python/demos/demo_timestretch_online.py @@ -7,17 +7,17 @@ # `demo_timestretch.py` for a version following the original implementation. import sys -from aubio import source, sink, pvoc, mfcc, cvec +from aubio import source, sink, pvoc, cvec from aubio import unwrap2pi, float_type import numpy as np -win_s = 1024 -hop_s = win_s / 8 # 87.5 % overlap +win_s = 512 +hop_s = win_s // 8 # 87.5 % overlap warmup = win_s // hop_s - 1 if len(sys.argv) < 3: - print("Usage: %s <source_filename> <output_filename> <rate> [samplerate]".format(sys.argv[0])) + print("Usage: {:s} <source_filename> <output_filename> <rate> [samplerate]".format(sys.argv[0])) print("""Examples: # twice faster {0} track_01.mp3 track_01_faster.wav 2.0 @@ -92,8 +92,10 @@ while True: old_grain.norm = np.copy(cur_grain.norm) old_grain.phas = np.copy(cur_grain.phas) - block_read += 1 + # until end of file if read < hop_s: break + # increment block counter + block_read += 1 for t in range(warmup + 2): # purge the last frames from the phase vocoder new_grain.norm[:] = 0 diff --git a/python/demos/demo_tss.py b/python/demos/demo_tss.py index f8c29aa..1a56b4f 100755 --- a/python/demos/demo_tss.py +++ b/python/demos/demo_tss.py @@ -10,8 +10,7 @@ if __name__ == '__main__': samplerate 
= 44100 win_s = 1024 # fft size - hop_s = win_s // 4 # block size - threshold = 0.5 + hop_s = win_s // 8 # block size f = source(sys.argv[1], samplerate, hop_s) g = sink(sys.argv[2], samplerate) @@ -21,7 +20,9 @@ if __name__ == '__main__': pvb = pvoc(win_s, hop_s) # another phase vocoder t = tss(win_s, hop_s) # transient steady state separation - t.set_threshold(threshold) + t.set_threshold(0.01) + t.set_alpha(3.) + t.set_beta(4.) read = hop_s @@ -35,6 +36,7 @@ if __name__ == '__main__': h(steadstate, read) # write steady states to output del f, g, h # finish writing the files now + sys.exit(0) from demo_spectrogram import get_spectrogram from pylab import subplot, show diff --git a/python/demos/demo_wav2midi.py b/python/demos/demo_wav2midi.py new file mode 100755 index 0000000..48c2d0a --- /dev/null +++ b/python/demos/demo_wav2midi.py @@ -0,0 +1,77 @@ +#! /usr/bin/env python + +# Simple demo to extract notes from a sound file, and store them in a midi file +# using mido. +# +# Install mido: `pip install mido` +# +# Documentation: https://mido.readthedocs.io/ + +import sys +from aubio import source, notes +from mido import Message, MetaMessage, MidiFile, MidiTrack, second2tick, bpm2tempo + +if len(sys.argv) < 3: + print("Usage: %s <filename> <output> [samplerate]" % sys.argv[0]) + sys.exit(1) + +filename = sys.argv[1] +midioutput = sys.argv[2] + +downsample = 1 +samplerate = 44100 // downsample +if len( sys.argv ) > 3: samplerate = int(sys.argv[3]) + +win_s = 512 // downsample # fft size +hop_s = 256 // downsample # hop size + +s = source(filename, samplerate, hop_s) +samplerate = s.samplerate + +tolerance = 0.8 + +notes_o = notes("default", win_s, hop_s, samplerate) + +print("%8s" % "time","[ start","vel","last ]") + +# create a midi file +mid = MidiFile() +track = MidiTrack() +mid.tracks.append(track) + +ticks_per_beat = mid.ticks_per_beat # default: 480 +bpm = 120 # default midi tempo + +tempo = bpm2tempo(bpm) +track.append(MetaMessage('set_tempo', tempo=tempo))
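`demo_wav2midi.py` converts sample offsets to MIDI ticks through `mido.second2tick`. The underlying arithmetic is simple once the units are clear: `tempo` is in microseconds per beat (mido's convention, so 500000 corresponds to 120 BPM) and `ticks_per_beat` defaults to 480. A sketch of the conversion, not mido's actual code:

```python
def frames2tick(frames, samplerate=44100, ticks_per_beat=480, tempo=500000):
    """Convert a frame count to MIDI ticks at the given tempo."""
    seconds = frames / float(samplerate)
    # seconds per tick = tempo / 1e6 / ticks_per_beat
    return int(round(seconds * 1e6 * ticks_per_beat / tempo))
```

So one second of audio at 120 BPM is two beats, i.e. 960 ticks.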
+track.append(MetaMessage('time_signature', numerator=4, denominator=4)) + +def frames2tick(frames, samplerate=samplerate): + sec = frames / float(samplerate) + return int(second2tick(sec, ticks_per_beat, tempo)) + +last_time = 0 + +# total number of frames read +total_frames = 0 +while True: + samples, read = s() + new_note = notes_o(samples) + if (new_note[0] != 0): + note_str = ' '.join(["%.2f" % i for i in new_note]) + print("%.6f" % (total_frames/float(samplerate)), new_note) + delta = frames2tick(total_frames) - last_time + if new_note[2] > 0: + track.append(Message('note_off', note=int(new_note[2]), + velocity=127, time=0) + ) + track.append(Message('note_on', + note=int(new_note[0]), + velocity=int(new_note[1]), + time=delta) + ) + last_time = frames2tick(total_frames) + total_frames += read + if read < hop_s: break + +mid.save(midioutput) diff --git a/python/demos/demo_yin_compare.py b/python/demos/demo_yin_compare.py new file mode 100755 index 0000000..6842368 --- /dev/null +++ b/python/demos/demo_yin_compare.py @@ -0,0 +1,175 @@ +#! 
/usr/bin/env python +# -*- coding: utf8 -*- + +""" Pure python implementation of the sum of squared difference + + sqd_yin: original sum of squared difference [0] + d_t(tau) = x ⊗ kernel + sqd_yinfast: sum of squared diff using complex domain [0] + sqd_yintapered: tapered squared diff [1] + sqd_yinfft: modified squared diff using complex domain [1] + +[0]:http://audition.ens.fr/adc/pdf/2002_JASA_YIN.pdf +[1]:https://aubio.org/phd/ +""" + +import sys +import numpy as np +import matplotlib.pyplot as plt + +def sqd_yin(samples): + """ compute original sum of squared difference + + Brute-force computation (cost o(N**2), slow).""" + B = len(samples) + W = B//2 + yin = np.zeros(W) + for j in range(W): + for tau in range(1, W): + yin[tau] += (samples[j] - samples[j+tau])**2 + return yin + +def sqd_yinfast(samples): + """ compute approximate sum of squared difference + + Using complex convolution (fast, cost o(n*log(n)) )""" + # yin_t(tau) = (r_t(0) + r_(t+tau)(0)) - 2r_t(tau) + B = len(samples) + W = B//2 + yin = np.zeros(W) + sqdiff = np.zeros(W) + kernel = np.zeros(B) + # compute r_(t+tau)(0) + squares = samples**2 + for tau in range(W): + sqdiff[tau] = squares[tau:tau+W].sum() + # add r_t(0) + sqdiff += sqdiff[0] + # compute r_t(tau) using kernel convolution in complex domain + samples_fft = np.fft.fft(samples) + kernel[1:W+1] = samples[W-1::-1] # first half, reversed + kernel_fft = np.fft.fft(kernel) + r_t_tau = np.fft.ifft(samples_fft * kernel_fft).real[W:] + # compute yin_t(tau) + yin = sqdiff - 2 * r_t_tau + return yin + +def sqd_yintapered(samples): + """ compute tapered sum of squared difference + + Brute-force computation (cost o(N**2), slow).""" + B = len(samples) + W = B//2 + yin = np.zeros(W) + for tau in range(1, W): + for j in range(W - tau): + yin[tau] += (samples[j] - samples[j+tau])**2 + return yin + +def sqd_yinfft(samples): + """ compute yinfft modified sum of squared differences + + Very fast, improved performance in transients.
+ + FIXME: biased.""" + B = len(samples) + W = B//2 + yin = np.zeros(W) + def hanningz(W): + return .5 * (1. - np.cos(2. * np.pi * np.arange(W) / W)) + #win = np.ones(B) + win = hanningz(B) + sqrmag = np.zeros(B) + fftout = np.fft.fft(win*samples) + sqrmag[0] = fftout[0].real**2 + for l in range(1, W): + sqrmag[l] = fftout[l].real**2 + fftout[l].imag**2 + sqrmag[B-l] = sqrmag[l] + sqrmag[W] = fftout[W].real**2 + fftout = np.fft.fft(sqrmag) + sqrsum = 2.*sqrmag[:W + 1].sum() + yin[0] = 0 + yin[1:] = sqrsum - fftout.real[1:W] + return yin / B + +def cumdiff(yin): + """ compute the cumulative mean normalized difference """ + W = len(yin) + yin[0] = 1. + cumsum = 0. + for tau in range(1, W): + cumsum += yin[tau] + if cumsum != 0: + yin[tau] *= tau/cumsum + else: + yin[tau] = 1 + return yin + +def compute_all(x): + import time + now = time.time() + + yin = sqd_yin(x) + t1 = time.time() + print ("yin took %.2fms" % ((t1-now) * 1000.)) + + yinfast = sqd_yinfast(x) + t2 = time.time() + print ("yinfast took: %.2fms" % ((t2-t1) * 1000.)) + + yintapered = sqd_yintapered(x) + t3 = time.time() + print ("yintapered took: %.2fms" % ((t3-t2) * 1000.)) + + yinfft = sqd_yinfft(x) + t4 = time.time() + print ("yinfft took: %.2fms" % ((t4-t3) * 1000.)) + + return yin, yinfast, yintapered, yinfft + +def plot_all(yin, yinfast, yintapered, yinfft): + fig, axes = plt.subplots(nrows=2, ncols=2, sharex=True, sharey='col') + + axes[0, 0].plot(yin, label='yin') + axes[0, 0].plot(yintapered, label='yintapered') + axes[0, 0].set_ylim(bottom=0) + axes[0, 0].legend() + axes[1, 0].plot(yinfast, '-', label='yinfast') + axes[1, 0].plot(yinfft, label='yinfft') + axes[1, 0].legend() + + axes[0, 1].plot(cumdiff(yin), label='yin') + axes[0, 1].plot(cumdiff(yintapered), label='yin tapered') + axes[0, 1].set_ylim(bottom=0) + axes[0, 1].legend() + axes[1, 1].plot(cumdiff(yinfast), '-', label='yinfast') + axes[1, 1].plot(cumdiff(yinfft), label='yinfft') + axes[1, 1].legend() + + fig.tight_layout() + 
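On a perfectly periodic input, the difference functions implemented in `demo_yin_compare.py` dip to (nearly) zero at every multiple of the signal period, and the lag of the deepest dip is the pitch period. A minimal, dependency-free check of the brute-force variant (same algorithm as the demo's `sqd_yin`, restated here without numpy so it is self-contained):

```python
import math

def sqd_yin(samples):
    """Brute-force sum of squared differences, cost O(W**2)."""
    W = len(samples) // 2
    yin = [0.0] * W
    for tau in range(1, W):
        for j in range(W):
            yin[tau] += (samples[j] - samples[j + tau]) ** 2
    return yin

period = 25
x = [math.sin(2 * math.pi * n / period) for n in range(200)]
d = sqd_yin(x)
best = min(range(1, len(d)), key=lambda t: d[t])  # lag of the deepest dip
```

Because every multiple of the period is an equally deep minimum, practical trackers (including the demo's `cumdiff`) apply the cumulative mean normalised difference and pick the first lag below a tolerance, rather than the global minimum used here.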
+testfreqs = [441., 800., 10000., 40.] + +if len(sys.argv) > 1: + testfreqs = map(float,sys.argv[1:]) + +for f in testfreqs: + print ("Comparing yin implementations for sine wave at %.fHz" % f) + samplerate = 44100. + win_s = 4096 + + x = np.cos(2.*np.pi * np.arange(win_s) * f / samplerate) + + n_times = 1#00 + for n in range(n_times): + yin, yinfast, yinfftslow, yinfft = compute_all(x) + if 0: # plot difference + plt.plot(yin-yinfast) + plt.tight_layout() + plt.show() + if 1: + plt.plot(yinfftslow-yinfft) + plt.tight_layout() + plt.show() + plot_all(yin, yinfast, yinfftslow, yinfft) +plt.show() diff --git a/python/ext/aubio-docstrings.h b/python/ext/aubio-docstrings.h new file mode 100644 index 0000000..26cada0 --- /dev/null +++ b/python/ext/aubio-docstrings.h @@ -0,0 +1,143 @@ +#define PYAUBIO_dct_doc \ + "dct(size=1024)\n"\ + "\n"\ + "Compute Discrete Cosine Transform of Type-II.\n"\ + "\n"\ + "Parameters\n"\ + "----------\n"\ + "size : int\n"\ + " size of the DCT to compute\n"\ + "\n"\ + "Example\n"\ + "-------\n"\ + ">>> d = aubio.dct(16)\n"\ + ">>> d.size\n"\ + "16\n"\ + ">>> x = aubio.fvec(np.ones(d.size))\n"\ + ">>> d(x)\n"\ + "array([4., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"\ + " dtype=float32)\n"\ + ">>> d.rdo(d(x))\n"\ + "array([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n"\ + " dtype=float32)\n"\ + "\n"\ + "References\n"\ + "----------\n"\ + "`DCT-II in Discrete Cosine Transform\n"\ + "<https://en.wikipedia.org/wiki/Discrete_cosine_transform#DCT-II>`_\n"\ + "on Wikipedia.\n" + +#define PYAUBIO_mfcc_doc \ + "mfcc(buf_size=1024, n_filters=40, n_coeffs=13, samplerate=44100)\n"\ + "\n"\ + "Compute Mel Frequency Cepstrum Coefficients (MFCC).\n"\ + "\n"\ + "`mfcc` creates a callable which takes a `cvec` as input.\n"\ + "\n"\ + "If `n_filters = 40`, the filterbank will be initialized with\n"\ + ":meth:`filterbank.set_mel_coeffs_slaney`.
Otherwise, if `n_filters`\n"\ + "is greater than `0`, it will be initialized with\n"\ + ":meth:`filterbank.set_mel_coeffs` using `fmin = 0`,\n"\ + "`fmax = samplerate/`.\n"\ + "\n"\ + "Example\n"\ + "-------\n"\ + ">>> buf_size = 2048; n_filters = 128; n_coeffs = 13; samplerate = 44100\n"\ + ">>> mf = aubio.mfcc(buf_size, n_filters, n_coeffs, samplerate)\n"\ + ">>> fftgrain = aubio.cvec(buf_size)\n"\ + ">>> mf(fftgrain).shape\n"\ + "(13,)\n"\ + "" + +#define PYAUBIO_notes_doc \ + "notes(method=\"default\", buf_size=1024, hop_size=512, samplerate=44100)\n"\ + "\n"\ + "Note detection\n" + +#define PYAUBIO_onset_doc \ + "onset(method=\"default\", buf_size=1024, hop_size=512, samplerate=44100)\n"\ + "\n"\ + "Onset detection object. `method` should be one of method supported by\n"\ + ":class:`specdesc`.\n" + +#define PYAUBIO_pitch_doc \ + "pitch(method=\"default\", buf_size=1024, hop_size=512, samplerate=44100)\n"\ + "\n"\ + "Pitch detection.\n"\ + "\n"\ + "Supported methods: `yinfft`, `yin`, `yinfast`, `fcomb`, `mcomb`,\n"\ + "`schmitt`, `specacf`, `default` (`yinfft`).\n" + +#define PYAUBIO_sampler_doc \ + "sampler(hop_size=512, samplerate=44100)\n"\ + "\n"\ + "Sampler.\n" + +#define PYAUBIO_specdesc_doc \ + "specdesc(method=\"default\", buf_size=1024)\n"\ + "\n"\ + "Spectral description functions. Creates a callable that takes a\n"\ + ":class:`cvec` as input, typically created by :class:`pvoc` for\n"\ + "overlap and windowing, and returns a single float.\n"\ + "\n"\ + "`method` can be any of the values listed below. 
If `default` is used\n"\ + "the `hfc` function will be selected.\n"\ + "\n"\ + "Onset novelty functions:\n"\ + "\n"\ + "- `energy`: local energy,\n"\ + "- `hfc`: high frequency content,\n"\ + "- `complex`: complex domain,\n"\ + "- `phase`: phase-based method,\n"\ + "- `wphase`: weighted phase deviation,\n"\ + "- `specdiff`: spectral difference,\n"\ + "- `kl`: Kullback-Leibler,\n"\ + "- `mkl`: modified Kullback-Leibler,\n"\ + "- `specflux`: spectral flux.\n"\ + "\n"\ + "Spectral shape functions:\n"\ + "\n"\ + "- `centroid`: spectral centroid (barycenter of the norm vector),\n"\ + "- `spread`: variance around centroid,\n"\ + "- `skewness`: third order moment,\n"\ + "- `kurtosis`: a measure of the flatness of the spectrum,\n"\ + "- `slope`: decreasing rate of the amplitude,\n"\ + "- `decrease`: perceptual based measurement of the decreasing rate,\n"\ + "- `rolloff`: 95th energy percentile.\n"\ + "\n"\ + "Parameters\n"\ + "----------\n"\ + "method : str\n"\ + " Onset novelty or spectral shape function.\n"\ + "buf_size : int\n"\ + " Length of the input frame.\n"\ + "\n"\ + "Example\n"\ + "-------\n"\ + ">>> win_s = 1024; hop_s = win_s // 2\n"\ + ">>> pv = aubio.pvoc(win_s, hop_s)\n"\ + ">>> sd = aubio.specdesc(\"mkl\", win_s)\n"\ + ">>> sd(pv(aubio.fvec(hop_s))).shape\n"\ + "(1,)\n"\ + "\n"\ + "References\n"\ + "----------\n"\ + "`Detailed description "\ + "<https://aubio.org/doc/latest/specdesc_8h.html#details>`_ in\n"\ + "`aubio API documentation <https://aubio.org/doc/latest/index.html>`_.\n"\ + "" + +#define PYAUBIO_tempo_doc \ + "tempo(method=\"default\", buf_size=1024, hop_size=512, samplerate=44100)\n"\ + "\n"\ + "Tempo detection and beat tracking.\n" + +#define PYAUBIO_tss_doc \ + "tss(buf_size=1024, hop_size=512)\n"\ + "\n"\ + "Transient/Steady-state separation.\n" + +#define PYAUBIO_wavetable_doc \ + "wavetable(samplerate=44100, hop_size=512)\n"\ + "\n"\ + "Wavetable synthesis.\n" diff --git a/python/ext/aubio-types.h b/python/ext/aubio-types.h index
f67b2da..4458ecc 100644 --- a/python/ext/aubio-types.h +++ b/python/ext/aubio-types.h @@ -1,6 +1,7 @@ #include <Python.h> #include <structmember.h> +#include "aubio-docstrings.h" #include "aubio-generated.h" #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION @@ -27,7 +28,7 @@ #ifdef USE_LOCAL_AUBIO #include "aubio.h" #else -#include "aubio/aubio.h" +#include <aubio/aubio.h> #endif #define Py_default_vector_length 1024 @@ -46,6 +47,14 @@ #define AUBIO_NPY_SMPL_CHR "f" #endif +#ifndef PATH_MAX +#ifdef MAX_PATH +#define PATH_MAX MAX_PATH +#else +#define PATH_MAX 1024 +#endif +#endif + // compat with Python < 2.6 #ifndef Py_TYPE #define Py_TYPE(ob) (((PyObject*)(ob))->ob_type) diff --git a/python/ext/aubiomodule.c b/python/ext/aubiomodule.c index d3f0361..d4ed033 100644 --- a/python/ext/aubiomodule.c +++ b/python/ext/aubiomodule.c @@ -2,79 +2,203 @@ #include "aubio-types.h" #include "py-musicutils.h" +// this dummy macro is used to convince windows that a string passed as -D flag +// is just that, a string, and not a double. +#define REDEFINESTRING(x) #x +#define DEFINEDSTRING(x) REDEFINESTRING(x) + static char aubio_module_doc[] = "Python module for the aubio library"; static char Py_alpha_norm_doc[] = "" -"alpha_norm(fvec, integer) -> float\n" +"alpha_norm(vec, alpha)\n" +"\n" +"Compute `alpha` normalisation factor of vector `vec`.\n" "\n" -"Compute alpha normalisation factor on vector, given alpha\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector\n" +"alpha : float\n" +" norm factor\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" p-norm of the input vector, where `p=alpha`\n" "\n" "Example\n" "-------\n" "\n" -">>> b = alpha_norm(a, 9)"; +">>> a = aubio.fvec(np.arange(10)); alpha = 2\n" +">>> aubio.alpha_norm(a, alpha), (sum(a**alpha)/len(a))**(1./alpha)\n" +"(5.338539123535156, 5.338539126015656)\n" +"\n" +"Note\n" +"----\n" +"Computed as:\n" +"\n" +".. 
math::\n" +" l_{\\alpha} = \n" +" \\|\\frac{\\sum_{n=0}^{N-1}{{x_n}^{\\alpha}}}{N}\\|^{1/\\alpha}\n" +""; static char Py_bintomidi_doc[] = "" -"bintomidi(float, samplerate = integer, fftsize = integer) -> float\n" +"bintomidi(fftbin, samplerate, fftsize)\n" "\n" -"Convert bin (float) to midi (float), given the sampling rate and the FFT size\n" +"Convert FFT bin to frequency in midi note, given the sampling rate\n" +"and the size of the FFT.\n" +"\n" +"Parameters\n" +"----------\n" +"fftbin : float\n" +" input frequency bin\n" +"samplerate : float\n" +" sampling rate of the signal\n" +"fftsize : float\n" +" size of the FFT\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Frequency converted to midi note.\n" "\n" "Example\n" "-------\n" "\n" -">>> midi = bintomidi(float, samplerate = 44100, fftsize = 1024)"; +">>> aubio.bintomidi(10, 44100, 1024)\n" +"68.62871551513672\n" +""; static char Py_miditobin_doc[] = "" -"miditobin(float, samplerate = integer, fftsize = integer) -> float\n" +"miditobin(midi, samplerate, fftsize)\n" "\n" -"Convert midi (float) to bin (float), given the sampling rate and the FFT size\n" +"Convert frequency in midi note to FFT bin, given the sampling rate\n" +"and the size of the FFT.\n" "\n" -"Example\n" +"Parameters\n" +"----------\n" +"midi : float\n" +" input frequency, in midi note\n" +"samplerate : float\n" +" sampling rate of the signal\n" +"fftsize : float\n" +" size of the FFT\n" +"\n" +"Returns\n" "-------\n" +"float\n" +" Frequency converted to FFT bin.\n" +"\n" +"Examples\n" +"--------\n" "\n" -">>> bin = miditobin(midi, samplerate = 44100, fftsize = 1024)"; +">>> aubio.miditobin(69, 44100, 1024)\n" +"10.216779708862305\n" +">>> aubio.miditobin(75.08, 32000, 512)\n" +"10.002175331115723\n" +""; static char Py_bintofreq_doc[] = "" -"bintofreq(float, samplerate = integer, fftsize = integer) -> float\n" +"bintofreq(fftbin, samplerate, fftsize)\n" "\n" -"Convert bin number (float) in frequency (Hz), given the sampling rate and the FFT 
size\n" +"Convert FFT bin to frequency in Hz, given the sampling rate\n" +"and the size of the FFT.\n" +"\n" +"Parameters\n" +"----------\n" +"fftbin : float\n" +" input frequency bin\n" +"samplerate : float\n" +" sampling rate of the signal\n" +"fftsize : float\n" +" size of the FFT\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Frequency converted to Hz.\n" "\n" "Example\n" "-------\n" "\n" -">>> freq = bintofreq(bin, samplerate = 44100, fftsize = 1024)"; +">>> aubio.bintofreq(10, 44100, 1024)\n" +"430.6640625\n" +""; static char Py_freqtobin_doc[] = "" -"freqtobin(float, samplerate = integer, fftsize = integer) -> float\n" +"freqtobin(freq, samplerate, fftsize)\n" "\n" -"Convert frequency (Hz) in bin number (float), given the sampling rate and the FFT size\n" +"Convert frequency in Hz to FFT bin, given the sampling rate\n" +"and the size of the FFT.\n" "\n" -"Example\n" +"Parameters\n" +"----------\n" +"freq : float\n" +" input frequency, in Hz\n" +"samplerate : float\n" +" sampling rate of the signal\n" +"fftsize : float\n" +" size of the FFT\n" +"\n" +"Returns\n" "-------\n" +"float\n" +" Frequency converted to FFT bin.\n" "\n" -">>> bin = freqtobin(freq, samplerate = 44100, fftsize = 1024)"; +"Examples\n" +"--------\n" +"\n" +">>> aubio.freqtobin(440, 44100, 1024)\n" +"10.216779708862305\n" +""; static char Py_zero_crossing_rate_doc[] = "" -"zero_crossing_rate(fvec) -> float\n" +"zero_crossing_rate(vec)\n" +"\n" +"Compute zero-crossing rate of `vec`.\n" "\n" -"Compute Zero crossing rate of a vector\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Zero-crossing rate.\n" "\n" "Example\n" "-------\n" "\n" -">>> z = zero_crossing_rate(a)"; +">>> a = np.linspace(-1., 1., 1000, dtype=aubio.float_type)\n" +">>> aubio.zero_crossing_rate(a), 1/1000\n" +"(0.0010000000474974513, 0.001)\n" +""; static char Py_min_removal_doc[] = "" -"min_removal(fvec) -> float\n" +"min_removal(vec)\n" +"\n"
+"Subtract the minimum value of a vector from each of its elements.\n" +"\n" +"Modifies the input vector in-place and returns a reference to it.\n" "\n" -"Remove the minimum value of a vector, in-place modification\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector\n" +"\n" +"Returns\n" +"-------\n" +"fvec\n" +" modified input vector\n" "\n" "Example\n" "-------\n" "\n" -">>> min_removal(a)"; +">>> aubio.min_removal(aubio.fvec(np.arange(1,4)))\n" +"array([0., 1., 2.], dtype=" AUBIO_NPY_SMPL_STR ")\n" +""; -extern void add_generated_objects ( PyObject *m ); extern void add_ufuncs ( PyObject *m ); extern int generated_types_ready(void); @@ -99,7 +223,7 @@ Py_alpha_norm (PyObject * self, PyObject * args) } // compute the function - result = Py_BuildValue (AUBIO_NPY_SMPL_CHR, fvec_alpha_norm (&vec, alpha)); + result = PyFloat_FromDouble(fvec_alpha_norm (&vec, alpha)); if (result == NULL) { return NULL; } @@ -113,7 +237,9 @@ Py_bintomidi (PyObject * self, PyObject * args) smpl_t input, samplerate, fftsize; smpl_t output; - if (!PyArg_ParseTuple (args, "|" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR , &input, &samplerate, &fftsize)) { + if (!PyArg_ParseTuple (args, + "" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR, + &input, &samplerate, &fftsize)) { return NULL; } @@ -128,7 +254,9 @@ Py_miditobin (PyObject * self, PyObject * args) smpl_t input, samplerate, fftsize; smpl_t output; - if (!PyArg_ParseTuple (args, "|" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR , &input, &samplerate, &fftsize)) { + if (!PyArg_ParseTuple (args, + "" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR, + &input, &samplerate, &fftsize)) { return NULL; } @@ -143,7 +271,9 @@ Py_bintofreq (PyObject * self, PyObject * args) smpl_t input, samplerate, fftsize; smpl_t output; - if (!PyArg_ParseTuple (args, "|" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR, &input, &samplerate, &fftsize)) { + if (!PyArg_ParseTuple (args, + ""
AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR, + &input, &samplerate, &fftsize)) { return NULL; } @@ -158,7 +288,9 @@ Py_freqtobin (PyObject * self, PyObject * args) smpl_t input, samplerate, fftsize; smpl_t output; - if (!PyArg_ParseTuple (args, "|" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR, &input, &samplerate, &fftsize)) { + if (!PyArg_ParseTuple (args, + "" AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR, + &input, &samplerate, &fftsize)) { return NULL; } @@ -187,7 +319,7 @@ Py_zero_crossing_rate (PyObject * self, PyObject * args) } // compute the function - result = Py_BuildValue (AUBIO_NPY_SMPL_CHR, aubio_zero_crossing_rate (&vec)); + result = PyFloat_FromDouble(aubio_zero_crossing_rate (&vec)); if (result == NULL) { return NULL; } @@ -238,6 +370,12 @@ static PyMethodDef aubio_methods[] = { {"silence_detection", Py_aubio_silence_detection, METH_VARARGS, Py_aubio_silence_detection_doc}, {"level_detection", Py_aubio_level_detection, METH_VARARGS, Py_aubio_level_detection_doc}, {"window", Py_aubio_window, METH_VARARGS, Py_aubio_window_doc}, + {"shift", Py_aubio_shift, METH_VARARGS, Py_aubio_shift_doc}, + {"ishift", Py_aubio_ishift, METH_VARARGS, Py_aubio_ishift_doc}, + {"hztomel", Py_aubio_hztomel, METH_VARARGS|METH_KEYWORDS, Py_aubio_hztomel_doc}, + {"meltohz", Py_aubio_meltohz, METH_VARARGS|METH_KEYWORDS, Py_aubio_meltohz_doc}, + {"hztomel_htk", Py_aubio_hztomel_htk, METH_VARARGS, Py_aubio_hztomel_htk_doc}, + {"meltohz_htk", Py_aubio_meltohz_htk, METH_VARARGS, Py_aubio_meltohz_htk_doc}, {NULL, NULL, 0, NULL} /* Sentinel */ }; @@ -256,6 +394,22 @@ static struct PyModuleDef moduledef = { }; #endif +void +aubio_log_function(int level, const char *message, void *data) +{ + // remove trailing \n + char *pos; + if ((pos=strchr(message, '\n')) != NULL) { + *pos = '\0'; + } + // warning or error + if (level == AUBIO_LOG_ERR) { + PyErr_Format(PyExc_RuntimeError, "%s", message); + } else { + PyErr_WarnEx(PyExc_UserWarning, message, 1); 
+ } +} + static PyObject * initaubio (void) { @@ -308,6 +462,7 @@ initaubio (void) PyModule_AddObject (m, "sink", (PyObject *) & Py_sinkType); PyModule_AddStringConstant(m, "float_type", AUBIO_NPY_SMPL_STR); + PyModule_AddStringConstant(m, "__version__", DEFINEDSTRING(AUBIO_VERSION)); // add generated objects add_generated_objects(m); @@ -315,6 +470,8 @@ initaubio (void) // add ufunc add_ufuncs(m); + aubio_log_set_level_function(AUBIO_LOG_ERR, aubio_log_function, NULL); + aubio_log_set_level_function(AUBIO_LOG_WRN, aubio_log_function, NULL); return m; } diff --git a/python/ext/py-cvec.c b/python/ext/py-cvec.c index 427cc46..1c9316a 100644 --- a/python/ext/py-cvec.c +++ b/python/ext/py-cvec.c @@ -19,7 +19,37 @@ typedef struct uint_t length; } Py_cvec; -static char Py_cvec_doc[] = "cvec object"; +static char Py_cvec_doc[] = "" +"cvec(size)\n" +"\n" +"A container holding spectral data.\n" +"\n" +"Create one `cvec` to store the spectral information of a window\n" +"of `size` points. The data will be stored in two vectors,\n" +":attr:`phas` and :attr:`norm`, each of shape (:attr:`length`,),\n" +"with `length = size // 2 + 1`.\n" +"\n" +"Parameters\n" +"----------\n" +"size: int\n" +" Size of spectrum to create.\n" +"\n" +"Examples\n" +"--------\n" +">>> c = aubio.cvec(1024)\n" +">>> c\n" +"aubio cvec of 513 elements\n" +">>> c.length\n" +"513\n" +">>> c.norm.dtype, c.phas.dtype\n" +"(dtype('float32'), dtype('float32'))\n" +">>> c.norm.shape, c.phas.shape\n" +"((513,), (513,))\n" +"\n" +"See Also\n" +"--------\n" +"fvec, fft, pvoc\n" +""; PyObject * @@ -106,7 +136,7 @@ Py_cvec_repr (Py_cvec * self, PyObject * unused) goto fail; } - args = Py_BuildValue ("I", self->length); + args = PyLong_FromLong(self->length); if (args == NULL) { goto fail; } @@ -142,14 +172,14 @@ Py_cvec_set_norm (Py_cvec * vec, PyObject *input, void * closure) { npy_intp length; if (!PyAubio_IsValidVector(input)) { - return 1; + return -1; } length = PyArray_SIZE ((PyArrayObject *)input); if (length 
!= vec->length) { PyErr_Format (PyExc_ValueError, - "input array has length %ld, but cvec has length %d", length, + "input array has length %" NPY_INTP_FMT ", but cvec has length %d", length, vec->length); - return 1; + return -1; } Py_XDECREF(vec->norm); @@ -163,14 +193,14 @@ Py_cvec_set_phas (Py_cvec * vec, PyObject *input, void * closure) { npy_intp length; if (!PyAubio_IsValidVector(input)) { - return 1; + return -1; } length = PyArray_SIZE ((PyArrayObject *)input); if (length != vec->length) { PyErr_Format (PyExc_ValueError, - "input array has length %ld, but cvec has length %d", length, + "input array has length %" NPY_INTP_FMT ", but cvec has length %d", length, vec->length); - return 1; + return -1; } Py_XDECREF(vec->phas); @@ -182,7 +212,7 @@ Py_cvec_set_phas (Py_cvec * vec, PyObject *input, void * closure) static PyMemberDef Py_cvec_members[] = { // TODO remove READONLY flag and define getter/setter {"length", T_INT, offsetof (Py_cvec, length), READONLY, - "length attribute"}, + "int: Length of `norm` and `phas` vectors."}, {NULL} /* Sentinel */ }; @@ -191,11 +221,11 @@ static PyMethodDef Py_cvec_methods[] = { }; static PyGetSetDef Py_cvec_getseters[] = { - {"norm", (getter)Py_cvec_get_norm, (setter)Py_cvec_set_norm, - "Numpy vector of shape (length,) containing the magnitude", + {"norm", (getter)Py_cvec_get_norm, (setter)Py_cvec_set_norm, + "numpy.ndarray: Vector of shape `(length,)` containing the magnitude.", NULL}, - {"phas", (getter)Py_cvec_get_phas, (setter)Py_cvec_set_phas, - "Numpy vector of shape (length,) containing the phase", + {"phas", (getter)Py_cvec_get_phas, (setter)Py_cvec_set_phas, + "numpy.ndarray: Vector of shape `(length,)` containing the phase.", NULL}, {NULL} /* sentinel */ }; diff --git a/python/ext/py-fft.c b/python/ext/py-fft.c index 7485ea3..53dfbbf 100644 --- a/python/ext/py-fft.c +++ b/python/ext/py-fft.c @@ -1,6 +1,24 @@ #include "aubio-types.h" -static char Py_fft_doc[] = "fft object"; +static char Py_fft_doc[] = "" 
+"fft(size=1024)\n" +"\n" +"Compute Fast Fourier Transforms.\n" +"\n" +"Parameters\n" +"----------\n" +"size : int\n" +" size of the FFT to compute\n" +"\n" +"Example\n" +"-------\n" +">>> x = aubio.fvec(512)\n" +">>> f = aubio.fft(512)\n" +">>> c = f(x); c\n" +"aubio cvec of 257 elements\n" +">>> x2 = f.rdo(c); x2.shape\n" +"(512,)\n" +""; typedef struct { @@ -51,11 +69,8 @@ Py_fft_init (Py_fft * self, PyObject * args, PyObject * kwds) { self->o = new_aubio_fft (self->win_s); if (self->o == NULL) { - PyErr_Format(PyExc_RuntimeError, - "error creating fft with win_s=%d " - "(should be a power of 2 greater than 1; " - "try recompiling aubio with --enable-fftw3)", - self->win_s); + // PyErr_Format(PyExc_RuntimeError, ...) was set above by new_ which called + // AUBIO_ERR when failing return -1; } diff --git a/python/ext/py-filter.c b/python/ext/py-filter.c index df78e47..861f8cd 100644 --- a/python/ext/py-filter.c +++ b/python/ext/py-filter.c @@ -10,7 +10,58 @@ typedef struct fvec_t c_out; } Py_filter; -static char Py_filter_doc[] = "filter object"; +static char Py_filter_doc[] = "" +"digital_filter(order=7)\n" +"\n" +"Create a digital filter.\n" +""; + +static char Py_filter_set_c_weighting_doc[] = "" +"set_c_weighting(samplerate)\n" +"\n" +"Set filter coefficients to C-weighting.\n" +"\n" +"`samplerate` should be one of 8000, 11025, 16000, 22050, 24000, 32000,\n" +"44100, 48000, 88200, 96000, or 192000. `order` of the filter should be 5.\n" +"\n" +"Parameters\n" +"----------\n" +"samplerate : int\n" +" Sampling-rate of the input signal, in Hz.\n" +""; + +static char Py_filter_set_a_weighting_doc[] = "" +"set_a_weighting(samplerate)\n" +"\n" +"Set filter coefficients to A-weighting.\n" +"\n" +"`samplerate` should be one of 8000, 11025, 16000, 22050, 24000, 32000,\n" +"44100, 48000, 88200, 96000, or 192000. 
`order` of the filter should be 7.\n" +"\n" +"Parameters\n" +"----------\n" +"samplerate : int\n" +" Sampling-rate of the input signal.\n" +""; + +static char Py_filter_set_biquad_doc[] = "" +"set_biquad(b0, b1, b2, a1, a2)\n" +"\n" +"Set biquad coefficients. `order` of the filter should be 3.\n" +"\n" +"Parameters\n" +"----------\n" +"b0 : float\n" +" Forward filter coefficient.\n" +"b1 : float\n" +" Forward filter coefficient.\n" +"b2 : float\n" +" Forward filter coefficient.\n" +"a1 : float\n" +" Feedback filter coefficient.\n" +"a2 : float\n" +" Feedback filter coefficient.\n" +""; static PyObject * Py_filter_new (PyTypeObject * type, PyObject * args, PyObject * kwds) @@ -58,7 +109,8 @@ static void Py_filter_del (Py_filter * self) { Py_XDECREF(self->out); - del_aubio_filter (self->o); + if (self->o) + del_aubio_filter (self->o); Py_TYPE(self)->tp_free ((PyObject *) self); } @@ -156,11 +208,11 @@ static PyMemberDef Py_filter_members[] = { static PyMethodDef Py_filter_methods[] = { {"set_c_weighting", (PyCFunction) Py_filter_set_c_weighting, METH_VARARGS, - "set filter coefficients to C-weighting"}, + Py_filter_set_c_weighting_doc}, {"set_a_weighting", (PyCFunction) Py_filter_set_a_weighting, METH_VARARGS, - "set filter coefficients to A-weighting"}, + Py_filter_set_a_weighting_doc}, {"set_biquad", (PyCFunction) Py_filter_set_biquad, METH_VARARGS, - "set b0, b1, b2, a1, a2 biquad coefficients"}, + Py_filter_set_biquad_doc}, {NULL} }; diff --git a/python/ext/py-filterbank.c b/python/ext/py-filterbank.c index a4e0ea6..0bd00d0 100644 --- a/python/ext/py-filterbank.c +++ b/python/ext/py-filterbank.c @@ -1,6 +1,188 @@ #include "aubio-types.h" -static char Py_filterbank_doc[] = "filterbank object"; +static char Py_filterbank_doc[] = "" +"filterbank(n_filters=40, win_s=1024)\n" +"\n" +"Create a bank of spectral filters. 
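The `set_biquad(b0, b1, b2, a1, a2)` docstring above names three forward and two feedback coefficients; a pure-Python sketch of the usual direct form I difference equation those coefficients describe (assuming, as is conventional, that `a0` is normalized to 1; aubio's internal filter ordering is not shown here):

```python
def biquad(x, b0, b1, b2, a1, a2):
    """Apply a direct form I biquad to a list of samples:
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    y = []
    x1 = x2 = y1 = y2 = 0.0  # delayed input and output samples
    for xn in x:
        yn = b0 * xn + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

# identity filter: b0=1, all other coefficients 0, passes the signal through
print(biquad([1.0, 0.5, -0.25], 1, 0, 0, 0, 0))  # [1.0, 0.5, -0.25]
```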
Each instance is a callable\n" +"that holds a matrix of coefficients.\n" +"\n" +"See also :meth:`set_mel_coeffs`, :meth:`set_mel_coeffs_htk`,\n" +":meth:`set_mel_coeffs_slaney`, :meth:`set_triangle_bands`, and\n" +":meth:`set_coeffs`.\n" +"\n" +"Parameters\n" +"----------\n" +"n_filters : int\n" +" Number of filters to create.\n" +"win_s : int\n" +" Size of the input spectrum to process.\n" +"\n" +"Examples\n" +"--------\n" +">>> f = aubio.filterbank(128, 1024)\n" +">>> f.set_mel_coeffs(44100, 0, 10000)\n" +">>> c = aubio.cvec(1024)\n" +">>> f(c).shape\n" +"(128, )\n" +""; + +static char Py_filterbank_set_triangle_bands_doc[] ="" +"set_triangle_bands(freqs, samplerate)\n" +"\n" +"Set triangular bands. The coefficients will be set to triangular\n" +"overlapping windows using the boundaries specified by `freqs`.\n" +"\n" +"`freqs` should contain `n_filters + 2` frequencies in Hz, ordered\n" +"by value, from smallest to largest. The first element should be greater\n" +"or equal to zero; the last element should be smaller or equal to\n" +"`samplerate / 2`.\n" +"\n" +"Parameters\n" +"----------\n" +"freqs: fvec\n" +" List of frequencies, in Hz.\n" +"samplerate : float\n" +" Sampling-rate of the expected input.\n" +"\n" +"Example\n" +"-------\n" +">>> fb = aubio.filterbank(n_filters=100, win_s=2048)\n" +">>> samplerate = 44100; freqs = np.linspace(0, 20200, 102)\n" +">>> fb.set_triangle_bands(aubio.fvec(freqs), samplerate)\n" +""; + +static char Py_filterbank_set_mel_coeffs_slaney_doc[] = "" +"set_mel_coeffs_slaney(samplerate)\n" +"\n" +"Set coefficients of filterbank to match Slaney's Auditory Toolbox.\n" +"\n" +"The filter coefficients will be set as in Malcolm Slaney's\n" +"implementation. 
The filterbank should have been created with\n" +"`n_filters = 40`.\n" +"\n" +"This is approximately equivalent to using :meth:`set_mel_coeffs` with\n" +"`fmin = 400./3., fmax = 6853.84`.\n" +"\n" +"Parameters\n" +"----------\n" +"samplerate : float\n" +" Sampling-rate of the expected input.\n" +"\n" +"References\n" +"----------\n" +"\n" +"Malcolm Slaney, `Auditory Toolbox Version 2, Technical Report #1998-010\n" +"<https://engineering.purdue.edu/~malcolm/interval/1998-010/>`_\n" +""; + +static char Py_filterbank_set_mel_coeffs_doc[] = "" +"set_mel_coeffs(samplerate, fmin, fmax)\n" +"\n" +"Set coefficients of filterbank to linearly spaced mel scale.\n" +"\n" +"Parameters\n" +"----------\n" +"samplerate : float\n" +" Sampling-rate of the expected input.\n" +"fmin : float\n" +" Lower frequency boundary of the first filter.\n" +"fmax : float\n" +" Upper frequency boundary of the last filter.\n" +"\n" +"See also\n" +"--------\n" +"hztomel\n" +""; + +static char Py_filterbank_set_mel_coeffs_htk_doc[] = "" +"set_mel_coeffs_htk(samplerate, fmin, fmax)\n" +"\n" +"Set coefficients of the filters to be linearly spaced in the HTK mel scale.\n" +"\n" +"Parameters\n" +"----------\n" +"samplerate : float\n" +" Sampling-rate of the expected input.\n" +"fmin : float\n" +" Lower frequency boundary of the first filter.\n" +"fmax : float\n" +" Upper frequency boundary of the last filter.\n" +"\n" +"See also\n" +"--------\n" +"hztomel with `htk=True`\n" +""; + +static char Py_filterbank_get_coeffs_doc[] = "" +"get_coeffs()\n" +"\n" +"Get coefficients matrix of filterbank.\n" +"\n" +"Returns\n" +"-------\n" +"array_like\n" +" Array of shape (n_filters, win_s/2+1) containing the coefficients.\n" +""; + +static char Py_filterbank_set_coeffs_doc[] = "" +"set_coeffs(coeffs)\n" +"\n" +"Set coefficients of filterbank.\n" +"\n" +"Parameters\n" +"----------\n" +"coeffs : fmat\n" +" Array of shape (n_filters, win_s/2+1) containing the coefficients.\n" +""; + +static char 
Py_filterbank_set_power_doc[] = "" +"set_power(power)\n" +"\n" +"Set power applied to input spectrum of filterbank.\n" +"\n" +"Parameters\n" +"----------\n" +"power : float\n" +" Power to raise input spectrum to before computing the filters.\n" +""; + +static char Py_filterbank_get_power_doc[] = "" +"get_power()\n" +"\n" +"Get power applied to filterbank.\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Power parameter.\n" +""; + +static char Py_filterbank_set_norm_doc[] = "" +"set_norm(norm)\n" +"\n" +"Set norm parameter. If set to `0`, the filters will not be normalized.\n" +"If set to `1`, the filters will be normalized to one. Default to `1`.\n" +"\n" +"This function should be called *before* :meth:`set_triangle_bands`,\n" +":meth:`set_mel_coeffs`, :meth:`set_mel_coeffs_htk`, or\n" +":meth:`set_mel_coeffs_slaney`.\n" +"\n" +"Parameters\n" +"----------\n" +"norm : int\n" +" `0` to disable, `1` to enable\n" +""; + +static char Py_filterbank_get_norm_doc[] = "" +"get_norm()\n" +"\n" +"Get norm parameter of filterbank.\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Norm parameter.\n" +""; typedef struct { @@ -94,7 +276,7 @@ Py_filterbank_do(Py_filterbank * self, PyObject * args) if (self->vec.length != self->win_s / 2 + 1) { PyErr_Format(PyExc_ValueError, - "input cvec has length %d, but fft expects length %d", + "input cvec has length %d, but filterbank expects length %d", self->vec.length, self->win_s / 2 + 1); return NULL; } @@ -122,8 +304,8 @@ Py_filterbank_set_triangle_bands (Py_filterbank * self, PyObject *args) uint_t err = 0; PyObject *input; - uint_t samplerate; - if (!PyArg_ParseTuple (args, "OI", &input, &samplerate)) { + smpl_t samplerate; + if (!PyArg_ParseTuple (args, "O" AUBIO_NPY_SMPL_CHR, &input, &samplerate)) { return NULL; } @@ -138,8 +320,14 @@ Py_filterbank_set_triangle_bands (Py_filterbank * self, PyObject *args) err = aubio_filterbank_set_triangle_bands (self->o, &(self->freqs), samplerate); if (err > 0) { - PyErr_SetString 
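The `set_triangle_bands` docstring above describes overlapping triangular windows built from `n_filters + 2` boundary frequencies; a simplified pure-Python sketch of that construction (filters evaluated at bin-center frequencies, without the normalization that `set_norm` controls; aubio's exact coefficients may differ):

```python
def triangle_bands(freqs, samplerate, win_s):
    """Build triangular filters from n_filters + 2 boundary frequencies.

    Returns n_filters rows, each of length win_s // 2 + 1."""
    n_filters = len(freqs) - 2
    n_bins = win_s // 2 + 1
    bin_hz = [k * samplerate / win_s for k in range(n_bins)]
    coeffs = []
    for i in range(n_filters):
        lo, center, hi = freqs[i], freqs[i + 1], freqs[i + 2]
        row = []
        for f in bin_hz:
            if lo < f < center:
                row.append((f - lo) / (center - lo))   # rising slope
            elif center <= f < hi:
                row.append((hi - f) / (hi - center))   # falling slope
            else:
                row.append(0.0)                        # outside the triangle
        coeffs.append(row)
    return coeffs

c = triangle_bands([0., 100., 200., 300.], 1000., 32)
```

Adjacent triangles overlap: filter `i` falls to zero exactly where filter `i + 1` peaks, which is why the boundaries must be sorted from smallest to largest.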
(PyExc_ValueError, - "error when setting filter to A-weighting"); + if (PyErr_Occurred() == NULL) { + PyErr_SetString (PyExc_ValueError, "error running set_triangle_bands"); + } else { + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + } return NULL; } Py_RETURN_NONE; @@ -150,15 +338,79 @@ Py_filterbank_set_mel_coeffs_slaney (Py_filterbank * self, PyObject *args) { uint_t err = 0; - uint_t samplerate; - if (!PyArg_ParseTuple (args, "I", &samplerate)) { + smpl_t samplerate; + if (!PyArg_ParseTuple (args, AUBIO_NPY_SMPL_CHR, &samplerate)) { return NULL; } err = aubio_filterbank_set_mel_coeffs_slaney (self->o, samplerate); if (err > 0) { - PyErr_SetString (PyExc_ValueError, - "error when setting filter to A-weighting"); + if (PyErr_Occurred() == NULL) { + PyErr_SetString (PyExc_ValueError, "error running set_mel_coeffs_slaney"); + } else { + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + } + return NULL; + } + Py_RETURN_NONE; +} + +static PyObject * +Py_filterbank_set_mel_coeffs (Py_filterbank * self, PyObject *args) +{ + uint_t err = 0; + + smpl_t samplerate; + smpl_t freq_min; + smpl_t freq_max; + if (!PyArg_ParseTuple (args, AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR + AUBIO_NPY_SMPL_CHR, &samplerate, &freq_min, &freq_max)) { + return NULL; + } + + err = aubio_filterbank_set_mel_coeffs (self->o, samplerate, + freq_min, freq_max); + if (err > 0) { + if (PyErr_Occurred() == NULL) { + PyErr_SetString (PyExc_ValueError, "error running set_mel_coeffs"); + } else { + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + } + return NULL; + } + Py_RETURN_NONE; +} + +static PyObject * 
+Py_filterbank_set_mel_coeffs_htk (Py_filterbank * self, PyObject *args) +{ + uint_t err = 0; + + smpl_t samplerate; + smpl_t freq_min; + smpl_t freq_max; + if (!PyArg_ParseTuple (args, AUBIO_NPY_SMPL_CHR AUBIO_NPY_SMPL_CHR + AUBIO_NPY_SMPL_CHR, &samplerate, &freq_min, &freq_max)) { + return NULL; + } + + err = aubio_filterbank_set_mel_coeffs_htk (self->o, samplerate, + freq_min, freq_max); + if (err > 0) { + if (PyErr_Occurred() == NULL) { + PyErr_SetString (PyExc_ValueError, "error running set_mel_coeffs_htk"); + } else { + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + } return NULL; } Py_RETURN_NONE; @@ -195,15 +447,87 @@ Py_filterbank_get_coeffs (Py_filterbank * self, PyObject *unused) aubio_filterbank_get_coeffs (self->o) ); } +static PyObject * +Py_filterbank_set_power(Py_filterbank *self, PyObject *args) +{ + smpl_t power; + + if (!PyArg_ParseTuple (args, AUBIO_NPY_SMPL_CHR, &power)) { + return NULL; + } + if(aubio_filterbank_set_power (self->o, power)) { + if (PyErr_Occurred() == NULL) { + PyErr_SetString (PyExc_ValueError, + "error running filterbank.set_power"); + } else { + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + } + return NULL; + } + Py_RETURN_NONE; +} + +static PyObject * +Py_filterbank_get_power (Py_filterbank * self, PyObject *unused) +{ + smpl_t power = aubio_filterbank_get_power(self->o); + return (PyObject *)PyFloat_FromDouble (power); +} + +static PyObject * +Py_filterbank_set_norm(Py_filterbank *self, PyObject *args) +{ + smpl_t norm; + + if (!PyArg_ParseTuple (args, AUBIO_NPY_SMPL_CHR, &norm)) { + return NULL; + } + if(aubio_filterbank_set_norm (self->o, norm)) { + if (PyErr_Occurred() == NULL) { + PyErr_SetString (PyExc_ValueError, + "error running filterbank.set_norm"); + } 
else { + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + } + return NULL; + } + Py_RETURN_NONE; +} + +static PyObject * +Py_filterbank_get_norm (Py_filterbank * self, PyObject *unused) +{ + smpl_t norm = aubio_filterbank_get_norm(self->o); + return (PyObject *)PyFloat_FromDouble (norm); +} + static PyMethodDef Py_filterbank_methods[] = { {"set_triangle_bands", (PyCFunction) Py_filterbank_set_triangle_bands, - METH_VARARGS, "set coefficients of filterbanks"}, + METH_VARARGS, Py_filterbank_set_triangle_bands_doc}, {"set_mel_coeffs_slaney", (PyCFunction) Py_filterbank_set_mel_coeffs_slaney, - METH_VARARGS, "set coefficients of filterbank as in Auditory Toolbox"}, + METH_VARARGS, Py_filterbank_set_mel_coeffs_slaney_doc}, + {"set_mel_coeffs", (PyCFunction) Py_filterbank_set_mel_coeffs, + METH_VARARGS, Py_filterbank_set_mel_coeffs_doc}, + {"set_mel_coeffs_htk", (PyCFunction) Py_filterbank_set_mel_coeffs_htk, + METH_VARARGS, Py_filterbank_set_mel_coeffs_htk_doc}, {"get_coeffs", (PyCFunction) Py_filterbank_get_coeffs, - METH_NOARGS, "get coefficients of filterbank"}, + METH_NOARGS, Py_filterbank_get_coeffs_doc}, {"set_coeffs", (PyCFunction) Py_filterbank_set_coeffs, - METH_VARARGS, "set coefficients of filterbank"}, + METH_VARARGS, Py_filterbank_set_coeffs_doc}, + {"set_power", (PyCFunction) Py_filterbank_set_power, + METH_VARARGS, Py_filterbank_set_power_doc}, + {"get_power", (PyCFunction) Py_filterbank_get_power, + METH_NOARGS, Py_filterbank_get_power_doc}, + {"set_norm", (PyCFunction) Py_filterbank_set_norm, + METH_VARARGS, Py_filterbank_set_norm_doc}, + {"get_norm", (PyCFunction) Py_filterbank_get_norm, + METH_NOARGS, Py_filterbank_get_norm_doc}, {NULL} }; diff --git a/python/ext/py-musicutils.c b/python/ext/py-musicutils.c index 3078e07..0827ea2 100644 --- a/python/ext/py-musicutils.c +++ b/python/ext/py-musicutils.c @@ -39,7 +39,7 @@ 
Py_aubio_level_lin(PyObject *self, PyObject *args) return NULL; } - level_lin = Py_BuildValue(AUBIO_NPY_SMPL_CHR, aubio_level_lin(&vec)); + level_lin = PyFloat_FromDouble(aubio_level_lin(&vec)); if (level_lin == NULL) { PyErr_SetString (PyExc_ValueError, "failed computing level_lin"); return NULL; @@ -67,7 +67,7 @@ Py_aubio_db_spl(PyObject *self, PyObject *args) return NULL; } - db_spl = Py_BuildValue(AUBIO_NPY_SMPL_CHR, aubio_db_spl(&vec)); + db_spl = PyFloat_FromDouble(aubio_db_spl(&vec)); if (db_spl == NULL) { PyErr_SetString (PyExc_ValueError, "failed computing db_spl"); return NULL; @@ -96,7 +96,7 @@ Py_aubio_silence_detection(PyObject *self, PyObject *args) return NULL; } - silence_detection = Py_BuildValue("I", aubio_silence_detection(&vec, threshold)); + silence_detection = PyLong_FromLong(aubio_silence_detection(&vec, threshold)); if (silence_detection == NULL) { PyErr_SetString (PyExc_ValueError, "failed computing silence_detection"); return NULL; @@ -125,7 +125,7 @@ Py_aubio_level_detection(PyObject *self, PyObject *args) return NULL; } - level_detection = Py_BuildValue(AUBIO_NPY_SMPL_CHR, aubio_level_detection(&vec, threshold)); + level_detection = PyFloat_FromDouble(aubio_level_detection(&vec, threshold)); if (level_detection == NULL) { PyErr_SetString (PyExc_ValueError, "failed computing level_detection"); return NULL; @@ -133,3 +133,105 @@ Py_aubio_level_detection(PyObject *self, PyObject *args) return level_detection; } + +PyObject * +Py_aubio_shift(PyObject *self, PyObject *args) +{ + PyObject *input; + fvec_t vec; + + if (!PyArg_ParseTuple (args, "O:shift", &input)) { + return NULL; + } + + if (input == NULL) { + return NULL; + } + + if (!PyAubio_ArrayToCFvec(input, &vec)) { + return NULL; + } + + fvec_shift(&vec); + + //Py_RETURN_NONE; + return (PyObject *) PyAubio_CFvecToArray(&vec); +} + +PyObject * +Py_aubio_ishift(PyObject *self, PyObject *args) +{ + PyObject *input; + fvec_t vec; + + if (!PyArg_ParseTuple (args, "O:shift", &input)) { + 
return NULL; + } + + if (input == NULL) { + return NULL; + } + + if (!PyAubio_ArrayToCFvec(input, &vec)) { + return NULL; + } + + fvec_ishift(&vec); + + //Py_RETURN_NONE; + return (PyObject *) PyAubio_CFvecToArray(&vec); +} + +PyObject* +Py_aubio_hztomel(PyObject *self, PyObject *args, PyObject *kwds) +{ + smpl_t v; + PyObject *htk = NULL; + static char *kwlist[] = {"f", "htk", NULL}; + if (!PyArg_ParseTupleAndKeywords(args, kwds, AUBIO_NPY_SMPL_CHR "|O", + kwlist, &v, &htk)) + { + return NULL; + } + if (htk != NULL && PyObject_IsTrue(htk) == 1) + return PyFloat_FromDouble(aubio_hztomel_htk(v)); + else + return PyFloat_FromDouble(aubio_hztomel(v)); +} + +PyObject* +Py_aubio_meltohz(PyObject *self, PyObject *args, PyObject *kwds) +{ + smpl_t v; + PyObject *htk = NULL; + static char *kwlist[] = {"m", "htk", NULL}; + if (!PyArg_ParseTupleAndKeywords(args, kwds, AUBIO_NPY_SMPL_CHR "|O", + kwlist, &v, &htk)) + { + return NULL; + } + if (htk != NULL && PyObject_IsTrue(htk) == 1) + return PyFloat_FromDouble(aubio_meltohz_htk(v)); + else + return PyFloat_FromDouble(aubio_meltohz(v)); +} + +PyObject* +Py_aubio_hztomel_htk(PyObject *self, PyObject *args) +{ + smpl_t v; + if (!PyArg_ParseTuple(args, AUBIO_NPY_SMPL_CHR, &v)) { + return NULL; + } + return PyFloat_FromDouble(aubio_hztomel_htk(v)); +} + +PyObject* +Py_aubio_meltohz_htk(PyObject *self, PyObject *args) +{ + smpl_t v; + if (!PyArg_ParseTuple(args, AUBIO_NPY_SMPL_CHR, &v)) { + return NULL; + } + return PyFloat_FromDouble(aubio_meltohz_htk(v)); +} diff --git a/python/ext/py-musicutils.h b/python/ext/py-musicutils.h index 54ee230..27698d1 100644 --- a/python/ext/py-musicutils.h +++ b/python/ext/py-musicutils.h @@ -2,73 +2,370 @@ #define PY_AUBIO_MUSICUTILS_H static char Py_aubio_window_doc[] = "" -"window(string, integer) -> fvec\n" +"window(window_type, size)\n" "\n" -"Create a window\n" +"Create a window of length `size`. 
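The `hztomel`/`meltohz` wrappers above dispatch between a Slaney-style and an HTK-style mel scale. The HTK variant is assumed here to use the standard formula `mel = 2595 * log10(1 + f / 700)`; a pure-Python sketch of that pair (the Slaney variant, which is linear below 1 kHz and logarithmic above, is omitted):

```python
import math

# HTK-style mel conversions, assumed to match aubio's *_htk helpers:
#   mel = 2595 * log10(1 + f / 700)

def hztomel_htk(f):
    """Convert a frequency in Hz to the HTK mel scale."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def meltohz_htk(m):
    """Convert an HTK mel value back to a frequency in Hz."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

m = hztomel_htk(440.0)
print(m, meltohz_htk(m))  # round-trips back to approximately 440 Hz
```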
`window_type` should be one\n" +"of the following:\n" "\n" -"Example\n" +"- `default` (same as `hanningz`).\n" +"- `ones`\n" +"- `rectangle`\n" +"- `hamming`\n" +"- `hanning`\n" +"- `hanningz` [1]_\n" +"- `blackman`\n" +"- `blackman_harris`\n" +"- `gaussian`\n" +"- `welch`\n" +"- `parzen`\n" +"\n" +"Parameters\n" +"----------\n" +"window_type : str\n" +" Type of window.\n" +"size : int\n" +" Length of window.\n" +"\n" +"Returns\n" "-------\n" +"fvec\n" +" Array of shape `(length,)` containing the new window.\n" +"\n" +"See Also\n" +"--------\n" +"pvoc, fft\n" +"\n" +"Examples\n" +"--------\n" +"Compute a zero-phase Hann window on `1024` points:\n" "\n" -">>> window('hanningz', 1024)\n" +">>> aubio.window('hanningz', 1024)\n" "array([ 0.00000000e+00, 9.41753387e-06, 3.76403332e-05, ...,\n" -" 8.46982002e-05, 3.76403332e-05, 9.41753387e-06], dtype=float32)"; +" 8.46982002e-05, 3.76403332e-05, 9.41753387e-06], dtype=float32)\n" +"\n" +"Plot different window types with `matplotlib <https://matplotlib.org/>`_:\n" +"\n" +">>> import matplotlib.pyplot as plt\n" +">>> modes = ['default', 'ones', 'rectangle', 'hamming', 'hanning',\n" +"... 'hanningz', 'blackman', 'blackman_harris', 'gaussian',\n" +"... 'welch', 'parzen']; n = 2048\n" +">>> for m in modes: plt.plot(aubio.window(m, n), label=m)\n" +"...\n" +">>> plt.legend(); plt.show()\n" +"\n" +"Note\n" +"----\n" +"The following examples contain the equivalent source code to compute\n" +"each type of window with `NumPy <https://numpy.org>`_:\n" +"\n" +">>> n = 1024; x = np.arange(n, dtype=aubio.float_type)\n" +">>> ones = np.ones(n).astype(aubio.float_type)\n" +">>> rectangle = 0.5 * ones\n" +">>> hanning = 0.5 - 0.5 * np.cos(2 * np.pi * x / n)\n" +">>> hanningz = 0.5 * (1 - np.cos(2 * np.pi * x / n))\n" +">>> hamming = 0.54 - 0.46 * np.cos(2.*np.pi * x / (n - 1))\n" +">>> blackman = 0.42 \\\n" +"... - 0.50 * np.cos(2 * np.pi * x / (n - 1)) \\\n" +"... 
+ 0.08 * np.cos(4 * np.pi * x / (n - 1))\n" +">>> blackman_harris = 0.35875 \\\n" +"... - 0.48829 * np.cos(2 * np.pi * x / (n - 1)) \\\n" +"... + 0.14128 * np.cos(4 * np.pi * x / (n - 1)) \\\n" +"... + 0.01168 * np.cos(6 * np.pi * x / (n - 1))\n" +">>> gaussian = np.exp( - 0.5 * ((x - 0.5 * (n - 1)) \\\n" +"... / (0.25 * (n - 1)) )**2 )\n" +">>> welch = 1 - ((2 * x - n) / (n + 1))**2\n" +">>> parzen = 1 - np.abs((2 * x - n) / (n + 1))\n" +">>> default = hanningz\n" +"References\n" +"----------\n" +#if 0 +"`Window function <https://en.wikipedia.org/wiki/Window_function>`_ on\n" +"Wikipedia.\n" +"\n" +#endif +".. [1] Amalia de Götzen, Nicolas Bernardini, and Daniel Arfib. Traditional\n" +" (?) implementations of a phase vocoder: the tricks of the trade.\n" +" In *Proceedings of the International Conference on Digital Audio\n" +" Effects* (DAFx-00), pages 37–44, University of Verona, Italy, 2000.\n" +" (`online version <" +"https://www.cs.princeton.edu/courses/archive/spr09/cos325/Bernardini.pdf" +">`_).\n" +""; PyObject * Py_aubio_window(PyObject *self, PyObject *args); static char Py_aubio_level_lin_doc[] = "" -"level_lin(fvec) -> fvec\n" +"level_lin(x)\n" "\n" -"Compute sound level on a linear scale.\n" +"Compute sound pressure level of `x`, on a linear scale.\n" "\n" -"This gives the average of the square amplitudes.\n" +"Parameters\n" +"----------\n" +"x : fvec\n" +" input vector\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Linear level of `x`.\n" "\n" "Example\n" "-------\n" "\n" -">>> level_Lin(numpy.ones(1024))\n" -"1.0"; +">>> aubio.level_lin(aubio.fvec(numpy.ones(1024)))\n" +"1.0\n" +"\n" +"Note\n" +"----\n" +"Computed as the average of the squared amplitudes:\n" +"\n" +".. 
math:: L = \\frac {\\sum_{n=0}^{N-1} {x_n}^2} {N}\n" +"\n" +"See Also\n" +"--------\n" +"db_spl, silence_detection, level_detection\n" +""; PyObject * Py_aubio_level_lin(PyObject *self, PyObject *args); static char Py_aubio_db_spl_doc[] = "" -"Compute sound pressure level (SPL) in dB\n" +"db_spl(x)\n" "\n" -"This quantity is often wrongly called 'loudness'.\n" +"Compute Sound Pressure Level (SPL) of `x`, in dB.\n" "\n" -"This gives ten times the log10 of the average of the square amplitudes.\n" +"Parameters\n" +"----------\n" +"x : fvec\n" +" input vector\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" Level of `x`, in dB SPL.\n" "\n" "Example\n" "-------\n" "\n" -">>> db_spl(numpy.ones(1024))\n" -"1.0"; +">>> aubio.db_spl(aubio.fvec(np.ones(1024)))\n" +"0.0\n" +">>> aubio.db_spl(0.7*aubio.fvec(np.ones(32)))\n" +"-3.098040819168091\n" +"\n" +"Note\n" +"----\n" +"Computed as ten times the `log10` of :py:func:`level_lin`:\n" +"\n" +".. math::\n" +"\n" +" {SPL}_{dB} = 10 \\log_{10}{\\frac {\\sum_{n=0}^{N-1}{x_n}^2} {N}}\n" +"\n" +"This quantity is often incorrectly called 'loudness'.\n" +"\n" +"See Also\n" +"--------\n" +"level_lin, silence_detection, level_detection\n" +""; PyObject * Py_aubio_db_spl(PyObject *self, PyObject *args); static char Py_aubio_silence_detection_doc[] = "" -"Check if buffer level in dB SPL is under a given threshold\n" +"silence_detection(vec, level)\n" "\n" -"Return 0 if level is under the given threshold, 1 otherwise.\n" +"Check if level of `vec`, in dB SPL, is under a given threshold.\n" "\n" -"Example\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector\n" +"level : float\n" +" level threshold, in dB SPL\n" +"\n" +"Returns\n" "-------\n" +"int\n" +" `1` if level of `vec`, in dB SPL, is under `level`,\n" +" `0` otherwise.\n" "\n" -">>> import numpy\n""" -">>> silence_detection(numpy.ones(1024, dtype=\"float32\"), -80)\n" -"0"; +"Examples\n" +"--------\n" +"\n" +">>> aubio.silence_detection(aubio.fvec(32), -100.)\n" +"1\n" +">>> 
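The `level_lin` and `db_spl` docstrings above are related by a factor-of-ten base-10 log; a pure-Python sketch of both (assuming `db_spl = 10 * log10(level_lin)`, consistent with the `-3.098...` example value for a constant 0.7 signal):

```python
import math

def level_lin(x):
    """Mean of the squared amplitudes."""
    return sum(v * v for v in x) / len(x)

def db_spl(x):
    """Ten times the base-10 log of level_lin."""
    return 10.0 * math.log10(level_lin(x))

print(level_lin([1.0] * 1024))  # 1.0
print(db_spl([0.7] * 32))       # ~-3.0980, i.e. 10 * log10(0.49)
```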
aubio.silence_detection(aubio.fvec(np.ones(32)), 0.)\n" +"0\n" +"\n" +"See Also\n" +"--------\n" +"level_detection, db_spl, level_lin\n" +""; PyObject * Py_aubio_silence_detection(PyObject *self, PyObject *args); static char Py_aubio_level_detection_doc[] = "" -"Get buffer level in dB SPL if over a given threshold, 1. otherwise.\n" +"level_detection(vec, level)\n" +"\n" +"Check if `vec` is above threshold `level`, in dB SPL.\n" +"\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector\n" +"level : float\n" +" level threshold, in dB SPL\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" `1.0` if level of `vec` in dB SPL is under `level`,\n" +" `db_spl(vec)` otherwise.\n" "\n" "Example\n" "-------\n" "\n" -">>> import numpy\n""" -">>> level_detection(0.7*numpy.ones(1024, dtype=\"float32\"), -80)\n" -"0"; +">>> aubio.level_detection(0.7*aubio.fvec(np.ones(1024)), -3.)\n" +"1.0\n" +">>> aubio.level_detection(0.7*aubio.fvec(np.ones(1024)), -4.)\n" +"-3.0980708599090576\n" +"\n" +"See Also\n" +"--------\n" +"silence_detection, db_spl, level_lin\n" +""; PyObject * Py_aubio_level_detection(PyObject *self, PyObject *args); +static char Py_aubio_shift_doc[] = "" +"shift(vec)\n" +"\n" +"Swap left and right partitions of a vector, in-place.\n" +"\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector to shift\n" +"\n" +"Returns\n" +"-------\n" +"fvec\n" +" The swapped vector.\n" +"\n" +"Notes\n" +"-----\n" +"The input vector is also modified.\n" +"\n" +"For a vector of length N, the partition is split at index N - N//2.\n" +"\n" +"Example\n" +"-------\n" +"\n" +">>> aubio.shift(aubio.fvec(np.arange(3)))\n" +"array([2., 0., 1.], dtype=" AUBIO_NPY_SMPL_STR ")\n" +"\n" +"See Also\n" +"--------\n" +"ishift\n" +""; +PyObject * Py_aubio_shift(PyObject *self, PyObject *args); + +static char Py_aubio_ishift_doc[] = "" +"ishift(vec)\n" +"\n" +"Swap right and left partitions of a vector, in-place.\n" +"\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input 
vector to shift\n" +"\n" +"Returns\n" +"-------\n" +"fvec\n" +" The swapped vector.\n" +"\n" +"Notes\n" +"-----\n" +"The input vector is also modified.\n" +"\n" +"Unlike with :py:func:`shift`, the partition is split at index N//2.\n" +"\n" +"Example\n" +"-------\n" +"\n" +">>> aubio.ishift(aubio.fvec(np.arange(3)))\n" +"array([1., 2., 0.], dtype=" AUBIO_NPY_SMPL_STR ")\n" +"\n" +"See Also\n" +"--------\n" +"shift\n" +""; +PyObject * Py_aubio_ishift(PyObject *self, PyObject *args); + +static char Py_aubio_hztomel_doc[] = "" +"hztomel(f, htk=False)\n" +"\n" +"Convert a scalar from frequency to mel scale.\n" +"\n" +"Parameters\n" +"----------\n" +"f : float\n" +" input frequency, in Hz\n" +"htk : bool\n" +" if `True`, use Htk mel scale instead of Slaney.\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" output mel\n" +"\n" +"See Also\n" +"--------\n" +"meltohz\n" +""; +PyObject * Py_aubio_hztomel(PyObject *self, PyObject *args); + +static char Py_aubio_meltohz_doc[] = "" +"meltohz(m, htk=False)\n" +"\n" +"Convert a scalar from mel scale to frequency.\n" +"\n" +"Parameters\n" +"----------\n" +"m : float\n" +" input mel\n" +"htk : bool\n" +" if `True`, use Htk mel scale instead of Slaney.\n" +"\n" +"Returns\n" +"-------\n" +"float\n" +" output frequency, in Hz\n" +"\n" +"See Also\n" +"--------\n" +"hztomel\n" +""; +PyObject * Py_aubio_meltohz(PyObject *self, PyObject *args); + +static char Py_aubio_hztomel_htk_doc[] = "" +"hztomel_htk(f)\n" +"\n" +"Same as `hztomel(f, htk=True)`\n" +"\n" +"See Also\n" +"--------\n" +"hztomel\n" +""; +PyObject * Py_aubio_hztomel_htk(PyObject *self, PyObject *args); + +static char Py_aubio_meltohz_htk_doc[] = "" +"meltohz_htk(m)\n" +"\n" +"Same as `meltohz(m, htk=True)`\n" +"\n" +"See Also\n" +"--------\n" +"meltohz\n" +""; +PyObject * Py_aubio_meltohz_htk(PyObject *self, PyObject *args); + #endif /* PY_AUBIO_MUSICUTILS_H */ diff --git a/python/ext/py-phasevoc.c b/python/ext/py-phasevoc.c index bcd3d8b..4d36fb3 100644 --- 
a/python/ext/py-phasevoc.c +++ b/python/ext/py-phasevoc.c @@ -1,6 +1,58 @@ #include "aubio-types.h" -static char Py_pvoc_doc[] = "pvoc object"; +static char Py_pvoc_doc[] = "" +"pvoc(win_s=512, hop_s=256)\n" +"\n" +"Phase vocoder.\n" +"\n" +"`pvoc` creates a callable object implementing a phase vocoder [1]_,\n" +"using the tricks detailed in [2]_.\n" +"\n" +"The call function takes one input of type `fvec` and of size\n" +"`hop_s`, and returns a `cvec` of length `win_s//2+1`.\n" +"\n" +"Parameters\n" +"----------\n" +"win_s : int\n" +" size of the analysis window, in samples.\n" +"hop_s : int\n" +" number of samples expected between each call\n" +"\n" +"Examples\n" +"--------\n" +">>> x = aubio.fvec(256)\n" +">>> pv = aubio.pvoc(512, 256)\n" +">>> pv(x)\n" +"aubio cvec of 257 elements\n" +"\n" +"Default values for hop_s and win_s are provided:\n" +"\n" +">>> pv = aubio.pvoc()\n" +">>> pv.win_s, pv.hop_s\n" +"(512, 256)\n" +"\n" +"A `cvec` can be resynthesised using `rdo()`:\n" +"\n" +">>> pv = aubio.pvoc(512, 256)\n" +">>> y = aubio.cvec(512)\n" +">>> x_reconstructed = pv.rdo(y)\n" +">>> x_reconstructed.shape\n" +"(256,)\n" +"\n" +"References\n" +"----------\n" +".. [1] James A. Moorer. The use of the phase vocoder in computer music\n" +" applications. `Journal of the Audio Engineering Society`,\n" +" 26(1/2):42–45, 1978.\n" +".. [2] Amalia de Götzen, Nicolas Bernardini, and Daniel Arfib. Traditional\n" +" (?) 
implementations of a phase vocoder: the tricks of the trade.\n" +" In `Proceedings of the International Conference on Digital Audio\n" +" Effects` (DAFx-00), pages 37–44, University of Verona, Italy, 2000.\n" +" (`online version <" +"https://www.cs.princeton.edu/courses/archive/spr09/cos325/Bernardini.pdf" +">`_).\n" +""; + typedef struct { @@ -38,10 +90,6 @@ Py_pvoc_new (PyTypeObject * type, PyObject * args, PyObject * kwds) self->win_s = Py_default_vector_length; self->hop_s = Py_default_vector_length/2; - if (self == NULL) { - return NULL; - } - if (win_s > 0) { self->win_s = win_s; } else if (win_s < 0) { @@ -66,9 +114,8 @@ Py_pvoc_init (Py_pvoc * self, PyObject * args, PyObject * kwds) { self->o = new_aubio_pvoc ( self->win_s, self->hop_s); if (self->o == NULL) { - PyErr_Format(PyExc_RuntimeError, - "failed creating pvoc with win_s=%d, hop_s=%d", - self->win_s, self->hop_s); + // PyErr_Format(PyExc_RuntimeError, ...) was set above by new_ which called + // AUBIO_ERR when failing return -1; } @@ -122,9 +169,11 @@ Py_pvoc_do(Py_pvoc * self, PyObject * args) static PyMemberDef Py_pvoc_members[] = { {"win_s", T_INT, offsetof (Py_pvoc, win_s), READONLY, - "size of the window"}, + "int: Size of phase vocoder analysis windows, in samples.\n" + ""}, {"hop_s", T_INT, offsetof (Py_pvoc, hop_s), READONLY, - "size of the hop"}, + "int: Interval between two consecutive analyses, in samples.\n" + ""}, { NULL } // sentinel }; @@ -156,9 +205,67 @@ Py_pvoc_rdo(Py_pvoc * self, PyObject * args) return self->routput; } +static PyObject * +Pyaubio_pvoc_set_window (Py_pvoc *self, PyObject *args) +{ + uint_t err = 0; + char_t *window = NULL; + + if (!PyArg_ParseTuple (args, "s", &window)) { + return NULL; + } + err = aubio_pvoc_set_window (self->o, window); + + if (err > 0) { + PyErr_SetString (PyExc_ValueError, "error running aubio_pvoc_set_window"); + return NULL; + } + Py_RETURN_NONE; +} + static PyMethodDef Py_pvoc_methods[] = { {"rdo", (PyCFunction) Py_pvoc_rdo, METH_VARARGS, - "synthesis 
of spectral grain"}, + "rdo(fftgrain)\n" + "\n" + "Read a new spectral grain and resynthesise the next `hop_s`\n" + "output samples.\n" + "\n" + "Parameters\n" + "----------\n" + "fftgrain : cvec\n" + " new input `cvec` to synthesize from, should be of size `win_s//2+1`\n" + "\n" + "Returns\n" + "-------\n" + "fvec\n" + " re-synthesised output of shape `(hop_s,)`\n" + "\n" + "Example\n" + "-------\n" + ">>> pv = aubio.pvoc(2048, 512)\n" + ">>> out = pv.rdo(aubio.cvec(2048))\n" + ">>> out.shape\n" + "(512,)\n" + ""}, + {"set_window", (PyCFunction) Pyaubio_pvoc_set_window, METH_VARARGS, + "set_window(window_type)\n" + "\n" + "Set window function\n" + "\n" + "Parameters\n" + "----------\n" + "window_type : str\n" + " the window type to use for this phase vocoder\n" + "\n" + "Raises\n" + "------\n" + "ValueError\n" + " If an unknown window type was given.\n" + "\n" + "See Also\n" + "--------\n" + "window : create a window.\n" + ""}, {NULL} }; diff --git a/python/ext/py-sink.c b/python/ext/py-sink.c index 9fada06..83c7ddd 100644 --- a/python/ext/py-sink.c +++ b/python/ext/py-sink.c @@ -12,53 +12,78 @@ typedef struct } Py_sink; static char Py_sink_doc[] = "" -" __new__(path, samplerate = 44100, channels = 1)\n" +"sink(path, samplerate=44100, channels=1)\n" "\n" -" Create a new sink, opening the given path for writing.\n" +"Write audio samples to file.\n" "\n" -" Examples\n" -" --------\n" +"Parameters\n" +"----------\n" +"path : str\n" +" Pathname of the file to be opened for writing.\n" +"samplerate : int\n" +" Sampling rate of the file, in Hz.\n" +"channels : int\n" +" Number of channels to create the file with.\n" "\n" -" Create a new sink at 44100Hz, mono:\n" +"Examples\n" +"--------\n" "\n" -" >>> sink('/tmp/t.wav')\n" +"Create a new sink at 44100Hz, mono:\n" "\n" -" Create a new sink at 8000Hz, mono:\n" +">>> snk = aubio.sink('out.wav')\n" "\n" -" >>> sink('/tmp/t.wav', samplerate = 8000)\n" +"Create a new sink at 16000Hz, 3 channels, write 100 samples into it:\n" "\n" 
-" Create a new sink at 32000Hz, stereo:\n" +">>> snk = aubio.sink('out.wav', samplerate=16000, channels=3)\n" +">>> snk(aubio.fvec(100), 100)\n" "\n" -" >>> sink('/tmp/t.wav', samplerate = 32000, channels = 2)\n" +"Open a new sink at 48000Hz, stereo, write `1234` samples into it:\n" "\n" -" Create a new sink at 32000Hz, 5 channels:\n" +">>> with aubio.sink('out.wav', samplerate=48000, channels=2) as snk:\n" +"... snk(aubio.fvec(1024), 1024)\n" +"... snk(aubio.fvec(210), 210)\n" +"...\n" "\n" -" >>> sink('/tmp/t.wav', channels = 5, samplerate = 32000)\n" -"\n" -" __call__(vec, write)\n" -" x(vec,write) <==> x.do(vec, write)\n" -"\n" -" Write vector to sink.\n" -"\n" -" See also\n" -" --------\n" -" aubio.sink.do\n" +"See also\n" +"--------\n" +"source: read audio samples from a file.\n" "\n"; static char Py_sink_do_doc[] = "" -"x.do(vec, write) <==> x(vec, write)\n" +"do(vec, write)\n" "\n" -"write monophonic vector to sink"; +"Write a single channel vector to sink.\n" +"\n" +"Parameters\n" +"----------\n" +"vec : fvec\n" +" input vector `(n,)` where `n >= 0`.\n" +"write : int\n" +" Number of samples to write.\n" +""; static char Py_sink_do_multi_doc[] = "" -"x.do_multi(mat, write)\n" +"do_multi(mat, write)\n" +"\n" +"Write a matrix containing vectors from multiple channels to sink.\n" "\n" -"write polyphonic vector to sink"; +"Parameters\n" +"----------\n" +"mat : numpy.ndarray\n" +" input matrix of shape `(channels, n)`, where `n >= 0`.\n" +"write : int\n" +" Number of frames to write.\n" +""; static char Py_sink_close_doc[] = "" -"x.close()\n" +"close()\n" "\n" -"close this sink now"; +"Close this sink now.\n" +"\n" +"By default, the sink will be closed before being deleted.\n" +"Explicitly closing a sink can be useful to control the number\n" +"of files simultaneously opened.\n" +""; static PyObject * Py_sink_new (PyTypeObject * pytype, PyObject * args, PyObject * kwds) @@ -80,27 +105,20 @@ Py_sink_new (PyTypeObject * pytype, PyObject * args, PyObject * kwds) 
return NULL; } - self->uri = "none"; + self->uri = NULL; if (uri != NULL) { - self->uri = uri; + self->uri = (char_t *)malloc(sizeof(char_t) * (strnlen(uri, PATH_MAX) + 1)); + strncpy(self->uri, uri, strnlen(uri, PATH_MAX) + 1); } self->samplerate = Py_aubio_default_samplerate; - if ((sint_t)samplerate > 0) { + if (samplerate != 0) { self->samplerate = samplerate; - } else if ((sint_t)samplerate < 0) { - PyErr_SetString (PyExc_ValueError, - "can not use negative value for samplerate"); - return NULL; } self->channels = 1; - if ((sint_t)channels > 0) { + if (channels != 0) { self->channels = channels; - } else if ((sint_t)channels < 0) { - PyErr_SetString (PyExc_ValueError, - "can not use negative or null value for channels"); - return NULL; } return (PyObject *) self; @@ -109,17 +127,20 @@ Py_sink_new (PyTypeObject * pytype, PyObject * args, PyObject * kwds) static int Py_sink_init (Py_sink * self, PyObject * args, PyObject * kwds) { - if (self->channels == 1) { - self->o = new_aubio_sink ( self->uri, self->samplerate ); - } else { - self->o = new_aubio_sink ( self->uri, 0 ); - aubio_sink_preset_channels ( self->o, self->channels ); - aubio_sink_preset_samplerate ( self->o, self->samplerate ); - } + self->o = new_aubio_sink ( self->uri, 0 ); if (self->o == NULL) { - PyErr_SetString (PyExc_RuntimeError, "error creating sink with this uri"); + // error string was set in new_aubio_sink + return -1; + } + if (aubio_sink_preset_channels(self->o, self->channels) != 0) { + // error string was set in aubio_sink_preset_channels + return -1; + } + if (aubio_sink_preset_samplerate(self->o, self->samplerate) != 0) { + // error string was set in aubio_sink_preset_samplerate return -1; } + self->samplerate = aubio_sink_get_samplerate ( self->o ); self->channels = aubio_sink_get_channels ( self->o ); @@ -129,8 +150,13 @@ Py_sink_init (Py_sink * self, PyObject * args, PyObject * kwds) static void Py_sink_del (Py_sink *self, PyObject *unused) { - del_aubio_sink(self->o); - 
free(self->mwrite_data.data); + if (self->o) { + del_aubio_sink(self->o); + free(self->mwrite_data.data); + } + if (self->uri) { + free(self->uri); + } Py_TYPE(self)->tp_free((PyObject *) self); } @@ -189,11 +215,11 @@ Py_sink_do_multi(Py_sink * self, PyObject * args) static PyMemberDef Py_sink_members[] = { {"uri", T_STRING, offsetof (Py_sink, uri), READONLY, - "path at which the sink was created"}, + "str (read-only): Path at which the sink was created."}, {"samplerate", T_INT, offsetof (Py_sink, samplerate), READONLY, - "samplerate at which the sink was created"}, + "int (read-only): Samplerate at which the sink was created."}, {"channels", T_INT, offsetof (Py_sink, channels), READONLY, - "number of channels with which the sink was created"}, + "int (read-only): Number of channels with which the sink was created."}, { NULL } // sentinel }; @@ -204,10 +230,25 @@ Pyaubio_sink_close (Py_sink *self, PyObject *unused) Py_RETURN_NONE; } +static char Pyaubio_sink_enter_doc[] = ""; +static PyObject* Pyaubio_sink_enter(Py_sink *self, PyObject *unused) { + Py_INCREF(self); + return (PyObject*)self; +} + +static char Pyaubio_sink_exit_doc[] = ""; +static PyObject* Pyaubio_sink_exit(Py_sink *self, PyObject *unused) { + return Pyaubio_sink_close(self, unused); +} + static PyMethodDef Py_sink_methods[] = { {"do", (PyCFunction) Py_sink_do, METH_VARARGS, Py_sink_do_doc}, {"do_multi", (PyCFunction) Py_sink_do_multi, METH_VARARGS, Py_sink_do_multi_doc}, {"close", (PyCFunction) Pyaubio_sink_close, METH_NOARGS, Py_sink_close_doc}, + {"__enter__", (PyCFunction)Pyaubio_sink_enter, METH_NOARGS, + Pyaubio_sink_enter_doc}, + {"__exit__", (PyCFunction)Pyaubio_sink_exit, METH_VARARGS, + Pyaubio_sink_exit_doc}, {NULL} /* sentinel */ }; diff --git a/python/ext/py-source.c b/python/ext/py-source.c index 8280868..7e9d48b 100644 --- a/python/ext/py-source.c +++ b/python/ext/py-source.c @@ -16,68 +16,316 @@ typedef struct } Py_source; static char Py_source_doc[] = "" -" __new__(path, samplerate 
= 0, hop_size = 512, channels = 1)\n" +"source(path, samplerate=0, hop_size=512, channels=0)\n" "\n" -" Create a new source, opening the given path for reading.\n" +"Read audio samples from a media file.\n" "\n" -" Examples\n" -" --------\n" +"`source` opens the file specified in `path` and creates a callable\n" +"returning `hop_size` new audio samples at each invocation.\n" "\n" -" Create a new source, using the original samplerate, with hop_size = 512:\n" +"If `samplerate=0` (default), the original sampling rate of `path`\n" +"will be used. Otherwise, the output audio samples will be\n" +"resampled at the desired sampling rate.\n" "\n" -" >>> source('/tmp/t.wav')\n" +"If `channels=0` (default), the original number of channels\n" +"in `path` will be used. Otherwise, the output audio samples\n" +"will be down-mixed or up-mixed to the desired number of\n" +"channels.\n" "\n" -" Create a new source, resampling the original to 8000Hz:\n" +"If `path` is a URL, a remote connection will be attempted to\n" +"open the resource and stream data from it.\n" "\n" -" >>> source('/tmp/t.wav', samplerate = 8000)\n" +"The parameter `hop_size` determines how many samples should be\n" +"read at each consecutive call.\n" "\n" -" Create a new source, resampling it at 32000Hz, hop_size = 32:\n" +"Parameters\n" +"----------\n" +"path : str\n" +" pathname (or URL) of the file to be opened for reading\n" +"samplerate : int, optional\n" +" sampling rate of the file\n" +"hop_size : int, optional\n" +" number of samples to be read per iteration\n" +"channels : int, optional\n" +" number of channels of the file\n" "\n" -" >>> source('/tmp/t.wav', samplerate = 32000, hop_size = 32)\n" +"Examples\n" +"--------\n" +"By default, when only `path` is given, the file will be opened\n" +"with its original sampling rate and channels:\n" "\n" -" Create a new source, using its original samplerate:\n" +">>> src = aubio.source('stereo.wav')\n" +">>> src.uri, src.samplerate, src.channels, src.duration\n" 
+"('stereo.wav', 48000, 2, 86833)\n" "\n" -" >>> source('/tmp/t.wav', samplerate = 0)\n" +"A typical loop to read all samples from a local file could\n" +"look like this:\n" "\n" -" __call__()\n" -" vec, read = x() <==> vec, read = x.do()\n" +">>> src = aubio.source('stereo.wav')\n" +">>> total_read = 0\n" +">>> while True:\n" +"... samples, read = src()\n" +"... # do something with samples\n" +"... total_read += read\n" +"... if read < src.hop_size:\n" +"... break\n" +"...\n" "\n" -" Read vector from source.\n" +"In a more Pythonic way, it can also look like this:\n" "\n" -" See also\n" -" --------\n" -" aubio.source.do\n" -"\n"; +">>> total_read = 0\n" +">>> with aubio.source('stereo.wav') as src:\n" +"... for frames in src:\n" +"... total_read += frames.shape[-1]\n" +"...\n" +"\n" +".. rubric:: Basic interface\n" +"\n" +"`source` is a **callable**; its :meth:`__call__` method\n" +"returns a tuple containing:\n" +"\n" +"- a vector of shape `(hop_size,)`, filled with the `read` next\n" +" samples available, zero-padded if `read < hop_size`\n" +"- `read`, an integer indicating the number of samples read\n" +"\n" +"To read the first `hop_size` samples from the source, simply call\n" +"the instance itself, with no argument:\n" +"\n" +">>> src = aubio.source('song.ogg')\n" +">>> samples, read = src()\n" +">>> samples.shape, read, src.hop_size\n" +"((512,), 512, 512)\n" +"\n" +"The first call returned the slice of samples `[0 : hop_size]`.\n" +"The next call will return samples `[hop_size: 2*hop_size]`.\n" +"\n" +"After several invocations of :meth:`__call__`, when reaching the end\n" +"of the opened stream, `read` might become less than `hop_size`:\n" +"\n" +">>> samples, read = src()\n" +">>> samples.shape, read\n" +"((512,), 354)\n" +"\n" +"The end of the vector `samples` is filled with zeros.\n" +"\n" +"After the end of the stream, `read` will be `0` since no more\n" +"samples are available:\n" +"\n" +">>> samples, read = src()\n" +">>> samples.shape, read\n" 
+"((512,), 0)\n" +"\n" +"**Note**: when the source has more than one channel, they\n" +"are down-mixed to mono when invoking :meth:`__call__`.\n" +"To read from each individual channel, see :meth:`__next__`.\n" +"\n" +".. rubric:: ``for`` statements\n" +"\n" +"The `source` objects are **iterables**. This allows using them\n" +"directly in a ``for`` loop, which calls :meth:`__next__` until\n" +"the end of the stream is reached:\n" +"\n" +">>> src = aubio.source('stereo.wav')\n" +">>> for frames in src:\n" +"... print(frames.shape)\n" +"...\n" +"(2, 512)\n" +"(2, 512)\n" +"(2, 230)\n" +"\n" +"**Note**: When `next(self)` is called on a source with multiple\n" +"channels, an array of shape `(channels, read)` is returned,\n" +"unlike with :meth:`__call__` which always returns the down-mixed\n" +"channels.\n" +"\n" +"If the file is opened with a single channel, `next(self)` returns\n" +"an array of shape `(read,)`:\n" +"\n" +">>> src = aubio.source('stereo.wav', channels=1)\n" +">>> next(src).shape\n" +"(512,)\n" +"\n" +".. rubric:: ``with`` statements\n" +"\n" +"The `source` objects are **context managers**, which allows using\n" +"them in ``with`` statements:\n" +"\n" +">>> with aubio.source('audiotrack.wav') as source:\n" +"... n_frames = 0\n" +"... for samples in source:\n" +"... n_frames += samples.shape[-1]\n" +"... print('read', n_frames, 'samples in', samples.shape[0], 'channels',\n" +"... 'from file \"%%s\"' %% source.uri)\n" +"...\n" +"read 239334 samples in 2 channels from file \"audiotrack.wav\"\n" +"\n" +"The file will be closed before exiting the statement.\n" +"\n" +"See also the methods implementing the context manager,\n" +":meth:`__enter__` and :meth:`__exit__`.\n" +"\n" +".. rubric:: Seeking and closing\n" +"\n" +"At any time, :meth:`seek` can be used to move to any position in\n" +"the file. 
For instance, to rewind to the start of the stream:\n" +"\n" +">>> src.seek(0)\n" +"\n" +"The opened file will be automatically closed when the object falls\n" +"out of scope and is scheduled for garbage collection.\n" +"\n" +"In some cases, it is useful to manually :meth:`close` a given source,\n" +"for instance to limit the number of simultaneously opened files:\n" +"\n" +">>> src.close()\n" +"\n" +".. rubric:: Input formats\n" +"\n" +"Depending on how aubio was compiled, :class:`source` may or may not\n" +"open certain **file formats**. Below are some examples that assume\n" +"support for compressed files and remote URLs was compiled in:\n" +"\n" +"- open a local file using its original sampling rate and channels,\n" +" and with the default hop size:\n" +"\n" +">>> s = aubio.source('sample.wav')\n" +">>> s.uri, s.samplerate, s.channels, s.hop_size\n" +"('sample.wav', 44100, 2, 512)\n" +"\n" +"- open a local compressed audio file, resampling to 32000Hz if needed:\n" +"\n" +">>> s = aubio.source('song.mp3', samplerate=32000)\n" +">>> s.uri, s.samplerate, s.channels, s.hop_size\n" +"('song.mp3', 32000, 2, 512)\n" +"\n" +"- open a local video file, down-mixing and resampling it to 16kHz:\n" +"\n" +">>> s = aubio.source('movie.mp4', samplerate=16000, channels=1)\n" +">>> s.uri, s.samplerate, s.channels, s.hop_size\n" +"('movie.mp4', 16000, 1, 512)\n" +"\n" +"- open a remote resource, with hop_size = 1024:\n" +"\n" +">>> s = aubio.source('https://aubio.org/drum.ogg', hop_size=1024)\n" +">>> s.uri, s.samplerate, s.channels, s.hop_size\n" +"('https://aubio.org/drum.ogg', 48000, 2, 1024)\n" +"\n" +"See Also\n" +"--------\n" +"sink: write audio samples to a file.\n" +""; static char Py_source_get_samplerate_doc[] = "" -"x.get_samplerate() -> source samplerate\n" +"get_samplerate()\n" +"\n" +"Get sampling rate of source.\n" "\n" -"Get samplerate of source."; +"Returns\n" +"-------\n" +"int\n" +" Sampling rate, in Hz.\n" +""; static char Py_source_get_channels_doc[] = "" 
-"x.get_channels() -> number of channels\n" +"get_channels()\n" +"\n" +"Get number of channels in source.\n" "\n" -"Get number of channels in source."; +"Returns\n" +"-------\n" +"int\n" +" Number of channels.\n" +""; static char Py_source_do_doc[] = "" -"vec, read = x.do() <==> vec, read = x()\n" +"do()\n" +"\n" +"Read vector of audio samples.\n" +"\n" +"If the audio stream in the source has more than one channel,\n" +"the channels will be down-mixed.\n" "\n" -"Read monophonic vector from source."; +"Returns\n" +"-------\n" +"samples : numpy.ndarray\n" +" `fvec` of size `hop_size` containing the new samples.\n" +"read : int\n" +" Number of samples read from the source, equal to `hop_size`\n" +" before the end-of-file is reached, less when it is reached,\n" +" and `0` after.\n" +"\n" +"See Also\n" +"--------\n" +"do_multi\n" +"\n" +"Examples\n" +"--------\n" +">>> src = aubio.source('sample.wav', hop_size=1024)\n" +">>> src.do()\n" +"(array([-0.00123001, -0.00036685, 0.00097106, ..., -0.2031033 ,\n" +" -0.2025854 , -0.20221856], dtype=" AUBIO_NPY_SMPL_STR "), 1024)\n" +""; static char Py_source_do_multi_doc[] = "" -"mat, read = x.do_multi()\n" +"do_multi()\n" +"\n" +"Read multiple channels of audio samples.\n" +"\n" +"If the source was opened with the same number of channels\n" +"found in the stream, each channel will be read individually.\n" +"\n" +"If the source was opened with fewer channels than the number\n" +"of channels in the stream, only the first channels will be read.\n" "\n" -"Read polyphonic vector from source."; +"If the source was opened with more channels than the number\n" +"of channels in the original stream, the first channels will\n" +"be duplicated on the additional output channels.\n" +"\n" +"Returns\n" +"-------\n" +"samples : numpy.ndarray\n" +" NumPy array of shape `(channels, hop_size)` containing the new\n" +" audio samples.\n" +"read : int\n" +" Number of samples read from the source, equal to `hop_size`\n" +" before the 
end-of-file is reached, less when it is reached,\n" +" and `0` after.\n" +"\n" +"See Also\n" +"--------\n" +"do\n" +"\n" +"Examples\n" +"--------\n" +">>> src = aubio.source('sample.wav')\n" +">>> src.do_multi()\n" +"(array([[ 0.00668335, 0.0067749 , 0.00714111, ..., -0.05737305,\n" +" -0.05856323, -0.06018066],\n" +" [-0.00842285, -0.0072937 , -0.00576782, ..., -0.09405518,\n" +" -0.09558105, -0.09725952]], dtype=" AUBIO_NPY_SMPL_STR "), 512)\n" +""; static char Py_source_close_doc[] = "" -"x.close()\n" +"close()\n" "\n" -"Close this source now."; +"Close this source now.\n" +"\n" +".. note:: Closing a source twice will **not** raise an exception.\n" +""; static char Py_source_seek_doc[] = "" -"x.seek(position)\n" +"seek(position)\n" +"\n" +"Seek to position in file.\n" "\n" -"Seek to resampled frame position."; +"If the source was not opened with its original sampling rate,\n" +"`position` corresponds to the position in the re-sampled file.\n" +"\n" +"Parameters\n" +"----------\n" +"position : int\n" +" position to seek to, in samples\n" +""; static PyObject * Py_source_new (PyTypeObject * pytype, PyObject * args, PyObject * kwds) @@ -100,9 +348,10 @@ Py_source_new (PyTypeObject * pytype, PyObject * args, PyObject * kwds) return NULL; } - self->uri = "none"; + self->uri = NULL; if (uri != NULL) { - self->uri = uri; + self->uri = (char_t *)malloc(sizeof(char_t) * (strnlen(uri, PATH_MAX) + 1)); + strncpy(self->uri, uri, strnlen(uri, PATH_MAX) + 1); } self->samplerate = 0; @@ -140,8 +389,8 @@ Py_source_init (Py_source * self, PyObject * args, PyObject * kwds) { self->o = new_aubio_source ( self->uri, self->samplerate, self->hop_size ); if (self->o == NULL) { - PyErr_Format (PyExc_RuntimeError, "error creating source with \"%s\"", - self->uri); + // PyErr_Format(PyExc_RuntimeError, ...) 
was set above by new_ which called + // AUBIO_ERR when failing return -1; } self->samplerate = aubio_source_get_samplerate ( self->o ); @@ -163,6 +412,9 @@ Py_source_del (Py_source *self, PyObject *unused) del_aubio_source(self->o); free(self->c_mread_to.data); } + if (self->uri) { + free(self->uri); + } Py_XDECREF(self->read_to); Py_XDECREF(self->mread_to); Py_TYPE(self)->tp_free((PyObject *) self); @@ -184,6 +436,10 @@ Py_source_do(Py_source * self, PyObject * args) /* compute _do function */ aubio_source_do (self->o, &(self->c_read_to), &read); + if (PyErr_Occurred() != NULL) { + return NULL; + } + outputs = PyTuple_New(2); PyTuple_SetItem( outputs, 0, self->read_to ); PyTuple_SetItem( outputs, 1, (PyObject *)PyLong_FromLong(read)); @@ -205,6 +461,10 @@ Py_source_do_multi(Py_source * self, PyObject * args) /* compute _do function */ aubio_source_do_multi (self->o, &(self->c_mread_to), &read); + if (PyErr_Occurred() != NULL) { + return NULL; + } + outputs = PyTuple_New(2); PyTuple_SetItem( outputs, 0, self->mread_to); PyTuple_SetItem( outputs, 1, (PyObject *)PyLong_FromLong(read)); @@ -213,15 +473,29 @@ Py_source_do_multi(Py_source * self, PyObject * args) static PyMemberDef Py_source_members[] = { {"uri", T_STRING, offsetof (Py_source, uri), READONLY, - "path at which the source was created"}, + "str (read-only): pathname or URL"}, {"samplerate", T_INT, offsetof (Py_source, samplerate), READONLY, - "samplerate at which the source is viewed"}, + "int (read-only): sampling rate"}, {"channels", T_INT, offsetof (Py_source, channels), READONLY, - "number of channels found in the source"}, + "int (read-only): number of channels"}, {"hop_size", T_INT, offsetof (Py_source, hop_size), READONLY, - "number of consecutive frames that will be read at each do or do_multi call"}, + "int (read-only): number of samples read per iteration"}, {"duration", T_INT, offsetof (Py_source, duration), READONLY, - "total number of frames in the source (estimated)"}, + "int (read-only): 
total number of frames in the source\n" + "\n" + "Can be estimated, for instance if the opened stream is\n" + "a compressed media or a remote resource.\n" + "\n" + "Example\n" + "-------\n" + ">>> n = 0\n" + ">>> src = aubio.source('track1.mp3')\n" + ">>> for samples in src:\n" + "... n += samples.shape[-1]\n" + "...\n" + ">>> n, src.duration\n" + "(9638784, 9616561)\n" + ""}, { NULL } // sentinel }; @@ -242,7 +516,7 @@ Pyaubio_source_get_channels (Py_source *self, PyObject *unused) static PyObject * Pyaubio_source_close (Py_source *self, PyObject *unused) { - aubio_source_close (self->o); + if (aubio_source_close(self->o) != 0) return NULL; Py_RETURN_NONE; } @@ -272,6 +546,69 @@ Pyaubio_source_seek (Py_source *self, PyObject *args) Py_RETURN_NONE; } +static char Pyaubio_source_enter_doc[] = ""; +static PyObject* Pyaubio_source_enter(Py_source *self, PyObject *unused) { + Py_INCREF(self); + return (PyObject*)self; +} + +static char Pyaubio_source_exit_doc[] = ""; +static PyObject* Pyaubio_source_exit(Py_source *self, PyObject *unused) { + return Pyaubio_source_close(self, unused); +} + +static PyObject* Pyaubio_source_iter(PyObject *self) { + Py_INCREF(self); + return (PyObject*)self; +} + +static PyObject* Pyaubio_source_iter_next(Py_source *self) { + PyObject *done, *size; + if (self->channels == 1) { + done = Py_source_do(self, NULL); + } else { + done = Py_source_do_multi(self, NULL); + } + if (!PyTuple_Check(done)) { + PyErr_Format(PyExc_ValueError, + "error when reading source: not opened?"); + return NULL; + } + size = PyTuple_GetItem(done, 1); + if (size != NULL && PyLong_Check(size)) { + if (PyLong_AsLong(size) == (long)self->hop_size) { + PyObject *vec = PyTuple_GetItem(done, 0); + return vec; + } else if (PyLong_AsLong(size) > 0) { + // short read, return a shorter array + PyObject *vec = PyTuple_GetItem(done, 0); + // take a copy to prevent resizing internal arrays + PyArrayObject *shortread = (PyArrayObject*)PyArray_FROM_OTF(vec, + NPY_NOTYPE, 
NPY_ARRAY_ENSURECOPY); + PyArray_Dims newdims; + PyObject *reshaped; + newdims.len = PyArray_NDIM(shortread); + newdims.ptr = PyArray_DIMS(shortread); + // mono or multiple channels? + if (newdims.len == 1) { + newdims.ptr[0] = PyLong_AsLong(size); + } else { + newdims.ptr[1] = PyLong_AsLong(size); + } + reshaped = PyArray_Newshape(shortread, &newdims, NPY_CORDER); + Py_DECREF(shortread); + Py_DECREF(vec); + return reshaped; + } else { + PyErr_SetNone(PyExc_StopIteration); + return NULL; + } + } else { + PyErr_SetNone(PyExc_StopIteration); + return NULL; + } +} + static PyMethodDef Py_source_methods[] = { {"get_samplerate", (PyCFunction) Pyaubio_source_get_samplerate, METH_NOARGS, Py_source_get_samplerate_doc}, @@ -285,6 +622,10 @@ static PyMethodDef Py_source_methods[] = { METH_NOARGS, Py_source_close_doc}, {"seek", (PyCFunction) Pyaubio_source_seek, METH_VARARGS, Py_source_seek_doc}, + {"__enter__", (PyCFunction)Pyaubio_source_enter, METH_NOARGS, + Pyaubio_source_enter_doc}, + {"__exit__", (PyCFunction)Pyaubio_source_exit, METH_VARARGS, + Pyaubio_source_exit_doc}, {NULL} /* sentinel */ }; @@ -314,8 +655,8 @@ PyTypeObject Py_sourceType = { 0, 0, 0, - 0, - 0, + Pyaubio_source_iter, + (unaryfunc)Pyaubio_source_iter_next, Py_source_methods, Py_source_members, 0, diff --git a/python/ext/ufuncs.c b/python/ext/ufuncs.c index 8a4e917..d373d72 100644 --- a/python/ext/ufuncs.c +++ b/python/ext/ufuncs.c @@ -58,7 +58,22 @@ static char Py_aubio_unary_types[] = { //NPY_OBJECT, NPY_OBJECT, }; -static char Py_unwrap2pi_doc[] = "map angle to unit circle [-pi, pi["; +// Note: these docstrings should *not* include the function signatures + +static char Py_unwrap2pi_doc[] = "" +"\n" +"Map angle to unit circle :math:`[-\\pi, \\pi[`.\n" +"\n" +"Parameters\n" +"----------\n" +"x : numpy.ndarray\n" +" input array\n" +"\n" +"Returns\n" +"-------\n" +"numpy.ndarray\n" +" values clamped to the unit circle :math:`[-\\pi, \\pi[`\n" +""; static void* Py_unwrap2pi_data[] = { (void 
*)aubio_unwrap2pi, @@ -67,14 +82,40 @@ static void* Py_unwrap2pi_data[] = { //(void *)unwrap2pio, }; -static char Py_freqtomidi_doc[] = "convert frequency to midi"; +static char Py_freqtomidi_doc[] = "" +"\n" +"Convert frequency `[0; 23000[` to midi `[0; 128[`.\n" +"\n" +"Parameters\n" +"----------\n" +"x : numpy.ndarray\n" +" Array of frequencies, in Hz.\n" +"\n" +"Returns\n" +"-------\n" +"numpy.ndarray\n" +" Converted frequencies, in midi note.\n" +""; static void* Py_freqtomidi_data[] = { (void *)aubio_freqtomidi, (void *)aubio_freqtomidi, }; -static char Py_miditofreq_doc[] = "convert midi to frequency"; +static char Py_miditofreq_doc[] = "" +"\n" +"Convert midi `[0; 128[` to frequency `[0; 23000[`.\n" +"\n" +"Parameters\n" +"----------\n" +"x : numpy.ndarray\n" +" Array of frequencies, in midi note.\n" +"\n" +"Returns\n" +"-------\n" +"numpy.ndarray\n" +" Converted frequencies, in Hz.\n" +""; static void* Py_miditofreq_data[] = { (void *)aubio_miditofreq, diff --git a/python/lib/__init__.py b/python/lib/__init__.py deleted file mode 100644 index e69de29..0000000 --- a/python/lib/__init__.py +++ /dev/null diff --git a/python/lib/aubio/__init__.py b/python/lib/aubio/__init__.py index 316f961..ebe0fa8 100644 --- a/python/lib/aubio/__init__.py +++ b/python/lib/aubio/__init__.py @@ -1,18 +1,85 @@ #! /usr/bin/env python +# -*- coding: utf8 -*- + +""" +aubio +===== + +Provides a number of classes and functions for music and audio signal +analysis. + +How to use the documentation +---------------------------- + +Documentation of the python module is available as docstrings provided +within the code, and a reference guide available online from `the +aubio homepage <https://aubio.org/documentation>`_. 
+ +The docstrings examples are written assuming `aubio` and `numpy` have been +imported with: + +>>> import aubio +>>> import numpy as np +""" import numpy -from ._aubio import * +from ._aubio import __version__ as version from ._aubio import float_type +from ._aubio import * from .midiconv import * from .slicing import * + class fvec(numpy.ndarray): - """a numpy vector holding audio samples""" + """fvec(input_arg=1024) + A vector holding float samples. + + If `input_arg` is an `int`, a 1-dimensional vector of length `input_arg` + will be created and filled with zeros. Otherwise, if `input_arg` is an + `array_like` object, it will be converted to a 1-dimensional vector of + type :data:`float_type`. + + Parameters + ---------- + input_arg : `int` or `array_like` + Can be a positive integer, or any object that can be converted to + a numpy array with :func:`numpy.array`. + + Examples + -------- + >>> aubio.fvec(10) + array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.], dtype=float32) + >>> aubio.fvec([0,1,2]) + array([0., 1., 2.], dtype=float32) + >>> a = np.arange(10); type(a), type(aubio.fvec(a)) + (<class 'numpy.ndarray'>, <class 'numpy.ndarray'>) + >>> a.dtype, aubio.fvec(a).dtype + (dtype('int64'), dtype('float32')) - def __new__(cls, input_arg=1024, **kwargs): + Notes + ----- + + In the Python world, `fvec` is simply a subclass of + :class:`numpy.ndarray`. In practice, any 1-dimensional `numpy.ndarray` of + `dtype` :data:`float_type` may be passed to methods accepting + `fvec` as parameter. For instance, `sink()` or `pvoc()`. 
+ + See Also + -------- + cvec : a container holding spectral data + numpy.ndarray : parent class of :class:`fvec` + numpy.zeros : create a numpy array filled with zeros + numpy.array : create a numpy array from an existing object + """ + def __new__(cls, input_arg=1024): if isinstance(input_arg, int): if input_arg == 0: raise ValueError("vector length of 1 or more expected") - return numpy.zeros(input_arg, dtype=float_type, **kwargs) + return numpy.zeros(input_arg, dtype=float_type, order='C') else: - return numpy.array(input_arg, dtype=float_type, **kwargs) + np_input = numpy.array(input_arg, dtype=float_type, order='C') + if len(np_input.shape) != 1: + raise ValueError("input_arg should have shape (n,)") + if np_input.shape[0] == 0: + raise ValueError("vector length of 1 or more expected") + return np_input diff --git a/python/lib/aubio/cmd.py b/python/lib/aubio/cmd.py new file mode 100644 index 0000000..05780ea --- /dev/null +++ b/python/lib/aubio/cmd.py @@ -0,0 +1,623 @@ +#! /usr/bin/env python +# -*- coding: utf-8 -*- + +"""aubio command line tool + +This file was written by Paul Brossier <piem@aubio.org> and is released under +the GNU/GPL v3. + +Note: this script is mostly about parsing command line arguments. 
For more +readable code examples, check out the `python/demos` folder.""" + +import sys +import argparse +import warnings +import aubio + +def aubio_parser(): + epilog = 'use "%(prog)s <command> --help" for more info about each command' + parser = argparse.ArgumentParser(epilog=epilog) + parser.add_argument('-V', '--version', help="show version", + action="store_true", dest="show_version") + + subparsers = parser.add_subparsers(title='commands', dest='command', + parser_class= AubioArgumentParser, + metavar="") + + parser_add_subcommand_help(subparsers) + + parser_add_subcommand_onset(subparsers) + parser_add_subcommand_pitch(subparsers) + parser_add_subcommand_beat(subparsers) + parser_add_subcommand_tempo(subparsers) + parser_add_subcommand_notes(subparsers) + parser_add_subcommand_mfcc(subparsers) + parser_add_subcommand_melbands(subparsers) + parser_add_subcommand_quiet(subparsers) + parser_add_subcommand_cut(subparsers) + + return parser + +def parser_add_subcommand_help(subparsers): + # global help subcommand + subparsers.add_parser('help', + help='show help message', + formatter_class = argparse.ArgumentDefaultsHelpFormatter) + +def parser_add_subcommand_onset(subparsers): + # onset subcommand + subparser = subparsers.add_parser('onset', + help='estimate time of onsets (beginning of sound event)', + formatter_class = argparse.ArgumentDefaultsHelpFormatter) + subparser.add_input() + subparser.add_buf_hop_size() + helpstr = "onset novelty function" + helpstr += " <default|energy|hfc|complex|phase|specdiff|kl|mkl|specflux>" + subparser.add_method(helpstr=helpstr) + subparser.add_threshold() + subparser.add_silence() + subparser.add_minioi() + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_onset) + +def parser_add_subcommand_pitch(subparsers): + # pitch subcommand + subparser = subparsers.add_parser('pitch', + help='estimate fundamental frequency (monophonic)') + subparser.add_input() + 
subparser.add_buf_hop_size(buf_size=2048) + helpstr = "pitch detection method <default|yinfft|yin|mcomb|fcomb|schmitt>" + subparser.add_method(helpstr=helpstr) + subparser.add_threshold() + subparser.add_pitch_unit() + subparser.add_silence() + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_pitch) + +def parser_add_subcommand_beat(subparsers): + # beat subcommand + subparser = subparsers.add_parser('beat', + help='estimate location of beats') + subparser.add_input() + subparser.add_buf_hop_size(buf_size=1024, hop_size=512) + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_beat) + +def parser_add_subcommand_tempo(subparsers): + # tempo subcommand + subparser = subparsers.add_parser('tempo', + help='estimate overall tempo in bpm') + subparser.add_input() + subparser.add_buf_hop_size(buf_size=1024, hop_size=512) + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_tempo) + +def parser_add_subcommand_notes(subparsers): + # notes subcommand + subparser = subparsers.add_parser('notes', + help='estimate midi-like notes (monophonic)') + subparser.add_input() + subparser.add_buf_hop_size() + subparser.add_silence() + subparser.add_release_drop() + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_notes) + +def parser_add_subcommand_mfcc(subparsers): + # mfcc subcommand + subparser = subparsers.add_parser('mfcc', + help='extract Mel-Frequency Cepstrum Coefficients') + subparser.add_input() + subparser.add_buf_hop_size() + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_mfcc) + +def parser_add_subcommand_melbands(subparsers): + # melbands subcommand + subparser = subparsers.add_parser('melbands', + help='extract energies in Mel-frequency bands') + subparser.add_input() + subparser.add_buf_hop_size() + 
subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_melbands) + +def parser_add_subcommand_quiet(subparsers): + # quiet subcommand + subparser = subparsers.add_parser('quiet', + help='extract timestamps of quiet and loud regions') + subparser.add_input() + subparser.add_hop_size() + subparser.add_silence() + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_quiet) + +def parser_add_subcommand_cut(subparsers): + # cut subcommand + subparser = subparsers.add_parser('cut', + help='slice at timestamps') + subparser.add_input() + helpstr = "onset novelty function" + helpstr += " <default|energy|hfc|complex|phase|specdiff|kl|mkl|specflux>" + subparser.add_method(helpstr=helpstr) + subparser.add_buf_hop_size() + subparser.add_silence() + subparser.add_threshold(default=0.3) + subparser.add_minioi() + subparser.add_slicer_options() + subparser.add_time_format() + subparser.add_verbose_help() + subparser.set_defaults(process=process_cut) + +class AubioArgumentParser(argparse.ArgumentParser): + + def add_input(self): + self.add_argument("source_uri", default=None, nargs='?', + help="input sound file to analyse", metavar = "<source_uri>") + self.add_argument("-i", "--input", dest = "source_uri2", + help="input sound file to analyse", metavar = "<source_uri>") + self.add_argument("-r", "--samplerate", + metavar = "<freq>", type=int, + action="store", dest="samplerate", default=0, + help="samplerate at which the file should be represented") + + def add_verbose_help(self): + self.add_argument("-v", "--verbose", + action="count", dest="verbose", default=1, + help="make lots of noise [default]") + self.add_argument("-q", "--quiet", + action="store_const", dest="verbose", const=0, + help="be quiet") + + def add_buf_hop_size(self, buf_size=512, hop_size=256): + self.add_buf_size(buf_size=buf_size) + self.add_hop_size(hop_size=hop_size) + + def add_buf_size(self, buf_size=512): + 
self.add_argument("-B", "--bufsize", + action="store", dest="buf_size", default=buf_size, + metavar = "<size>", type=int, + help="buffer size [default=%d]" % buf_size) + + def add_hop_size(self, hop_size=256): + self.add_argument("-H", "--hopsize", + metavar = "<size>", type=int, + action="store", dest="hop_size", default=hop_size, + help="overlap size [default=%d]" % hop_size) + + def add_method(self, method='default', helpstr='method'): + self.add_argument("-m", "--method", + metavar = "<method>", type=str, + action="store", dest="method", default=method, + help="%s [default=%s]" % (helpstr, method)) + + def add_threshold(self, default=None): + self.add_argument("-t", "--threshold", + metavar = "<threshold>", type=float, + action="store", dest="threshold", default=default, + help="threshold [default=%s]" % default) + + def add_silence(self): + self.add_argument("-s", "--silence", + metavar = "<value>", type=float, + action="store", dest="silence", default=-70, + help="silence threshold") + + def add_release_drop(self): + self.add_argument("-d", "--release-drop", + metavar = "<value>", type=float, + action="store", dest="release_drop", default=10, + help="release drop threshold") + + def add_minioi(self, default="12ms"): + self.add_argument("-M", "--minioi", + metavar = "<value>", type=str, + action="store", dest="minioi", default=default, + help="minimum Inter-Onset Interval [default=%s]" % default) + + def add_pitch_unit(self, default="Hz"): + help_str = "frequency unit, should be one of Hz, midi, bin, cent" + help_str += " [default=%s]" % default + self.add_argument("-u", "--pitch-unit", + metavar = "<value>", type=str, + action="store", dest="pitch_unit", default=default, + help=help_str) + + def add_time_format(self): + helpstr = "select time values output format (samples, ms, seconds)" + helpstr += " [default=seconds]" + self.add_argument("-T", "--time-format", + metavar='format', + dest="time_format", + default=None, + help=helpstr) + + def 
add_slicer_options(self): + self.add_argument("-o", "--output", type = str, + metavar = "<outputdir>", + action="store", dest="output_directory", default=None, + help="specify path where slices of the original file should" + " be created") + self.add_argument("--cut-until-nsamples", type = int, + metavar = "<samples>", + action = "store", dest = "cut_until_nsamples", default = None, + help="how many extra samples should be added at the end of" + " each slice") + self.add_argument("--cut-every-nslices", type = int, + metavar = "<slices>", + action = "store", dest = "cut_every_nslices", default = None, + help="how many slices should be grouped together at each cut") + self.add_argument("--cut-until-nslices", type = int, + metavar = "<slices>", + action = "store", dest = "cut_until_nslices", default = None, + help="how many extra slices should be added at the end of" + " each slice") + self.add_argument("--create-first", + action = "store_true", dest = "create_first", default = False, + help="always include first slice") + +# some utilities + +def samples2seconds(n_frames, samplerate): + return "%f\t" % (n_frames / float(samplerate)) + +def samples2milliseconds(n_frames, samplerate): + return "%f\t" % (1000.
* n_frames / float(samplerate)) + +def samples2samples(n_frames, _samplerate): + return "%d\t" % n_frames + +def timefunc(mode): + if mode is None or mode == 'seconds' or mode == 's': + return samples2seconds + elif mode == 'ms' or mode == 'milliseconds': + return samples2milliseconds + elif mode == 'samples': + return samples2samples + else: + raise ValueError("invalid time format '%s'" % mode) + +# definition of processing classes + +class default_process(object): + def __init__(self, args): + if 'time_format' in args: + self.time2string = timefunc(args.time_format) + if args.verbose > 2 and hasattr(self, 'options'): + name = type(self).__name__.split('_')[1] + optstr = ' '.join(['running', name, 'with options', + repr(self.options), '\n']) + sys.stderr.write(optstr) + def flush(self, frames_read, samplerate): + # optionally called at the end of process + pass + + def parse_options(self, args, valid_opts): + # get any valid options found in a dictionary of arguments + options = {k: v for k, v in vars(args).items() if k in valid_opts} + self.options = options + + def remap_pvoc_options(self, options): + # FIXME: we need to remap buf_size to win_s, hop_size to hop_s + # adjust python/ext/py-phasevoc.c to understand buf_size/hop_size + if 'buf_size' in options: + options['win_s'] = options['buf_size'] + del options['buf_size'] + if 'hop_size' in options: + options['hop_s'] = options['hop_size'] + del options['hop_size'] + self.options = options + +class process_onset(default_process): + valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] + def __init__(self, args): + self.parse_options(args, self.valid_opts) + self.onset = aubio.onset(**self.options) + if args.threshold is not None: + self.onset.set_threshold(args.threshold) + if args.minioi: + if args.minioi.endswith('ms'): + self.onset.set_minioi_ms(float(args.minioi[:-2])) + elif args.minioi.endswith('s'): + self.onset.set_minioi_s(float(args.minioi[:-1])) + else: +
self.onset.set_minioi(int(args.minioi)) + if args.silence: + self.onset.set_silence(args.silence) + super(process_onset, self).__init__(args) + def __call__(self, block): + return self.onset(block) + def repr_res(self, res, _frames_read, samplerate): + if res[0] != 0: + outstr = self.time2string(self.onset.get_last(), samplerate) + sys.stdout.write(outstr + '\n') + +class process_pitch(default_process): + valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] + def __init__(self, args): + self.parse_options(args, self.valid_opts) + self.pitch = aubio.pitch(**self.options) + if args.pitch_unit is not None: + self.pitch.set_unit(args.pitch_unit) + if args.threshold is not None: + self.pitch.set_tolerance(args.threshold) + if args.silence is not None: + self.pitch.set_silence(args.silence) + super(process_pitch, self).__init__(args) + def __call__(self, block): + return self.pitch(block) + def repr_res(self, res, frames_read, samplerate): + fmt_out = self.time2string(frames_read, samplerate) + sys.stdout.write(fmt_out + "%.6f\n" % res[0]) + +class process_beat(default_process): + valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] + def __init__(self, args): + self.parse_options(args, self.valid_opts) + self.tempo = aubio.tempo(**self.options) + super(process_beat, self).__init__(args) + def __call__(self, block): + return self.tempo(block) + def repr_res(self, res, _frames_read, samplerate): + if res[0] != 0: + outstr = self.time2string(self.tempo.get_last(), samplerate) + sys.stdout.write(outstr + '\n') + +class process_tempo(process_beat): + def __init__(self, args): + super(process_tempo, self).__init__(args) + self.beat_locations = [] + def repr_res(self, res, _frames_read, samplerate): + if res[0] != 0: + self.beat_locations.append(self.tempo.get_last_s()) + def flush(self, frames_read, samplerate): + import numpy as np + if len(self.beat_locations) < 2: + outstr = "unknown bpm" + else: + bpms = 60. 
/ np.diff(self.beat_locations) + median_bpm = np.mean(bpms) + if len(self.beat_locations) < 10: + outstr = "%.2f bpm (uncertain)" % median_bpm + else: + outstr = "%.2f bpm" % median_bpm + sys.stdout.write(outstr + '\n') + +class process_notes(default_process): + valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] + def __init__(self, args): + self.parse_options(args, self.valid_opts) + self.notes = aubio.notes(**self.options) + if args.silence is not None: + self.notes.set_silence(args.silence) + if args.release_drop is not None: + self.notes.set_release_drop(args.release_drop) + super(process_notes, self).__init__(args) + def __call__(self, block): + return self.notes(block) + def repr_res(self, res, frames_read, samplerate): + if res[2] != 0: # note off + fmt_out = self.time2string(frames_read, samplerate) + sys.stdout.write(fmt_out + '\n') + if res[0] != 0: # note on + lastmidi = res[0] + fmt_out = "%f\t" % lastmidi + fmt_out += self.time2string(frames_read, samplerate) + sys.stdout.write(fmt_out) # + '\t') + def flush(self, frames_read, samplerate): + eof = self.time2string(frames_read, samplerate) + sys.stdout.write(eof + '\n') + +class process_mfcc(default_process): + def __init__(self, args): + valid_opts1 = ['hop_size', 'buf_size'] + self.parse_options(args, valid_opts1) + self.remap_pvoc_options(self.options) + self.pv = aubio.pvoc(**self.options) + + valid_opts2 = ['buf_size', 'n_filters', 'n_coeffs', 'samplerate'] + self.parse_options(args, valid_opts2) + self.mfcc = aubio.mfcc(**self.options) + + # remember all options + self.parse_options(args, list(set(valid_opts1 + valid_opts2))) + + super(process_mfcc, self).__init__(args) + + def __call__(self, block): + fftgrain = self.pv(block) + return self.mfcc(fftgrain) + def repr_res(self, res, frames_read, samplerate): + fmt_out = self.time2string(frames_read, samplerate) + fmt_out += ' '.join(["% 9.7f" % f for f in res.tolist()]) + sys.stdout.write(fmt_out + '\n') + +class 
process_melbands(default_process): + def __init__(self, args): + self.args = args + valid_opts = ['hop_size', 'buf_size'] + self.parse_options(args, valid_opts) + self.remap_pvoc_options(self.options) + self.pv = aubio.pvoc(**self.options) + + valid_opts = ['buf_size', 'n_filters'] + self.parse_options(args, valid_opts) + self.remap_pvoc_options(self.options) + self.filterbank = aubio.filterbank(**self.options) + self.filterbank.set_mel_coeffs_slaney(args.samplerate) + + super(process_melbands, self).__init__(args) + def __call__(self, block): + fftgrain = self.pv(block) + return self.filterbank(fftgrain) + def repr_res(self, res, frames_read, samplerate): + fmt_out = self.time2string(frames_read, samplerate) + fmt_out += ' '.join(["% 9.7f" % f for f in res.tolist()]) + sys.stdout.write(fmt_out + '\n') + +class process_quiet(default_process): + def __init__(self, args): + self.args = args + valid_opts = ['hop_size', 'silence'] + self.parse_options(args, valid_opts) + self.wassilence = 1 + + if args.silence is not None: + self.silence = args.silence + super(process_quiet, self).__init__(args) + + def __call__(self, block): + if aubio.silence_detection(block, self.silence) == 1: + if self.wassilence != 1: + self.wassilence = 1 + return 2 # newly found silence + return 1 # silence again + else: + if self.wassilence != 0: + self.wassilence = 0 + return -1 # newly found noise + return 0 # noise again + + def repr_res(self, res, frames_read, samplerate): + fmt_out = None + if res == -1: + fmt_out = "NOISY: " + if res == 2: + fmt_out = "QUIET: " + if fmt_out is not None: + fmt_out += self.time2string(frames_read, samplerate) + sys.stdout.write(fmt_out + '\n') + +class process_cut(process_onset): + def __init__(self, args): + super(process_cut, self).__init__(args) + self.slices = [] + self.options = args + + def __call__(self, block): + ret = super(process_cut, self).__call__(block) + if ret: + self.slices.append(self.onset.get_last()) + return ret + + def flush(self, 
frames_read, samplerate): + _cut_slice(self.options, self.slices) + duration = float(frames_read) / float(samplerate) + base_info = '%(source_file)s' % \ + {'source_file': self.options.source_uri} + base_info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % \ + {'duration': duration, 'samplerate': samplerate} + info = "created %d slices from " % len(self.slices) + info += base_info + sys.stderr.write(info) + +def _cut_slice(options, timestamps): + # cutting pass + nstamps = len(timestamps) + if nstamps > 0: + # generate output files + timestamps_end = None + if options.cut_every_nslices: + timestamps = timestamps[::options.cut_every_nslices] + nstamps = len(timestamps) + if options.cut_until_nslices and options.cut_until_nsamples: + msg = "using cut_until_nslices, but cut_until_nsamples is set" + warnings.warn(msg) + if options.cut_until_nsamples: + lag = options.cut_until_nsamples + timestamps_end = [t + lag for t in timestamps[1:]] + timestamps_end += [1e120] + if options.cut_until_nslices: + slice_lag = options.cut_until_nslices + timestamps_end = [t for t in timestamps[1 + slice_lag:]] + timestamps_end += [1e120] * (options.cut_until_nslices + 1) + aubio.slice_source_at_stamps(options.source_uri, + timestamps, timestamps_end = timestamps_end, + output_dir = options.output_directory, + samplerate = options.samplerate, + create_first = options.create_first) + +def main(): + parser = aubio_parser() + if sys.version_info[0] != 3: + # on py2, create a dummy ArgumentParser to workaround the + # optional subcommand issue. 
See https://bugs.python.org/issue9253 + # This ensures that: + # - version string is shown when only '-V' is passed + # - help is printed if '-V' is passed with any other argument + # - any other argument gets forwarded to the real parser + parser_root = argparse.ArgumentParser(add_help=False) + parser_root.add_argument('-V', '--version', help="show version", + action="store_true", dest="show_version") + args, extras = parser_root.parse_known_args() + if not args.show_version: # no -V, forward to parser + args = parser.parse_args(extras, namespace=args) + elif len(extras) != 0: # -V with other arguments, print help + parser.print_help() + sys.exit(1) + else: # in py3, we can simply use parser directly + args = parser.parse_args() + if 'show_version' in args and args.show_version: + sys.stdout.write('aubio version ' + aubio.version + '\n') + sys.exit(0) + elif 'verbose' in args and args.verbose > 3: + sys.stderr.write('aubio version ' + aubio.version + '\n') + if 'command' not in args or args.command is None \ + or args.command in ['help']: + # no command given, print help and return 1 + parser.print_help() + if args.command and args.command in ['help']: + sys.exit(0) + else: + sys.exit(1) + elif not args.source_uri and not args.source_uri2: + sys.stderr.write("Error: a source is required\n") + parser.print_help() + sys.exit(1) + elif args.source_uri2 is not None: + args.source_uri = args.source_uri2 + try: + # open source_uri + with aubio.source(args.source_uri, hop_size=args.hop_size, + samplerate=args.samplerate) as a_source: + # always update args.samplerate to native samplerate, in case + # source was opened with args.samplerate=0 + args.samplerate = a_source.samplerate + # create the processor for this subcommand + processor = args.process(args) + frames_read = 0 + while True: + # read new block from source + block, read = a_source() + # execute processor on this block + res = processor(block) + # print results for this block + if args.verbose > 0: +
processor.repr_res(res, frames_read, a_source.samplerate) + # increment total number of frames read + frames_read += read + # exit loop at end of file + if read < a_source.hop_size: + break + # flush the processor if needed + processor.flush(frames_read, a_source.samplerate) + if args.verbose > 1: + fmt_string = "read {:.2f}s" + fmt_string += " ({:d} samples in {:d} blocks of {:d})" + fmt_string += " from {:s} at {:d}Hz\n" + sys.stderr.write(fmt_string.format( + frames_read / float(a_source.samplerate), + frames_read, + frames_read // a_source.hop_size + 1, + a_source.hop_size, + a_source.uri, + a_source.samplerate)) + except KeyboardInterrupt: + sys.exit(1) diff --git a/python/lib/aubio/cut.py b/python/lib/aubio/cut.py new file mode 100644 index 0000000..a31e38d --- /dev/null +++ b/python/lib/aubio/cut.py @@ -0,0 +1,163 @@ +#! /usr/bin/env python + +""" this file was written by Paul Brossier + it is released under the GNU/GPL license. +""" + +import sys +from aubio.cmd import AubioArgumentParser, _cut_slice + +def aubio_cut_parser(): + parser = AubioArgumentParser() + parser.add_input() + parser.add_argument("-O", "--onset-method", + action="store", dest="onset_method", default='default', + metavar = "<onset_method>", + help="onset detection method [default=default] \ + complexdomain|hfc|phase|specdiff|energy|kl|mkl") + # cutting methods + parser.add_argument("-b", "--beat", + action="store_true", dest="beat", default=False, + help="slice at beat locations") + """ + parser.add_argument("-S", "--silencecut", + action="store_true", dest="silencecut", default=False, + help="use silence locations") + parser.add_argument("-s", "--silence", + metavar = "<value>", + action="store", dest="silence", default=-70, + help="silence threshold [default=-70]") + """ + # algorithm parameters + parser.add_buf_hop_size() + parser.add_argument("-t", "--threshold", "--onset-threshold", + metavar = "<threshold>", type=float, + action="store", dest="threshold", default=0.3, + 
help="onset peak picking threshold [default=0.3]") + parser.add_argument("-c", "--cut", + action="store_true", dest="cut", default=False, + help="cut input sound file at detected labels") + parser.add_minioi() + + """ + parser.add_argument("-D", "--delay", + action = "store", dest = "delay", type = float, + metavar = "<seconds>", default=0, + help="number of seconds to take back [default=system]\ + default system delay is 3*hopsize/samplerate") + parser.add_argument("-C", "--dcthreshold", + metavar = "<value>", + action="store", dest="dcthreshold", default=1., + help="onset peak picking DC component [default=1.]") + parser.add_argument("-L", "--localmin", + action="store_true", dest="localmin", default=False, + help="use local minima after peak detection") + parser.add_argument("-d", "--derivate", + action="store_true", dest="derivate", default=False, + help="derivate onset detection function") + parser.add_argument("-z", "--zerocross", + metavar = "<value>", + action="store", dest="zerothres", default=0.008, + help="zero-crossing threshold for slicing [default=0.00008]") + # plotting functions + parser.add_argument("-p", "--plot", + action="store_true", dest="plot", default=False, + help="draw plot") + parser.add_argument("-x", "--xsize", + metavar = "<size>", + action="store", dest="xsize", default=1., + type=float, help="define xsize for plot") + parser.add_argument("-y", "--ysize", + metavar = "<size>", + action="store", dest="ysize", default=1., + type=float, help="define ysize for plot") + parser.add_argument("-f", "--function", + action="store_true", dest="func", default=False, + help="print detection function") + parser.add_argument("-n", "--no-onsets", + action="store_true", dest="nplot", default=False, + help="do not plot detected onsets") + parser.add_argument("-O", "--outplot", + metavar = "<output_image>", + action="store", dest="outplot", default=None, + help="save plot to output.{ps,png}") + parser.add_argument("-F", "--spectrogram", + 
action="store_true", dest="spectro", default=False, + help="add spectrogram to the plot") + """ + parser.add_slicer_options() + parser.add_verbose_help() + return parser + + +def _cut_analyze(options): + hopsize = options.hop_size + bufsize = options.buf_size + samplerate = options.samplerate + source_uri = options.source_uri + + # analyze pass + from aubio import onset, tempo, source + + s = source(source_uri, samplerate, hopsize) + if samplerate == 0: + samplerate = s.samplerate + options.samplerate = samplerate + + if options.beat: + o = tempo(options.onset_method, bufsize, hopsize, + samplerate=samplerate) + else: + o = onset(options.onset_method, bufsize, hopsize, + samplerate=samplerate) + if options.minioi: + if options.minioi.endswith('ms'): + o.set_minioi_ms(int(options.minioi[:-2])) + elif options.minioi.endswith('s'): + o.set_minioi_s(int(options.minioi[:-1])) + else: + o.set_minioi(int(options.minioi)) + o.set_threshold(options.threshold) + + timestamps = [] + total_frames = 0 + while True: + samples, read = s() + if o(samples): + timestamps.append(o.get_last()) + if options.verbose: + print("%.4f" % o.get_last_s()) + total_frames += read + if read < hopsize: + break + del s + return timestamps, total_frames + +def main(): + parser = aubio_cut_parser() + options = parser.parse_args() + if not options.source_uri and not options.source_uri2: + sys.stderr.write("Error: no file name given\n") + parser.print_help() + sys.exit(1) + elif options.source_uri2 is not None: + options.source_uri = options.source_uri2 + + # analysis + timestamps, total_frames = _cut_analyze(options) + + # print some info + duration = float(total_frames) / float(options.samplerate) + base_info = '%(source_uri)s' % {'source_uri': options.source_uri} + base_info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % \ + {'duration': duration, 'samplerate': options.samplerate} + + info = "found %d timestamps in " % len(timestamps) + info += base_info + sys.stderr.write(info) + + if 
options.cut: + _cut_slice(options, timestamps) + info = "created %d slices from " % len(timestamps) + info += base_info + sys.stderr.write(info) diff --git a/python/lib/aubio/midiconv.py b/python/lib/aubio/midiconv.py index 80f28d0..99b3c0b 100644 --- a/python/lib/aubio/midiconv.py +++ b/python/lib/aubio/midiconv.py @@ -1,9 +1,11 @@ # -*- coding: utf-8 -*- """ utilities to convert midi note number to and from note names """ -__all__ = ['note2midi', 'midi2note', 'freq2note'] - import sys +from ._aubio import freqtomidi, miditofreq + +__all__ = ['note2midi', 'midi2note', 'freq2note', 'note2freq'] + py3 = sys.version_info[0] == 3 if py3: str_instances = str @@ -12,18 +14,65 @@ else: str_instances = (str, unicode) int_instances = (int, long) + def note2midi(note): - " convert note name to midi note number, e.g. [C-1, G9] -> [0, 127] " - _valid_notenames = {'C': 0, 'D': 2, 'E': 4, 'F': 5, 'G': 7, 'A': 9, 'B': 11} - _valid_modifiers = {None: 0, u'♮': 0, '#': +1, u'♯': +1, u'𝄪': +2, - 'b': -1, u'♭': -1, u'𝄫': -2} + """Convert note name to midi note number. + + Input string `note` should be composed of one note root + and one octave, with optionally one modifier in between. + + List of valid components: + + - note roots: `C`, `D`, `E`, `F`, `G`, `A`, `B`, + - modifiers: `b`, `#`, as well as unicode characters + `𝄫`, `♭`, `♮`, `♯` and `𝄪`, + - octave numbers: `-1` -> `9`. + + Parameters + ---------- + note : str + note name + + Returns + ------- + int + corresponding midi note number + + Examples + -------- + >>> aubio.note2midi('C#4') + 61 + >>> aubio.note2midi('B♭5') + 82 + + Raises + ------ + TypeError + If `note` was not a string. + ValueError + If an error was found while converting `note`.
+ + See Also + -------- + midi2note, freqtomidi, miditofreq + """ + _valid_notenames = {'C': 0, 'D': 2, 'E': 4, 'F': 5, 'G': 7, + 'A': 9, 'B': 11} + _valid_modifiers = { + u'𝄫': -2, # double flat + u'♭': -1, 'b': -1, '\u266d': -1, # simple flat + u'♮': 0, '\u266e': 0, None: 0, # natural + '#': +1, u'♯': +1, '\u266f': +1, # sharp + u'𝄪': +2, # double sharp + } _valid_octaves = range(-1, 10) if not isinstance(note, str_instances): - raise TypeError("a string is required, got %s (%s)" % (note, str(type(note)))) + msg = "a string is required, got {:s} ({:s})" + raise TypeError(msg.format(str(type(note)), repr(note))) if len(note) not in range(2, 5): - raise ValueError("string of 2 to 4 characters expected, got %d (%s)" \ - % (len(note), note)) - notename, modifier, octave = [None]*3 + msg = "string of 2 to 4 characters expected, got {:d} ({:s})" + raise ValueError(msg.format(len(note), note)) + notename, modifier, octave = [None] * 3 if len(note) == 4: notename, modifier, octave_sign, octave = note @@ -46,21 +95,97 @@ def note2midi(note): if octave not in _valid_octaves: raise ValueError("%s is not a valid octave" % octave) - midi = 12 + octave * 12 + _valid_notenames[notename] + _valid_modifiers[modifier] + midi = (octave + 1) * 12 + _valid_notenames[notename] \ + + _valid_modifiers[modifier] if midi > 127: raise ValueError("%s is outside of the range C-1 to G9" % note) return midi + + def midi2note(midi): - " convert midi note number to note name, e.g. [0, 127] -> [C-1, G9] " + """Convert midi note number to note name. + + Parameters + ---------- + midi : int [0, 128] + input midi note number + + Returns + ------- + str + note name + + Examples + -------- + >>> aubio.midi2note(70) + 'A#4' + >>> aubio.midi2note(59) + 'B3' + + Raises + ------ + TypeError + If `midi` was not an integer. + ValueError + If `midi` is out of the range `[0, 128]`.
+
+    See Also
+    --------
+    note2midi, miditofreq, freqtomidi
+    """
     if not isinstance(midi, int_instances):
         raise TypeError("an integer is required, got %s" % midi)
     if midi not in range(0, 128):
-        raise ValueError("an integer between 0 and 127 is excepted, got %d" % midi)
-    _valid_notenames = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
+        msg = "an integer between 0 and 127 is excepted, got {:d}"
+        raise ValueError(msg.format(midi))
+    _valid_notenames = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#',
+                        'A', 'A#', 'B']
     return _valid_notenames[midi % 12] + str(int(midi / 12) - 1)
 
+
 def freq2note(freq):
-    " convert frequency in Hz to nearest note name, e.g. [0, 22050.] -> [C-1, G9] "
-    from aubio import freqtomidi
-    return midi2note(int(freqtomidi(freq)))
+    """Convert frequency in Hz to nearest note name.
+
+    Parameters
+    ----------
+    freq : float [0, 23000[
+        input frequency, in Hz
+
+    Returns
+    -------
+    str
+        name of the nearest note
+
+    Example
+    -------
+    >>> aubio.freq2note(440)
+    'A4'
+    >>> aubio.freq2note(220.1)
+    'A3'
+    """
+    nearest_note = int(freqtomidi(freq) + .5)
+    return midi2note(nearest_note)
+
+
+def note2freq(note):
+    """Convert note name to corresponding frequency, in Hz.
+
+    Parameters
+    ----------
+    note : str
+        input note name
+
+    Returns
+    -------
+    freq : float [0, 23000[
+        frequency, in Hz
+
+    Example
+    -------
+    >>> aubio.note2freq('A4')
+    440
+    >>> aubio.note2freq('A3')
+    220.1
+    """
+    midi = note2midi(note)
+    return miditofreq(midi)
diff --git a/python/lib/aubio/slicing.py b/python/lib/aubio/slicing.py
index fa9d2e3..4d9964c 100644
--- a/python/lib/aubio/slicing.py
+++ b/python/lib/aubio/slicing.py
@@ -5,26 +5,91 @@ from aubio import source, sink
 
 _max_timestamp = 1e120
 
+
 def slice_source_at_stamps(source_file, timestamps, timestamps_end=None,
-                           output_dir=None, samplerate=0, hopsize=256):
-    """ slice a sound file at given timestamps """
+                           output_dir=None, samplerate=0, hopsize=256,
+                           create_first=False):
+    """Slice a sound file at given timestamps.
+
+    This function reads `source_file` and creates slices, new smaller
+    files each starting at `t` in `timestamps`, a list of integer
+    corresponding to time locations in `source_file`, in samples.
+
+    If `timestamps_end` is unspecified, the slices will end at
+    `timestamps_end[n] = timestamps[n+1]-1`, or the end of file.
+    Otherwise, `timestamps_end` should be a list with the same length
+    as `timestamps` containing the locations of the end of each slice.
+
+    If `output_dir` is unspecified, the new slices will be written in
+    the current directory. If `output_dir` is a string, new slices
+    will be written in `output_dir`, after creating the directory if
+    required.
+
+    The default `samplerate` is 0, meaning the original sampling rate
+    of `source_file` will be used. When using a sampling rate
+    different to the one of the original files, `timestamps` and
+    `timestamps_end` should be expressed in the re-sampled signal.
+
+    The `hopsize` parameter simply tells :class:`source` to use this
+    hopsize and does not change the output slices.
+
+    If `create_first` is True and `timestamps` does not start with `0`, the
+    first slice from `0` to `timestamps[0] - 1` will be automatically added.
+
+    Parameters
+    ----------
+    source_file : str
+        path of the resource to slice
+    timestamps : :obj:`list` of :obj:`int`
+        time stamps at which to slice, in samples
+    timestamps_end : :obj:`list` of :obj:`int` (optional)
+        time stamps at which to end the slices
+    output_dir : str (optional)
+        output directory to write the slices to
+    samplerate : int (optional)
+        samplerate to read the file at
+    hopsize : int (optional)
+        number of samples read from source per iteration
+    create_first : bool (optional)
+        always create the slice at the start of the file
+
+    Examples
+    --------
+    Create two slices: the first slice starts at the beginning of the
+    input file `loop.wav` and lasts exactly one second, starting at
+    sample `0` and ending at sample `44099`; the second slice starts
+    at sample `44100` and lasts until the end of the input file:
+
+    >>> aubio.slice_source_at_stamps('loop.wav', [0, 44100])
+
+    Create one slice, from 1 second to 2 seconds:
+
+    >>> aubio.slice_source_at_stamps('loop.wav', [44100], [44100 * 2 - 1])
+
+    Notes
+    -----
+    Slices may be overlapping. If `timestamps_end` is `1` element
+    shorter than `timestamps`, the last slice will end at the end of
+    the file.
+ """ - if timestamps is None or len(timestamps) == 0: + if not timestamps: raise ValueError("no timestamps given") - if timestamps[0] != 0: + if timestamps[0] != 0 and create_first: timestamps = [0] + timestamps if timestamps_end is not None: timestamps_end = [timestamps[1] - 1] + timestamps_end if timestamps_end is not None: - if len(timestamps_end) != len(timestamps): + if len(timestamps_end) == len(timestamps) - 1: + timestamps_end = timestamps_end + [_max_timestamp] + elif len(timestamps_end) != len(timestamps): raise ValueError("len(timestamps_end) != len(timestamps)") else: timestamps_end = [t - 1 for t in timestamps[1:]] + [_max_timestamp] regions = list(zip(timestamps, timestamps_end)) - #print regions source_base_name, _ = os.path.splitext(os.path.basename(source_file)) if output_dir is not None: @@ -32,8 +97,8 @@ def slice_source_at_stamps(source_file, timestamps, timestamps_end=None, os.makedirs(output_dir) source_base_name = os.path.join(output_dir, source_base_name) - def new_sink_name(source_base_name, timestamp, samplerate): - """ create a sink based on a timestamp in samples, converted in seconds """ + def _new_sink_name(source_base_name, timestamp, samplerate): + # create name based on a timestamp in samples, converted in seconds timestamp_seconds = timestamp / float(samplerate) return source_base_name + "_%011.6f" % timestamp_seconds + '.wav' @@ -48,16 +113,17 @@ def slice_source_at_stamps(source_file, timestamps, timestamps_end=None, # get hopsize new samples from source vec, read = _source.do_multi() # if the total number of frames read will exceed the next region start - if len(regions) and total_frames + read >= regions[0][0]: - #print "getting", regions[0], "at", total_frames + while regions and total_frames + read >= regions[0][0]: # get next region start_stamp, end_stamp = regions.pop(0) # create a name for the sink - new_sink_path = new_sink_name(source_base_name, start_stamp, samplerate) + new_sink_path = _new_sink_name(source_base_name, 
start_stamp, + samplerate) # create its sink _sink = sink(new_sink_path, samplerate, _source.channels) # create a dictionary containing all this - new_slice = {'start_stamp': start_stamp, 'end_stamp': end_stamp, 'sink': _sink} + new_slice = {'start_stamp': start_stamp, 'end_stamp': end_stamp, + 'sink': _sink} # append the dictionary to the current list of slices slices.append(new_slice) @@ -69,18 +135,19 @@ def slice_source_at_stamps(source_file, timestamps, timestamps_end=None, start = max(start_stamp - total_frames, 0) # number of samples yet to written be until end of region remaining = end_stamp - total_frames + 1 - #print current_slice, remaining, start # not enough frames remaining, time to split if remaining < read: if remaining > start: # write remaining samples from current region _sink.do_multi(vec[:, start:remaining], remaining - start) - #print "closing region", "remaining", remaining # close this file _sink.close() elif read > start: # write all the samples _sink.do_multi(vec[:, start:read], read - start) total_frames += read + # remove old slices + slices = list(filter(lambda s: s['end_stamp'] > total_frames, + slices)) if read < hopsize: break diff --git a/python/lib/gen_code.py b/python/lib/gen_code.py index 1f9b9dc..b48f9a0 100644 --- a/python/lib/gen_code.py +++ b/python/lib/gen_code.py @@ -2,6 +2,7 @@ aubiodefvalue = { # we have some clean up to do 'buf_size': 'Py_default_vector_length', 'win_s': 'Py_default_vector_length', + 'size': 'Py_default_vector_length', # and here too 'hop_size': 'Py_default_vector_length / 2', 'hop_s': 'Py_default_vector_length / 2', @@ -81,7 +82,8 @@ objoutsize = { 'specdesc': '1', 'tempo': '1', 'filterbank': 'self->n_filters', - 'tss': 'self->hop_size', + 'tss': 'self->buf_size', + 'dct': 'self->size', } objinputsize = { @@ -93,6 +95,7 @@ objinputsize = { 'specdesc': 'self->buf_size / 2 + 1', 'tempo': 'self->hop_size', 'wavetable': 'self->hop_size', + 'tss': 'self->buf_size / 2 + 1', } def get_name(proto): @@ -175,6 
+178,10 @@ class MappedObject(object): self.do_inputs = [get_params_types_names(self.do_proto)[1]] self.do_outputs = get_params_types_names(self.do_proto)[2:] struct_output_str = ["PyObject *{0[name]}; {1} c_{0[name]}".format(i, i['type'][:-1]) for i in self.do_outputs] + if len(self.prototypes['rdo']): + rdo_outputs = get_params_types_names(prototypes['rdo'][0])[2:] + struct_output_str += ["PyObject *{0[name]}; {1} c_{0[name]}".format(i, i['type'][:-1]) for i in rdo_outputs] + self.outputs += rdo_outputs self.struct_outputs = ";\n ".join(struct_output_str) #print ("input_params: ", map(split_type, get_input_params(self.do_proto))) @@ -182,17 +189,26 @@ class MappedObject(object): def gen_code(self): out = "" - out += self.gen_struct() - out += self.gen_doc() - out += self.gen_new() - out += self.gen_init() - out += self.gen_del() - out += self.gen_do() - out += self.gen_memberdef() - out += self.gen_set() - out += self.gen_get() - out += self.gen_methodef() - out += self.gen_typeobject() + try: + out += self.gen_struct() + out += self.gen_doc() + out += self.gen_new() + out += self.gen_init() + out += self.gen_del() + out += self.gen_do() + if len(self.prototypes['rdo']): + self.do_proto = self.prototypes['rdo'][0] + self.do_inputs = [get_params_types_names(self.do_proto)[1]] + self.do_outputs = get_params_types_names(self.do_proto)[2:] + out += self.gen_do(method='rdo') + out += self.gen_memberdef() + out += self.gen_set() + out += self.gen_get() + out += self.gen_methodef() + out += self.gen_typeobject() + except Exception as e: + print ("Failed generating code for", self.shortname) + raise return out def gen_struct(self): @@ -215,11 +231,21 @@ typedef struct{{ return out.format(do_inputs_list = do_inputs_list, **self.__dict__) def gen_doc(self): + sig = [] + for p in self.input_params: + name = p['name'] + defval = aubiodefvalue[name].replace('"','\\\"') + sig.append("{name}={defval}".format(defval=defval, name=name)) out = """ -// TODO: add documentation 
-static char Py_{shortname}_doc[] = \"undefined\"; +#ifndef PYAUBIO_{shortname}_doc +#define PYAUBIO_{shortname}_doc "{shortname}({sig})" +#endif /* PYAUBIO_{shortname}_doc */ + +static char Py_{shortname}_doc[] = "" +PYAUBIO_{shortname}_doc +""; """ - return out.format(**self.__dict__) + return out.format(sig=', '.join(sig), **self.__dict__) def gen_new(self): out = """ @@ -305,7 +331,7 @@ Py_{shortname}_init (Py_{shortname} * self, PyObject * args, PyObject * kwds) out += """ // return -1 and set error string on failure if (self->o == NULL) {{ - PyErr_Format (PyExc_Exception, "failed creating {shortname}"); + PyErr_Format (PyExc_RuntimeError, "failed creating {shortname}"); return -1; }} """.format(paramchars = paramchars, paramvals = paramvals, **self.__dict__) @@ -347,32 +373,36 @@ Py_{shortname}_del (Py_{shortname} * self, PyObject * unused) for input_param in self.do_inputs: if input_param['type'] == 'fmat_t *': out += """ - free(self->{0[name]}.data);""".format(input_param) + free(self->{0[name]}.data);""".format(input_param) for o in self.outputs: name = o['name'] del_out = delfromtype_fn[o['type']] out += """ - {del_out}(self->{name});""".format(del_out = del_out, name = name) + if (self->{name}) {{ + {del_out}(self->{name}); + }}""".format(del_out = del_out, name = name) del_fn = get_name(self.del_proto) out += """ - if (self->o) {{ - {del_fn}(self->o); - }} - Py_TYPE(self)->tp_free((PyObject *) self); + if (self->o) {{ + {del_fn}(self->o); + }} + Py_TYPE(self)->tp_free((PyObject *) self); }} """.format(del_fn = del_fn) return out - def gen_do(self): + def gen_do(self, method = 'do'): out = """ // do {shortname} static PyObject* -Py_{shortname}_do (Py_{shortname} * self, PyObject * args) -{{""".format(**self.__dict__) +Pyaubio_{shortname}_{method} (Py_{shortname} * self, PyObject * args) +{{""".format(method = method, **self.__dict__) input_params = self.do_inputs output_params = self.do_outputs #print input_params #print output_params + out += """ + 
PyObject *outputs;""" for input_param in input_params: out += """ PyObject *py_{0};""".format(input_param['name']) @@ -415,12 +445,25 @@ Py_{shortname}_do (Py_{shortname} * self, PyObject * args) out += """ {do_fn}(self->o, {inputs}, {c_outputs}); +""".format( + do_fn = do_fn, + inputs = inputs, c_outputs = c_outputs, + ) + if len(self.do_outputs) > 1: + out += """ + outputs = PyTuple_New({:d});""".format(len(self.do_outputs)) + for i, p in enumerate(self.do_outputs): + out += """ + PyTuple_SetItem( outputs, {i}, self->{p[name]});""".format(i = i, p = p) + else: + out += """ + outputs = self->{p[name]};""".format(p = self.do_outputs[0]) + out += """ - return {outputs}; + return outputs; }} """.format( - do_fn = do_fn, - inputs = inputs, outputs = outputs, c_outputs = c_outputs, + outputs = outputs, ) return out @@ -429,30 +472,51 @@ Py_{shortname}_do (Py_{shortname} * self, PyObject * args) // {shortname} setters """.format(**self.__dict__) for set_param in self.prototypes['set']: - params = get_params_types_names(set_param)[1] - paramtype = params['type'] + params = get_params_types_names(set_param)[1:] + param = self.shortname.split('_set_')[-1] + paramdecls = "".join([""" + {0} {1};""".format(p['type'], p['name']) for p in params]) method_name = get_name(set_param) param = method_name.split('aubio_'+self.shortname+'_set_')[-1] - pyparamtype = pyargparse_chars[paramtype] + refs = ", ".join(["&%s" % p['name'] for p in params]) + paramlist = ", ".join(["%s" % p['name'] for p in params]) + if len(params): + paramlist = "," + paramlist + pyparamtypes = ''.join([pyargparse_chars[p['type']] for p in params]) out += """ static PyObject * Pyaubio_{shortname}_set_{param} (Py_{shortname} *self, PyObject *args) {{ uint_t err = 0; - {paramtype} {param}; + {paramdecls} +""".format(param = param, paramdecls = paramdecls, **self.__dict__) + + if len(refs) and len(pyparamtypes): + out += """ - if (!PyArg_ParseTuple (args, "{pyparamtype}", &{param})) {{ + if (!PyArg_ParseTuple 
(args, "{pyparamtypes}", {refs})) {{ return NULL; }} - err = aubio_{shortname}_set_{param} (self->o, {param}); +""".format(pyparamtypes = pyparamtypes, refs = refs) + + out += """ + err = aubio_{shortname}_set_{param} (self->o {paramlist}); if (err > 0) {{ - PyErr_SetString (PyExc_ValueError, "error running aubio_{shortname}_set_{param}"); + if (PyErr_Occurred() == NULL) {{ + PyErr_SetString (PyExc_ValueError, "error running aubio_{shortname}_set_{param}"); + }} else {{ + // change the RuntimeError into ValueError + PyObject *type, *value, *traceback; + PyErr_Fetch(&type, &value, &traceback); + PyErr_Restore(PyExc_ValueError, value, traceback); + }} return NULL; }} Py_RETURN_NONE; }} -""".format(param = param, paramtype = paramtype, pyparamtype = pyparamtype, **self.__dict__) +""".format(param = param, refs = refs, paramdecls = paramdecls, + pyparamtypes = pyparamtypes, paramlist = paramlist, **self.__dict__) return out def gen_get(self): @@ -493,6 +557,12 @@ static PyMethodDef Py_{shortname}_methods[] = {{""".format(**self.__dict__) out += """ {{"{shortname}", (PyCFunction) Py{name}, METH_NOARGS, ""}},""".format(name = name, shortname = shortname) + for m in self.prototypes['rdo']: + name = get_name(m) + shortname = name.replace('aubio_%s_' % self.shortname, '') + out += """ + {{"{shortname}", (PyCFunction) Py{name}, + METH_VARARGS, ""}},""".format(name = name, shortname = shortname) out += """ {NULL} /* sentinel */ }; @@ -518,7 +588,7 @@ PyTypeObject Py_{shortname}Type = {{ 0, 0, 0, - (ternaryfunc)Py_{shortname}_do, + (ternaryfunc)Pyaubio_{shortname}_do, 0, 0, 0, diff --git a/python/lib/gen_external.py b/python/lib/gen_external.py index 1fe8362..1425c9b 100644 --- a/python/lib/gen_external.py +++ b/python/lib/gen_external.py @@ -1,5 +1,10 @@ import distutils.ccompiler -import sys, os, subprocess, glob +import sys +import os +import subprocess +import glob +from distutils.sysconfig import customize_compiler +from gen_code import MappedObject header = 
os.path.join('src', 'aubio.h') output_path = os.path.join('python', 'gen') @@ -8,45 +13,44 @@ source_header = """// this file is generated! do not modify #include "aubio-types.h" """ -skip_objects = [ - # already in ext/ - 'fft', - 'pvoc', - 'filter', - 'filterbank', - #'resampler', - # AUBIO_UNSTABLE - 'hist', - 'parameter', - 'scale', - 'beattracking', - 'resampler', - 'peakpicker', - 'pitchfcomb', - 'pitchmcomb', - 'pitchschmitt', - 'pitchspecacf', - 'pitchyin', - 'pitchyinfft', - 'sink', - 'sink_apple_audio', - 'sink_sndfile', - 'sink_wavwrite', - #'mfcc', - 'source', - 'source_apple_audio', - 'source_sndfile', - 'source_avcodec', - 'source_wavread', - #'sampler', - 'audio_unit', - - 'tss', - ] +default_skip_objects = [ + # already in ext/ + 'fft', + 'pvoc', + 'filter', + 'filterbank', + # AUBIO_UNSTABLE + 'hist', + 'parameter', + 'scale', + 'beattracking', + 'resampler', + 'peakpicker', + 'pitchfcomb', + 'pitchmcomb', + 'pitchschmitt', + 'pitchspecacf', + 'pitchyin', + 'pitchyinfft', + 'pitchyinfast', + 'sink', + 'sink_apple_audio', + 'sink_sndfile', + 'sink_wavwrite', + #'mfcc', + 'source', + 'source_apple_audio', + 'source_sndfile', + 'source_avcodec', + 'source_wavread', + #'sampler', + 'audio_unit', + 'spectral_whitening', +] + def get_preprocessor(): # findout which compiler to use - from distutils.sysconfig import customize_compiler compiler_name = distutils.ccompiler.get_default_compiler() compiler = distutils.ccompiler.new_compiler(compiler=compiler_name) try: @@ -61,26 +65,50 @@ def get_preprocessor(): print("Warning: failed initializing compiler ({:s})".format(repr(e))) cpp_cmd = None - if hasattr(compiler, 'preprocessor'): # for unixccompiler + if hasattr(compiler, 'preprocessor'): # for unixccompiler cpp_cmd = compiler.preprocessor - elif hasattr(compiler, 'compiler'): # for ccompiler + elif hasattr(compiler, 'compiler'): # for ccompiler cpp_cmd = compiler.compiler.split() cpp_cmd += ['-E'] - elif hasattr(compiler, 'cc'): # for msvccompiler + elif 
hasattr(compiler, 'cc'): # for msvccompiler cpp_cmd = compiler.cc.split() cpp_cmd += ['-E'] + # On win-amd64 (py3.x), the default compiler is cross-compiling, from x86 + # to amd64 with %WIN_SDK_ROOT%\x86_amd64\cl.exe, but using this binary as a + # pre-processor generates no output, so we use %WIN_SDK_ROOT%\cl.exe + # instead. + if len(cpp_cmd) > 1 and 'cl.exe' in cpp_cmd[-2]: + plat = os.path.basename(os.path.dirname(cpp_cmd[-2])) + if plat == 'x86_amd64': + print('workaround on win64 to avoid empty pre-processor output') + cpp_cmd[-2] = cpp_cmd[-2].replace('x86_amd64', '') + elif True in ['amd64' in f for f in cpp_cmd]: + print('warning: not using workaround for', cpp_cmd[0], plat) + if not cpp_cmd: print("Warning: could not guess preprocessor, using env's CC") cpp_cmd = os.environ.get('CC', 'cc').split() cpp_cmd += ['-E'] - + if 'emcc' in cpp_cmd: + cpp_cmd += ['-x', 'c'] # emcc defaults to c++, force C language return cpp_cmd -def get_cpp_objects(header=header): + +def get_c_declarations(header=header, usedouble=False): + ''' return a dense and preprocessed string of all c declarations implied by aubio.h + ''' + cpp_output = get_cpp_output(header=header, usedouble=usedouble) + return filter_cpp_output (cpp_output) + + +def get_cpp_output(header=header, usedouble=False): + ''' find and run a C pre-processor on aubio.h ''' cpp_cmd = get_preprocessor() macros = [('AUBIO_UNSTABLE', 1)] + if usedouble: + macros += [('HAVE_AUBIO_DOUBLE', 1)] if not os.path.isfile(header): raise Exception("could not find include file " + header) @@ -91,59 +119,115 @@ def get_cpp_objects(header=header): print("Running command: {:s}".format(" ".join(cpp_cmd))) proc = subprocess.Popen(cpp_cmd, - stderr=subprocess.PIPE, - stdout=subprocess.PIPE) + stderr=subprocess.PIPE, + stdout=subprocess.PIPE) assert proc, 'Proc was none' cpp_output = proc.stdout.read() err_output = proc.stderr.read() + if err_output: + print("Warning: preprocessor produced errors or warnings:\n%s" \ + % 
err_output.decode('utf8')) if not cpp_output: - raise Exception("preprocessor output is empty:\n%s" % err_output) - elif err_output: - print ("Warning: preprocessor produced warnings:\n%s" % err_output) + raise_msg = "preprocessor output is empty! Running command " \ + + "\"%s\" failed" % " ".join(cpp_cmd) + if err_output: + raise_msg += " with stderr: \"%s\"" % err_output.decode('utf8') + else: + raise_msg += " with no stdout or stderr" + raise Exception(raise_msg) if not isinstance(cpp_output, list): cpp_output = [l.strip() for l in cpp_output.decode('utf8').split('\n')] - cpp_output = filter(lambda y: len(y) > 1, cpp_output) + return cpp_output + +def filter_cpp_output(cpp_raw_output): + ''' prepare cpp-output for parsing ''' + cpp_output = filter(lambda y: len(y) > 1, cpp_raw_output) cpp_output = list(filter(lambda y: not y.startswith('#'), cpp_output)) i = 1 while 1: - if i >= len(cpp_output): break - if cpp_output[i-1].endswith(',') or cpp_output[i-1].endswith('{') or cpp_output[i].startswith('}'): - cpp_output[i] = cpp_output[i-1] + ' ' + cpp_output[i] - cpp_output.pop(i-1) + if i >= len(cpp_output): + break + if ('{' in cpp_output[i - 1]) and ('}' not in cpp_output[i - 1]) or (';' not in cpp_output[i - 1]): + cpp_output[i] = cpp_output[i - 1] + ' ' + cpp_output[i] + cpp_output.pop(i - 1) + elif ('}' in cpp_output[i]): + cpp_output[i] = cpp_output[i - 1] + ' ' + cpp_output[i] + cpp_output.pop(i - 1) else: i += 1 - typedefs = filter(lambda y: y.startswith ('typedef struct _aubio'), cpp_output) + # clean pointer notations + tmp = [] + for l in cpp_output: + tmp += [l.replace(' *', ' * ')] + cpp_output = tmp + + return cpp_output + +def get_cpp_objects_from_c_declarations(c_declarations, skip_objects=None): + if skip_objects is None: + skip_objects = default_skip_objects + typedefs = filter(lambda y: y.startswith('typedef struct _aubio'), c_declarations) cpp_objects = [a.split()[3][:-1] for a in typedefs] + cpp_objects_filtered = filter(lambda y: not y[6:-2] in 
skip_objects, cpp_objects) + return cpp_objects_filtered + + +def get_all_func_names_from_lib(lib): + ''' return flat string of all function used in lib + ''' + res = [] + for _, v in lib.items(): + if isinstance(v, dict): + res += get_all_func_names_from_lib(v) + elif isinstance(v, list): + for elem in v: + e = elem.split('(') + if len(e) < 2: + continue # not a function + fname_part = e[0].strip().split(' ') + fname = fname_part[-1] + if fname: + res += [fname] + else: + raise NameError('gen_lib : weird function: ' + str(e)) - return cpp_output, cpp_objects + return res -def generate_external(header=header, output_path=output_path, usedouble=False, overwrite=True): - if not os.path.isdir(output_path): os.mkdir(output_path) - elif not overwrite: return sorted(glob.glob(os.path.join(output_path, '*.c'))) - sources_list = [] - cpp_output, cpp_objects = get_cpp_objects(header) + +def generate_lib_from_c_declarations(cpp_objects, c_declarations): + ''' returns a lib from given cpp_object names + + a lib is a dict grouping functions by family (onset,pitch...) 
+ each eement is itself a dict of functions grouped by puposes as : + struct, new, del, do, get, set and other + ''' lib = {} for o in cpp_objects: - if o[:6] != 'aubio_': - continue - shortname = o[6:-2] - if shortname in skip_objects: - continue - lib[shortname] = {'struct': [], 'new': [], 'del': [], 'do': [], 'get': [], 'set': [], 'other': []} + shortname = o + if o[:6] == 'aubio_': + shortname = o[6:-2] # without aubio_ prefix and _t suffix + + lib[shortname] = {'struct': [], 'new': [], 'del': [], 'do': [], 'rdo': [], 'get': [], 'set': [], 'other': []} lib[shortname]['longname'] = o lib[shortname]['shortname'] = shortname - for fn in cpp_output: - if o[:-1] in fn: - #print "found", o[:-1], "in", fn + + fullshortname = o[:-2] # name without _t suffix + + for fn in c_declarations: + func_name = fn.split('(')[0].strip().split(' ')[-1] + if func_name.startswith(fullshortname + '_') or func_name.endswith(fullshortname): + # print "found", shortname, "in", fn if 'typedef struct ' in fn: lib[shortname]['struct'].append(fn) elif '_do' in fn: lib[shortname]['do'].append(fn) + elif '_rdo' in fn: + lib[shortname]['rdo'].append(fn) elif 'new_' in fn: lib[shortname]['new'].append(fn) elif 'del_' in fn: @@ -153,41 +237,52 @@ def generate_external(header=header, output_path=output_path, usedouble=False, o elif '_set_' in fn: lib[shortname]['set'].append(fn) else: - #print "no idea what to do about", fn + # print "no idea what to do about", fn lib[shortname]['other'].append(fn) + return lib + - """ - for fn in cpp_output: +def print_c_declarations_results(lib, c_declarations): + for fn in c_declarations: found = 0 for o in lib: for family in lib[o]: if fn in lib[o][family]: found = 1 if found == 0: - print "missing", fn + print("missing", fn) for o in lib: for family in lib[o]: if type(lib[o][family]) == str: - print ( "{:15s} {:10s} {:s}".format(o, family, lib[o][family] ) ) + print("{:15s} {:10s} {:s}".format(o, family, lib[o][family])) elif len(lib[o][family]) == 1: - print 
( "{:15s} {:10s} {:s}".format(o, family, lib[o][family][0] ) ) + print("{:15s} {:10s} {:s}".format(o, family, lib[o][family][0])) else: - print ( "{:15s} {:10s} {:d}".format(o, family, len(lib[o][family]) ) ) - """ + print("{:15s} {:10s} {:s}".format(o, family, lib[o][family])) - try: - from .gen_code import MappedObject - except (SystemError, ValueError): - from gen_code import MappedObject + +def generate_external(header=header, output_path=output_path, usedouble=False, overwrite=True): + if not os.path.isdir(output_path): + os.mkdir(output_path) + elif not overwrite: + return sorted(glob.glob(os.path.join(output_path, '*.c'))) + + c_declarations = get_c_declarations(header, usedouble=usedouble) + cpp_objects = get_cpp_objects_from_c_declarations(c_declarations) + + lib = generate_lib_from_c_declarations(cpp_objects, c_declarations) + # print_c_declarations_results(lib, c_declarations) + + sources_list = [] for o in lib: out = source_header - mapped = MappedObject(lib[o], usedouble = usedouble) + mapped = MappedObject(lib[o], usedouble=usedouble) out += mapped.gen_code() output_file = os.path.join(output_path, 'gen-%s.c' % o) with open(output_file, 'w') as f: f.write(out) - print ("wrote %s" % output_file ) + print("wrote %s" % output_file) sources_list.append(output_file) out = source_header @@ -199,23 +294,23 @@ int generated_types_ready (void) {{ return ({pycheck_types}); }} -""".format(pycheck_types = check_types) +""".format(pycheck_types=check_types) add_types = "".join([""" Py_INCREF (&Py_{name}Type); - PyModule_AddObject(m, "{name}", (PyObject *) & Py_{name}Type);""".format(name = o) for o in lib]) + PyModule_AddObject(m, "{name}", (PyObject *) & Py_{name}Type);""".format(name=o) for o in lib]) out += """ void add_generated_objects ( PyObject *m ) {{ {add_types} }} -""".format(add_types = add_types) +""".format(add_types=add_types) output_file = os.path.join(output_path, 'aubio-generated.c') with open(output_file, 'w') as f: f.write(out) - print ("wrote 
%s" % output_file ) + print("wrote %s" % output_file) sources_list.append(output_file) objlist = "".join(["extern PyTypeObject Py_%sType;\n" % p for p in lib]) @@ -233,17 +328,19 @@ void add_generated_objects ( PyObject *m ) {objlist} int generated_objects ( void ); void add_generated_objects( PyObject *m ); -""".format(objlist = objlist) +""".format(objlist=objlist) output_file = os.path.join(output_path, 'aubio-generated.h') with open(output_file, 'w') as f: f.write(out) - print ("wrote %s" % output_file ) + print("wrote %s" % output_file) # no need to add header to list of sources return sorted(sources_list) if __name__ == '__main__': - if len(sys.argv) > 1: header = sys.argv[1] - if len(sys.argv) > 2: output_path = sys.argv[2] + if len(sys.argv) > 1: + header = sys.argv[1] + if len(sys.argv) > 2: + output_path = sys.argv[2] generate_external(header, output_path) diff --git a/python/lib/moresetuptools.py b/python/lib/moresetuptools.py index 348338c..299d1f9 100644 --- a/python/lib/moresetuptools.py +++ b/python/lib/moresetuptools.py @@ -2,7 +2,9 @@ # import sys, os, glob, subprocess import distutils, distutils.command.clean, distutils.dir_util -from .gen_external import generate_external, header, output_path +from gen_external import generate_external, header, output_path + +from this_version import get_aubio_version # inspired from https://gist.github.com/abergmeier/9488990 def add_packages(packages, ext=None, **kw): @@ -21,6 +23,7 @@ def add_packages(packages, ext=None, **kw): } for package in packages: + print("checking for {:s}".format(package)) cmd = ['pkg-config', '--libs', '--cflags', package] try: tokens = subprocess.check_output(cmd) @@ -54,35 +57,45 @@ def add_local_aubio_lib(ext): def add_local_aubio_sources(ext): """ build aubio inside python module instead of linking against libaubio """ - print("Warning: libaubio was not built with waf, adding src/") - # create an empty header, macros will be passed on the command line - fake_config_header = 
os.path.join('python', 'ext', 'config.h')
-        distutils.file_util.write_file(fake_config_header, "")
+    print("Info: libaubio was not installed or built locally with waf, adding src/")
     aubio_sources = sorted(glob.glob(os.path.join('src', '**.c')))
     aubio_sources += sorted(glob.glob(os.path.join('src', '*', '**.c')))
     ext.sources += aubio_sources
+
+def add_local_macros(ext, usedouble = False):
+    if usedouble:
+        ext.define_macros += [('HAVE_AUBIO_DOUBLE', 1)]
     # define macros (waf puts them in build/src/config.h)
     for define_macro in ['HAVE_STDLIB_H', 'HAVE_STDIO_H',
                          'HAVE_MATH_H', 'HAVE_STRING_H',
-                         'HAVE_C99_VARARGS_MACROS',
-                         'HAVE_LIMITS_H', 'HAVE_MEMCPY_HACKS']:
+                         'HAVE_ERRNO_H', 'HAVE_C99_VARARGS_MACROS',
+                         'HAVE_LIMITS_H', 'HAVE_STDARG_H',
+                         'HAVE_MEMCPY_HACKS']:
         ext.define_macros += [(define_macro, 1)]
 
+def add_external_deps(ext, usedouble = False):
     # look for additional packages
     print("Info: looking for *optional* additional packages")
-    packages = ['libavcodec', 'libavformat', 'libavutil', 'libavresample',
-                'jack',
-                'sndfile', 'samplerate',
+    packages = ['libavcodec', 'libavformat', 'libavutil',
+                'libswresample', 'libavresample',
+                'sndfile',
                 #'fftw3f',
                ]
+    # samplerate only works with float
+    if usedouble is False:
+        packages += ['samplerate']
+    else:
+        print("Info: not adding libsamplerate in double precision mode")
     add_packages(packages, ext=ext)
     if 'avcodec' in ext.libraries \
             and 'avformat' in ext.libraries \
-            and 'avutil' in ext.libraries \
-            and 'avresample' in ext.libraries:
-        ext.define_macros += [('HAVE_LIBAV', 1)]
-    if 'jack' in ext.libraries:
-        ext.define_macros += [('HAVE_JACK', 1)]
+            and 'avutil' in ext.libraries:
+        if 'swresample' in ext.libraries:
+            ext.define_macros += [('HAVE_SWRESAMPLE', 1)]
+        elif 'avresample' in ext.libraries:
+            ext.define_macros += [('HAVE_AVRESAMPLE', 1)]
+        if 'swresample' in ext.libraries or 'avresample' in ext.libraries:
+            ext.define_macros += [('HAVE_LIBAV', 1)]
     if 'sndfile' in ext.libraries:
         ext.define_macros += [('HAVE_SNDFILE', 1)]
     if 'samplerate' in ext.libraries:
@@ -103,41 +116,78 @@ def add_local_aubio_sources(ext):
     ext.define_macros += [('HAVE_WAVWRITE', 1)]
     ext.define_macros += [('HAVE_WAVREAD', 1)]
-    # TODO:
-    # add cblas
+
+    # TODO: add cblas
     if 0:
         ext.libraries += ['cblas']
         ext.define_macros += [('HAVE_ATLAS_CBLAS_H', 1)]
 
 def add_system_aubio(ext):
     # use pkg-config to find aubio's location
-    add_packages(['aubio'], ext)
+    aubio_version = get_aubio_version()
+    add_packages(['aubio = ' + aubio_version], ext)
     if 'aubio' not in ext.libraries:
-        print("Error: libaubio not found")
+        print("Info: aubio " + aubio_version + " was not found by pkg-config")
+    else:
+        print("Info: using system aubio " + aubio_version + " found in "
+                + ' '.join(ext.library_dirs))
+
+def add_libav_on_win(ext):
+    """ no pkg-config on windows, simply assume these libs are available """
+    ext.libraries += ['avformat', 'avutil', 'avcodec', 'swresample']
+    for define_macro in ['HAVE_LIBAV', 'HAVE_SWRESAMPLE']:
+        ext.define_macros += [(define_macro, 1)]
 
 class CleanGenerated(distutils.command.clean.clean):
     def run(self):
         if os.path.isdir(output_path):
             distutils.dir_util.remove_tree(output_path)
-        config = os.path.join('python', 'ext', 'config.h')
-        distutils.command.clean.clean.run(self)
 
-class GenerateCommand(distutils.cmd.Command):
-    description = 'generate gen/gen-*.c files from ../src/aubio.h'
-    user_options = [
+from distutils.command.build_ext import build_ext as _build_ext
+
+class build_ext(_build_ext):
+
+    user_options = _build_ext.user_options + [
         # The format is (long option, short option, description).
         ('enable-double', None, 'use HAVE_AUBIO_DOUBLE=1 (default: 0)'),
     ]
 
     def initialize_options(self):
+        _build_ext.initialize_options(self)
         self.enable_double = False
 
     def finalize_options(self):
+        _build_ext.finalize_options(self)
         if self.enable_double:
             self.announce(
                     'will generate code for aubio compiled with HAVE_AUBIO_DOUBLE=1',
                     level=distutils.log.INFO)
 
-    def run(self):
-        self.announce( 'Generating code', level=distutils.log.INFO)
-        generated_object_files = generate_external(header, output_path,
-                usedouble=self.enable_double)
+    def build_extension(self, extension):
+        if self.enable_double or 'HAVE_AUBIO_DOUBLE' in os.environ:
+            enable_double = True
+        else:
+            enable_double = False
+        # search for aubio headers and lib in PKG_CONFIG_PATH
+        add_system_aubio(extension)
+        # the lib was not installed on this system
+        if 'aubio' not in extension.libraries:
+            # use local src/aubio.h
+            if os.path.isfile(os.path.join('src', 'aubio.h')):
+                add_local_aubio_header(extension)
+            add_local_macros(extension, usedouble=enable_double)
+            # look for a local waf build
+            if os.path.isfile(os.path.join('build','src', 'fvec.c.1.o')):
+                add_local_aubio_lib(extension)
+            else:
+                # check for external dependencies
+                add_external_deps(extension, usedouble=enable_double)
+                # force adding libav on windows
+                if os.name == 'nt' and ('WITH_LIBAV' in os.environ \
+                        or 'CONDA_PREFIX' in os.environ):
+                    add_libav_on_win(extension)
+                # add libaubio sources and look for optional deps with pkg-config
+                add_local_aubio_sources(extension)
+        # generate files python/gen/*.c, python/gen/aubio-generated.h
+        extension.include_dirs += [ output_path ]
+        extension.sources += generate_external(header, output_path, overwrite = False,
+                usedouble=enable_double)
+        return _build_ext.build_extension(self, extension)
diff --git a/python/scripts/aubiocut b/python/scripts/aubiocut
deleted file mode 100755
index eec68da..0000000
--- a/python/scripts/aubiocut
+++ /dev/null
@@ -1,206 +0,0 @@
-#!
/usr/bin/env python - -""" this file was written by Paul Brossier - it is released under the GNU/GPL license. -""" - -import sys -#from aubio.task import * - -usage = "usage: %s [options] -i soundfile" % sys.argv[0] -usage += "\n help: %s -h" % sys.argv[0] - -def parse_args(): - from optparse import OptionParser - parser = OptionParser(usage=usage) - parser.add_option("-i", "--input", action = "store", dest = "source_file", - help="input sound file to analyse", metavar = "<source_file>") - parser.add_option("-O","--onset-method", - action="store", dest="onset_method", default='default', - metavar = "<onset_method>", - help="onset detection method [default=default] \ - complexdomain|hfc|phase|specdiff|energy|kl|mkl") - # cutting methods - parser.add_option("-b","--beat", - action="store_true", dest="beat", default=False, - help="use beat locations") - """ - parser.add_option("-S","--silencecut", - action="store_true", dest="silencecut", default=False, - help="use silence locations") - parser.add_option("-s","--silence", - metavar = "<value>", - action="store", dest="silence", default=-70, - help="silence threshold [default=-70]") - """ - # algorithm parameters - parser.add_option("-r", "--samplerate", - metavar = "<freq>", type='int', - action="store", dest="samplerate", default=0, - help="samplerate at which the file should be represented") - parser.add_option("-B","--bufsize", - action="store", dest="bufsize", default=512, - metavar = "<size>", type='int', - help="buffer size [default=512]") - parser.add_option("-H","--hopsize", - metavar = "<size>", type='int', - action="store", dest="hopsize", default=256, - help="overlap size [default=256]") - parser.add_option("-t","--onset-threshold", - metavar = "<value>", type="float", - action="store", dest="threshold", default=0.3, - help="onset peak picking threshold [default=0.3]") - parser.add_option("-c","--cut", - action="store_true", dest="cut", default=False, - help="cut input sound file at detected labels \ - best 
used with option -L") - - # minioi - parser.add_option("-M","--minioi", - metavar = "<value>", type='string', - action="store", dest="minioi", default="12ms", - help="minimum inter onset interval [default=12ms]") - - """ - parser.add_option("-D","--delay", - action = "store", dest = "delay", type = "float", - metavar = "<seconds>", default=0, - help="number of seconds to take back [default=system]\ - default system delay is 3*hopsize/samplerate") - parser.add_option("-C","--dcthreshold", - metavar = "<value>", - action="store", dest="dcthreshold", default=1., - help="onset peak picking DC component [default=1.]") - parser.add_option("-L","--localmin", - action="store_true", dest="localmin", default=False, - help="use local minima after peak detection") - parser.add_option("-d","--derivate", - action="store_true", dest="derivate", default=False, - help="derivate onset detection function") - parser.add_option("-z","--zerocross", - metavar = "<value>", - action="store", dest="zerothres", default=0.008, - help="zero-crossing threshold for slicing [default=0.00008]") - """ - # plotting functions - """ - parser.add_option("-p","--plot", - action="store_true", dest="plot", default=False, - help="draw plot") - parser.add_option("-x","--xsize", - metavar = "<size>", - action="store", dest="xsize", default=1., - type='float', help="define xsize for plot") - parser.add_option("-y","--ysize", - metavar = "<size>", - action="store", dest="ysize", default=1., - type='float', help="define ysize for plot") - parser.add_option("-f","--function", - action="store_true", dest="func", default=False, - help="print detection function") - parser.add_option("-n","--no-onsets", - action="store_true", dest="nplot", default=False, - help="do not plot detected onsets") - parser.add_option("-O","--outplot", - metavar = "<output_image>", - action="store", dest="outplot", default=None, - help="save plot to output.{ps,png}") - parser.add_option("-F","--spectrogram", - action="store_true", 
dest="spectro", default=False, - help="add spectrogram to the plot") - """ - parser.add_option("-o","--output", type = str, - metavar = "<outputdir>", - action="store", dest="output_directory", default=None, - help="specify path where slices of the original file should be created") - parser.add_option("--cut-until-nsamples", type = int, - metavar = "<samples>", - action = "store", dest = "cut_until_nsamples", default = None, - help="how many extra samples should be added at the end of each slice") - parser.add_option("--cut-until-nslices", type = int, - metavar = "<slices>", - action = "store", dest = "cut_until_nslices", default = None, - help="how many extra slices should be added at the end of each slice") - - parser.add_option("-v","--verbose", - action="store_true", dest="verbose", default=True, - help="make lots of noise [default]") - parser.add_option("-q","--quiet", - action="store_false", dest="verbose", default=True, - help="be quiet") - (options, args) = parser.parse_args() - if not options.source_file: - import os.path - if len(args) == 1: - options.source_file = args[0] - else: - print ("no file name given\n" + usage) - sys.exit(1) - return options, args - -if __name__ == '__main__': - options, args = parse_args() - - hopsize = options.hopsize - bufsize = options.bufsize - samplerate = options.samplerate - source_file = options.source_file - - from aubio import onset, tempo, source, sink - - s = source(source_file, samplerate, hopsize) - if samplerate == 0: samplerate = s.get_samplerate() - - if options.beat: - o = tempo(options.onset_method, bufsize, hopsize) - else: - o = onset(options.onset_method, bufsize, hopsize) - if options.minioi: - if options.minioi.endswith('ms'): - o.set_minioi_ms(int(options.minioi[:-2])) - elif options.minioi.endswith('s'): - o.set_minioi_s(int(options.minioi[:-1])) - else: - o.set_minioi(int(options.minioi)) - o.set_threshold(options.threshold) - - timestamps = [] - total_frames = 0 - # analyze pass - while True: - 
-        samples, read = s()
-        if o(samples):
-            timestamps.append (o.get_last())
-            if options.verbose: print ("%.4f" % o.get_last_s())
-        total_frames += read
-        if read < hopsize: break
-    del s
-    # print some info
-    nstamps = len(timestamps)
-    duration = float (total_frames) / float(samplerate)
-    info = 'found %(nstamps)d timestamps in %(source_file)s' % locals()
-    info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % locals()
-    sys.stderr.write(info)
-
-    # cutting pass
-    if options.cut and nstamps > 0:
-        # generate output files
-        from aubio.slicing import slice_source_at_stamps
-        timestamps_end = None
-        if options.cut_until_nslices and options.cut_until_nsamples:
-            print ("warning: using cut_until_nslices, but cut_until_nsamples is set")
-        if options.cut_until_nsamples:
-            timestamps_end = [t + options.cut_until_nsamples for t in timestamps[1:]]
-            timestamps_end += [ 1e120 ]
-        if options.cut_until_nslices:
-            timestamps_end = [t for t in timestamps[1 + options.cut_until_nslices:]]
-            timestamps_end += [ 1e120 ] * (options.cut_until_nslices + 1)
-        slice_source_at_stamps(source_file, timestamps, timestamps_end = timestamps_end,
-                output_dir = options.output_directory,
-                samplerate = samplerate)
-
-        # print some info
-        duration = float (total_frames) / float(samplerate)
-        info = 'created %(nstamps)d slices from %(source_file)s' % locals()
-        info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % locals()
-        sys.stderr.write(info)
diff --git a/python/tests/__init__.py b/python/tests/__init__.py
deleted file mode 100644
index 8b13789..0000000
--- a/python/tests/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-
diff --git a/python/tests/_tools.py b/python/tests/_tools.py
new file mode 100644
index 0000000..4121656
--- /dev/null
+++ b/python/tests/_tools.py
@@ -0,0 +1,41 @@
+"""
+This file imports test methods from different testing modules, in this
+order:
+
+ - try importing 'pytest'
+ - if it fails, fallback to 'numpy.testing'
+
+Nose2 support was removed because assertWarns is not available on py2.7.
+
+"""
+
+import sys
+
+_has_pytest = False
+
+# check if we have pytest
+try:
+    import pytest
+    parametrize = pytest.mark.parametrize
+    assert_raises = pytest.raises
+    assert_warns = pytest.warns
+    skipTest = pytest.skip
+    _has_pytest = True
+    def run_module_suite():
+        import sys, pytest
+        pytest.main(sys.argv)
+except:
+    pass
+
+# otherwise fallback on numpy.testing
+if not _has_pytest:
+    from numpy.testing import dec, assert_raises, assert_warns
+    from numpy.testing import SkipTest
+    parametrize = dec.parametrize
+    def skipTest(msg):
+        raise SkipTest(msg)
+    from numpy.testing import run_module_suite
+
+# always use numpy's assert_equal
+import numpy
+assert_equal = numpy.testing.assert_equal
diff --git a/python/tests/run_all_tests b/python/tests/run_all_tests
deleted file mode 100755
index bc6bb8c..0000000
--- a/python/tests/run_all_tests
+++ /dev/null
@@ -1,5 +0,0 @@
-#! /usr/bin/env python
-
-if __name__ == '__main__':
-    import nose2.main
-    nose2.discover()
diff --git a/python/tests/test_aubio.py b/python/tests/test_aubio.py
index cac8397..5768cac 100755
--- a/python/tests/test_aubio.py
+++ b/python/tests/test_aubio.py
@@ -1,6 +1,5 @@
 #! /usr/bin/env python
 
-from unittest import main
 from numpy.testing import TestCase
 
 class aubiomodule_test_case(TestCase):
@@ -9,6 +8,11 @@ class aubiomodule_test_case(TestCase):
         """ try importing aubio """
         import aubio
 
+    def test_version(self):
+        """ test aubio.version """
+        import aubio
+        self.assertEqual('0', aubio.version[0])
+
 if __name__ == '__main__':
+    from unittest import main
     main()
-
diff --git a/python/tests/test_aubio_cmd.py b/python/tests/test_aubio_cmd.py
new file mode 100755
index 0000000..471ac85
--- /dev/null
+++ b/python/tests/test_aubio_cmd.py
@@ -0,0 +1,34 @@
+#!
/usr/bin/env python + +from numpy.testing import TestCase +import aubio.cmd + +class aubio_cmd(TestCase): + + def setUp(self): + self.a_parser = aubio.cmd.aubio_parser() + + def test_default_creation(self): + try: + assert self.a_parser.parse_args(['-V']).show_version + except SystemExit: + url = 'https://bugs.python.org/issue9253' + self.skipTest('subcommand became optional in py3, see %s' % url) + +class aubio_cmd_utils(TestCase): + + def test_samples2seconds(self): + self.assertEqual(aubio.cmd.samples2seconds(3200, 32000), + "0.100000\t") + + def test_samples2milliseconds(self): + self.assertEqual(aubio.cmd.samples2milliseconds(3200, 32000), + "100.000000\t") + + def test_samples2samples(self): + self.assertEqual(aubio.cmd.samples2samples(3200, 32000), + "3200\t") + +if __name__ == '__main__': + from unittest import main + main() diff --git a/python/tests/test_aubio_cut.py b/python/tests/test_aubio_cut.py new file mode 100755 index 0000000..01ad2c6 --- /dev/null +++ b/python/tests/test_aubio_cut.py @@ -0,0 +1,16 @@ +#! /usr/bin/env python + +import aubio.cut +from numpy.testing import TestCase + +class aubio_cut(TestCase): + + def setUp(self): + self.a_parser = aubio.cut.aubio_cut_parser() + + def test_default_creation(self): + assert self.a_parser.parse_args(['-v']).verbose + +if __name__ == '__main__': + from unittest import main + main() diff --git a/python/tests/test_cvec.py b/python/tests/test_cvec.py index 53bf8df..73ee654 100755 --- a/python/tests/test_cvec.py +++ b/python/tests/test_cvec.py @@ -1,6 +1,5 @@ #! 
/usr/bin/env python -from unittest import main import numpy as np from numpy.testing import TestCase, assert_equal from aubio import cvec, fvec, float_type @@ -11,9 +10,8 @@ class aubio_cvec_test_case(TestCase): def test_vector_created_with_zeroes(self): a = cvec(10) - assert_equal(a.norm.shape[0], 10 / 2 + 1) - assert_equal(a.phas.shape[0], 10 / 2 + 1) - _ = a.norm[0] + assert_equal(a.norm.shape[0], 10 // 2 + 1) + assert_equal(a.phas.shape[0], 10 // 2 + 1) assert_equal(a.norm, 0.) assert_equal(a.phas, 0.) @@ -142,4 +140,5 @@ class aubio_cvec_wrong_norm_input(TestCase): a.norm = np.zeros((512//2+1, 2), dtype = float_type) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_dct.py b/python/tests/test_dct.py new file mode 100755 index 0000000..c9b2ba7 --- /dev/null +++ b/python/tests/test_dct.py @@ -0,0 +1,72 @@ +#! /usr/bin/env python + + +import numpy as np +from numpy.testing import TestCase, assert_almost_equal +import aubio + +precomputed_arange = [ 9.89949512, -6.44232273, 0., -0.67345482, 0., + -0.20090288, 0., -0.05070186] + +precomputed_some_ones = [ 4.28539848, 0.2469689, -0.14625292, -0.58121818, + -0.83483052, -0.75921834, -0.35168475, 0.24087936, + 0.78539824, 1.06532764, 0.97632152, 0.57164496, 0.03688532, + -0.39446154, -0.54619485, -0.37771079] + +class aubio_dct(TestCase): + + def test_init(self): + """ test that aubio.dct() is created with expected size """ + a_dct = aubio.dct() + self.assertEqual(a_dct.size, 1024) + + def test_arange(self): + """ test that dct(arange(8)) is computed correctly + + >>> from scipy.fftpack import dct + >>> a_in = np.arange(8).astype(aubio.float_type) + >>> precomputed = dct(a_in, norm='ortho') + """ + N = len(precomputed_arange) + a_dct = aubio.dct(8) + a_in = np.arange(8).astype(aubio.float_type) + a_expected = aubio.fvec(precomputed_arange) + assert_almost_equal(a_dct(a_in), a_expected, decimal=5) + + def test_some_ones(self): + """ test that dct(somevector) is computed 
correctly """
+        a_dct = aubio.dct(16)
+        a_in = np.ones(16).astype(aubio.float_type)
+        a_in[1] = 0
+        a_in[3] = np.pi
+        a_expected = aubio.fvec(precomputed_some_ones)
+        assert_almost_equal(a_dct(a_in), a_expected, decimal=6)
+
+    def test_reconstruction(self):
+        """ test that some_ones vector can be reconstructed """
+        a_dct = aubio.dct(16)
+        a_in = np.ones(16).astype(aubio.float_type)
+        a_in[1] = 0
+        a_in[3] = np.pi
+        a_dct_in = a_dct(a_in)
+        a_dct_reconstructed = a_dct.rdo(a_dct_in)
+        assert_almost_equal(a_dct_reconstructed, a_in, decimal=6)
+
+    def test_negative_size(self):
+        """ test that creation fails with a negative size """
+        with self.assertRaises(ValueError):
+            aubio.dct(-1)
+
+    def test_wrong_size(self):
+        """ test that creation fails with a non power-of-two size """
+        # non power-of-two fft sizes are only supported when compiled with fftw3
+        size = 13
+        try:
+            with self.assertRaises(RuntimeError):
+                aubio.dct(size)
+        except AssertionError:
+            self.skipTest('creating aubio.dct with size %d did not fail' % size)
+
+if __name__ == '__main__':
+    from unittest import main
+    main()
diff --git a/python/tests/test_fft.py b/python/tests/test_fft.py
index a8f82b9..abe95d3 100755
--- a/python/tests/test_fft.py
+++ b/python/tests/test_fft.py
@@ -1,6 +1,5 @@
 #!
/usr/bin/env python -from unittest import main from numpy.testing import TestCase from numpy.testing import assert_equal, assert_almost_equal import numpy as np @@ -142,6 +141,37 @@ class aubio_fft_test_case(TestCase): assert_almost_equal ( r[0], impulse, decimal = 6) assert_almost_equal ( r[1:], 0) +class aubio_fft_odd_sizes(TestCase): + + def test_reconstruct_with_odd_size(self): + win_s = 29 + self.recontruct(win_s, 'odd sizes not supported') + + def test_reconstruct_with_radix15(self): + win_s = 2 ** 4 * 15 + self.recontruct(win_s, 'radix 15 supported') + + def test_reconstruct_with_radix5(self): + win_s = 2 ** 4 * 5 + self.recontruct(win_s, 'radix 5 supported') + + def test_reconstruct_with_radix3(self): + win_s = 2 ** 4 * 3 + self.recontruct(win_s, 'radix 3 supported') + + def recontruct(self, win_s, skipMessage): + try: + f = fft(win_s) + except RuntimeError: + self.skipTest(skipMessage) + input_signal = fvec(win_s) + input_signal[win_s//2] = 1 + c = f(input_signal) + output_signal = f.rdo(c) + assert_almost_equal(input_signal, output_signal) + +class aubio_fft_wrong_params(TestCase): + def test_large_input_timegrain(self): win_s = 1024 f = fft(win_s) @@ -170,26 +200,16 @@ class aubio_fft_test_case(TestCase): with self.assertRaises(ValueError): f.rdo(s) -class aubio_fft_wrong_params(TestCase): - def test_wrong_buf_size(self): win_s = -1 with self.assertRaises(ValueError): fft(win_s) - def test_buf_size_not_power_of_two(self): - # when compiled with fftw3, aubio supports non power of two fft sizes - win_s = 320 - try: - with self.assertRaises(RuntimeError): - fft(win_s) - except AssertionError: - self.skipTest('creating aubio.fft with size %d did not fail' % win_s) - def test_buf_size_too_small(self): win_s = 1 with self.assertRaises(RuntimeError): fft(win_s) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_filter.py b/python/tests/test_filter.py index 6a7a0de..576ae11 100755 --- a/python/tests/test_filter.py +++ 
b/python/tests/test_filter.py @@ -1,9 +1,8 @@ #! /usr/bin/env python -from unittest import main from numpy.testing import TestCase, assert_equal, assert_almost_equal from aubio import fvec, digital_filter -from .utils import array_from_text_file +from utils import array_from_text_file class aubio_filter_test_case(TestCase): @@ -77,6 +76,16 @@ class aubio_filter_test_case(TestCase): with self.assertRaises(ValueError): f.set_biquad(0., 0., 0, 0., 0.) + def test_all_available_presets(self): + f = digital_filter(7) + for sr in [8000, 11025, 16000, 22050, 24000, 32000, + 44100, 48000, 88200, 96000, 192000]: + f.set_a_weighting(sr) + f = digital_filter(5) + for sr in [8000, 11025, 16000, 22050, 24000, 32000, + 44100, 48000, 88200, 96000, 192000]: + f.set_c_weighting(sr) + class aubio_filter_wrong_params(TestCase): def test_negative_order(self): @@ -84,4 +93,5 @@ class aubio_filter_wrong_params(TestCase): digital_filter(-1) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_filterbank.py b/python/tests/test_filterbank.py index 3245008..67a8558 100755 --- a/python/tests/test_filterbank.py +++ b/python/tests/test_filterbank.py @@ -1,11 +1,10 @@ #! 
/usr/bin/env python -from unittest import main -from numpy.testing import TestCase -from numpy.testing import assert_equal, assert_almost_equal import numpy as np +from numpy.testing import TestCase, assert_equal, assert_almost_equal + from aubio import cvec, filterbank, float_type -from .utils import array_from_text_file +from utils import array_from_text_file class aubio_filterbank_test_case(TestCase): @@ -62,6 +61,13 @@ class aubio_filterbank_test_case(TestCase): f.set_mel_coeffs_slaney(16000) assert_almost_equal ( expected, f.get_coeffs() ) + def test_mfcc_coeffs_get_coeffs(self): + f = filterbank(40, 512) + coeffs = f.get_coeffs() + self.assertIsInstance(coeffs, np.ndarray) + assert_equal (coeffs, 0) + assert_equal (np.shape(coeffs), (40, 512 / 2 + 1)) + class aubio_filterbank_wrong_values(TestCase): def test_negative_window(self): @@ -81,4 +87,5 @@ class aubio_filterbank_wrong_values(TestCase): f(cvec(256)) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_filterbank_mel.py b/python/tests/test_filterbank_mel.py index 1ce38e9..0771264 100755 --- a/python/tests/test_filterbank_mel.py +++ b/python/tests/test_filterbank_mel.py @@ -1,10 +1,10 @@ #! 
/usr/bin/env python -from unittest import main -from numpy.testing import TestCase -from numpy.testing import assert_equal, assert_almost_equal -from numpy import array, shape -from aubio import cvec, filterbank, float_type +import numpy as np +from numpy.testing import TestCase, assert_equal, assert_almost_equal +from _tools import assert_warns + +from aubio import fvec, cvec, filterbank, float_type class aubio_filterbank_mel_test_case(TestCase): @@ -12,38 +12,160 @@ class aubio_filterbank_mel_test_case(TestCase): f = filterbank(40, 512) f.set_mel_coeffs_slaney(16000) a = f.get_coeffs() - assert_equal(shape (a), (40, 512/2 + 1) ) + assert_equal(np.shape (a), (40, 512/2 + 1) ) def test_other_slaney(self): f = filterbank(40, 512*2) f.set_mel_coeffs_slaney(44100) - _ = f.get_coeffs() + self.assertIsInstance(f.get_coeffs(), np.ndarray) #print "sum is", sum(sum(a)) for win_s in [256, 512, 1024, 2048, 4096]: f = filterbank(40, win_s) f.set_mel_coeffs_slaney(32000) - _ = f.get_coeffs() #print "sum is", sum(sum(a)) + self.assertIsInstance(f.get_coeffs(), np.ndarray) def test_triangle_freqs_zeros(self): f = filterbank(9, 1024) freq_list = [40, 80, 200, 400, 800, 1600, 3200, 6400, 12800, 15000, 24000] - freqs = array(freq_list, dtype = float_type) + freqs = np.array(freq_list, dtype = float_type) f.set_triangle_bands(freqs, 48000) - _ = f.get_coeffs().T assert_equal ( f(cvec(1024)), 0) + self.assertIsInstance(f.get_coeffs(), np.ndarray) def test_triangle_freqs_ones(self): f = filterbank(9, 1024) freq_list = [40, 80, 200, 400, 800, 1600, 3200, 6400, 12800, 15000, 24000] - freqs = array(freq_list, dtype = float_type) + freqs = np.array(freq_list, dtype = float_type) f.set_triangle_bands(freqs, 48000) - _ = f.get_coeffs().T + self.assertIsInstance(f.get_coeffs(), np.ndarray) spec = cvec(1024) spec.norm[:] = 1 assert_almost_equal ( f(spec), [ 0.02070313, 0.02138672, 0.02127604, 0.02135417, 0.02133301, 0.02133301, 0.02133311, 0.02133334, 0.02133345]) + def 
test_triangle_freqs_with_zeros(self):
+        """make sure set_triangle_bands works when list starts with 0"""
+        freq_list = [0, 40, 80]
+        freqs = np.array(freq_list, dtype = float_type)
+        f = filterbank(len(freqs)-2, 1024)
+        f.set_triangle_bands(freqs, 48000)
+        assert_equal ( f(cvec(1024)), 0)
+        self.assertIsInstance(f.get_coeffs(), np.ndarray)
+
+    def test_triangle_freqs_with_wrong_negative(self):
+        """make sure set_triangle_bands fails when list contains a negative"""
+        freq_list = [-10, 0, 80]
+        f = filterbank(len(freq_list)-2, 1024)
+        with self.assertRaises(ValueError):
+            f.set_triangle_bands(fvec(freq_list), 48000)
+
+    def test_triangle_freqs_with_wrong_ordering(self):
+        """make sure set_triangle_bands fails when list not ordered"""
+        freq_list = [0, 80, 40]
+        f = filterbank(len(freq_list)-2, 1024)
+        with self.assertRaises(ValueError):
+            f.set_triangle_bands(fvec(freq_list), 48000)
+
+    def test_triangle_freqs_with_large_freq(self):
+        """make sure set_triangle_bands warns when freq > nyquist"""
+        samplerate = 22050
+        freq_list = [0, samplerate//4, samplerate // 2 + 1]
+        f = filterbank(len(freq_list)-2, 1024)
+        with assert_warns(UserWarning):
+            f.set_triangle_bands(fvec(freq_list), samplerate)
+
+    def test_triangle_freqs_with_not_enough_filters(self):
+        """make sure set_triangle_bands warns when not enough filters"""
+        samplerate = 22050
+        freq_list = [0, 100, 1000, 4000, 8000, 10000]
+        f = filterbank(len(freq_list)-3, 1024)
+        with assert_warns(UserWarning):
+            f.set_triangle_bands(fvec(freq_list), samplerate)
+
+    def test_triangle_freqs_with_too_many_filters(self):
+        """make sure set_triangle_bands warns when too many filters"""
+        samplerate = 22050
+        freq_list = [0, 100, 1000, 4000, 8000, 10000]
+        f = filterbank(len(freq_list)-1, 1024)
+        with assert_warns(UserWarning):
+            f.set_triangle_bands(fvec(freq_list), samplerate)
+
+    def test_triangle_freqs_with_double_value(self):
+        """make sure set_triangle_bands works with 2 duplicate freqs"""
+        samplerate = 22050
+        freq_list = [0, 100, 1000, 4000, 4000, 4000, 10000]
+        f = filterbank(len(freq_list)-2, 1024)
+        with assert_warns(UserWarning):
+            f.set_triangle_bands(fvec(freq_list), samplerate)
+
+    def test_triangle_freqs_with_triple(self):
+        """make sure set_triangle_bands works with 3 duplicate freqs"""
+        samplerate = 22050
+        freq_list = [0, 100, 1000, 4000, 4000, 4000, 10000]
+        f = filterbank(len(freq_list)-2, 1024)
+        with assert_warns(UserWarning):
+            f.set_triangle_bands(fvec(freq_list), samplerate)
+
+
+    def test_triangle_freqs_without_norm(self):
+        """make sure set_triangle_bands works without norm"""
+        samplerate = 22050
+        freq_list = fvec([0, 100, 1000, 10000])
+        f = filterbank(len(freq_list) - 2, 1024)
+        f.set_norm(0)
+        f.set_triangle_bands(freq_list, samplerate)
+        expected = f.get_coeffs()
+        f.set_norm(1)
+        f.set_triangle_bands(fvec(freq_list), samplerate)
+        assert_almost_equal(f.get_coeffs().T,
+                expected.T * 2. / (freq_list[2:] - freq_list[:-2]))
+
+    def test_triangle_freqs_wrong_norm(self):
+        f = filterbank(10, 1024)
+        with self.assertRaises(ValueError):
+            f.set_norm(-1)
+
+    def test_triangle_freqs_with_power(self):
+        f = filterbank(9, 1024)
+        freqs = fvec([40, 80, 200, 400, 800, 1600, 3200, 6400, 12800, 15000,
+            24000])
+        f.set_power(2)
+        f.set_triangle_bands(freqs, 48000)
+        spec = cvec(1024)
+        spec.norm[:] = .1
+        expected = fvec([0.02070313, 0.02138672, 0.02127604, 0.02135417,
+            0.02133301, 0.02133301, 0.02133311, 0.02133334, 0.02133345])
+        expected /= 100.
+ assert_almost_equal(f(spec), expected) + + def test_mel_coeffs(self): + f = filterbank(40, 1024) + f.set_mel_coeffs(44100, 0, 44100 / 2) + + def test_zero_fmax(self): + f = filterbank(40, 1024) + f.set_mel_coeffs(44100, 0, 0) + + def test_wrong_mel_coeffs(self): + f = filterbank(40, 1024) + with self.assertRaises(ValueError): + f.set_mel_coeffs_slaney(0) + with self.assertRaises(ValueError): + f.set_mel_coeffs(44100, 0, -44100 / 2) + with self.assertRaises(ValueError): + f.set_mel_coeffs(44100, -0.1, 44100 / 2) + with self.assertRaises(ValueError): + f.set_mel_coeffs(-44100, 0.1, 44100 / 2) + with self.assertRaises(ValueError): + f.set_mel_coeffs_htk(-1, 0, 0) + + def test_mel_coeffs_htk(self): + f = filterbank(40, 1024) + f.set_mel_coeffs_htk(44100, 0, 44100 / 2) + + if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_fvec.py b/python/tests/test_fvec.py index 4e50f0f..765c9fe 100755 --- a/python/tests/test_fvec.py +++ b/python/tests/test_fvec.py @@ -1,6 +1,5 @@ #! 
/usr/bin/env python -from unittest import main import numpy as np from numpy.testing import TestCase, assert_equal, assert_almost_equal from aubio import fvec, zero_crossing_rate, alpha_norm, min_removal @@ -60,6 +59,14 @@ class aubio_fvec_wrong_values(TestCase): self.assertRaises(IndexError, a.__getitem__, 3) self.assertRaises(IndexError, a.__getitem__, 2) + def test_wrong_dimensions(self): + a = np.array([[[1, 2], [3, 4]]], dtype=float_type) + self.assertRaises(ValueError, fvec, a) + + def test_wrong_size(self): + a = np.ndarray([0,], dtype=float_type) + self.assertRaises(ValueError, fvec, a) + class aubio_wrong_fvec_input(TestCase): """ uses min_removal to test PyAubio_IsValidVector """ @@ -140,4 +147,5 @@ class aubio_fvec_test_memory(TestCase): del c if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_fvec_shift.py b/python/tests/test_fvec_shift.py new file mode 100644 index 0000000..c7a0315 --- /dev/null +++ b/python/tests/test_fvec_shift.py @@ -0,0 +1,35 @@ +#! /usr/bin/env python + +import numpy as np +from numpy.testing import TestCase, assert_equal +import aubio + +class aubio_shift_test_case(TestCase): + + def run_shift_ishift(self, n): + ramp = np.arange(n, dtype=aubio.float_type) + # construct expected output + # even length: [5. 6. 7. 8. 9. 0. 1. 2. 3. 4.] + # odd length: [4. 5. 6. 0. 1. 2. 3.] 
+        half = n - n//2
+        expected = np.concatenate([np.arange(half, n), np.arange(half)])
+        # shift in place, returns modified copy
+        assert_equal(aubio.shift(ramp), expected)
+        # check input was changed as expected
+        assert_equal(ramp, expected)
+        # construct expected output
+        expected = np.arange(n)
+        # revert shift in place, returns modified copy
+        assert_equal(aubio.ishift(ramp), expected)
+        # check input was shifted back
+        assert_equal(ramp, expected)
+
+    def test_can_shift_fvec(self):
+        self.run_shift_ishift(10)
+
+    def test_can_shift_fvec_odd(self):
+        self.run_shift_ishift(7)
+
+if __name__ == '__main__':
+    from unittest import main
+    main()
diff --git a/python/tests/test_hztomel.py b/python/tests/test_hztomel.py
new file mode 100755
index 0000000..a1f4f8e
--- /dev/null
+++ b/python/tests/test_hztomel.py
@@ -0,0 +1,121 @@
+#! /usr/bin/env python
+
+from unittest import main
+from numpy.testing import TestCase
+from numpy.testing import assert_equal, assert_almost_equal
+from _tools import assert_warns
+from utils import is32bit
+import numpy as np
+import aubio
+
+from aubio import hztomel, meltohz
+from aubio import hztomel_htk, meltohz_htk
+
+class aubio_hztomel_test_case(TestCase):
+
+    def test_hztomel(self):
+        assert_equal(hztomel(0.), 0.)
+        assert_almost_equal(hztomel(400. / 3.), 2., decimal=5)
+        assert_almost_equal(hztomel(1000. / 3), 5.)
+        # on 32bit, some of these tests fail unless compiling with -ffloat-store
+        try:
+            assert_equal(hztomel(200.), 3.)
+        except AssertionError:
+            if not is32bit(): raise
+            assert_almost_equal(hztomel(200.), 3., decimal=5)
+        assert_almost_equal(hztomel(1000.), 15)
+        assert_almost_equal(hztomel(6400), 42, decimal=5)
+        assert_almost_equal(hztomel(40960), 69, decimal=5)
+
+        for m in np.linspace(0, 1000, 100):
+            assert_almost_equal(hztomel(meltohz(m)) - m, 0, decimal=3)
+
+    def test_meltohz(self):
+        assert_equal(meltohz(0.), 0.)
+        assert_almost_equal(meltohz(2), 400. / 3., decimal=4)
+        try:
+            assert_equal(meltohz(3.), 200.)
+ except AssertionError: + if not is32bit(): raise + assert_almost_equal(meltohz(3.), 200., decimal=5) + assert_almost_equal(meltohz(5), 1000. / 3., decimal=4) + assert_almost_equal(meltohz(15), 1000., decimal=4) + assert_almost_equal(meltohz(42), 6400., decimal=2) + assert_almost_equal(meltohz(69), 40960., decimal=1) + + for f in np.linspace(0, 20000, 1000): + assert_almost_equal(meltohz(hztomel(f)) - f, 0, decimal=1) + + def test_meltohz_negative(self): + with assert_warns(UserWarning): + assert_equal(meltohz(-1), 0) + + def test_hztomel_negative(self): + with assert_warns(UserWarning): + assert_equal(hztomel(-1), 0) + + +class aubio_hztomel_htk_test_case(TestCase): + + def test_meltohz(self): + assert_equal(meltohz(0, htk=True), 0) + assert_almost_equal(meltohz(2595, htk=True), 6300., decimal=1) + + def test_hztomel(self): + assert_equal(hztomel(0, htk=True), 0) + assert_almost_equal(hztomel(3428.7, htk=True), 2000., decimal=1) + assert_almost_equal(hztomel(6300, htk=True), 2595., decimal=1) + + def test_meltohz_negative(self): + with assert_warns(UserWarning): + assert_equal(meltohz(-1, htk=True), 0) + assert_almost_equal(meltohz(2000, htk=True), 3428.7, decimal=1) + assert_almost_equal(meltohz(1000, htk=True), 1000., decimal=1) + + def test_hztomel_negative(self): + with assert_warns(UserWarning): + assert_equal(meltohz(-1, htk=True), 0) + with assert_warns(UserWarning): + assert_equal(hztomel(-1, htk=True), 0) + assert_almost_equal(hztomel(1000, htk=True), 1000., decimal=1) + + def test_hztomel_htk(self): + for f in np.linspace(0, 20000, 1000): + assert_almost_equal(meltohz_htk(hztomel_htk(f)) - f, 0, decimal=1) + for f in np.linspace(0, 20000, 1000): + assert_almost_equal(hztomel_htk(meltohz_htk(f)) - f, 0, decimal=1) + + +class aubio_hztomel_wrong_values(TestCase): + """ more tests to cover all branches """ + + def test_hztomel_wrong_values(self): + with self.assertRaises(TypeError): + hztomel('s') + + def test_meltohz_wrong_values(self): + with 
self.assertRaises(TypeError): + meltohz(bytes('ad')) + + def test_meltohz_no_arg(self): + with self.assertRaises(TypeError): + meltohz() + + def test_meltohz_htk_no_arg(self): + with self.assertRaises(TypeError): + meltohz_htk() + + def test_hztomel_htk_wrong_values(self): + with self.assertRaises(TypeError): + hztomel_htk('0') + + def test_hztomel_htk_false(self): + assert hztomel(120, htk=False) == hztomel(120) + + def test_meltohz_htk_false(self): + assert meltohz(12, htk=False) == meltohz(12) + + +if __name__ == '__main__': + from unittest import main + main() diff --git a/python/tests/test_mathutils.py b/python/tests/test_mathutils.py index f68fb11..ee68966 100755 --- a/python/tests/test_mathutils.py +++ b/python/tests/test_mathutils.py @@ -1,6 +1,5 @@ #! /usr/bin/env python -from unittest import main from numpy.testing import TestCase, assert_equal from numpy import array, arange, isnan, isinf from aubio import bintomidi, miditobin, freqtobin, bintofreq, freqtomidi, miditofreq @@ -101,4 +100,5 @@ class aubio_mathutils(TestCase): assert_equal ( array(b) < 0, False ) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_mfcc.py b/python/tests/test_mfcc.py index e7f3b18..fed7eb8 100755 --- a/python/tests/test_mfcc.py +++ b/python/tests/test_mfcc.py @@ -1,7 +1,6 @@ #! 
/usr/bin/env python -from nose2 import main -from nose2.tools import params +from _tools import parametrize, assert_raises from numpy import random, count_nonzero from numpy.testing import TestCase from aubio import mfcc, cvec, float_type @@ -15,28 +14,21 @@ samplerate = 44100 new_params = ['buf_size', 'n_filters', 'n_coeffs', 'samplerate'] new_deflts = [1024, 40, 13, 44100] -class aubio_mfcc(TestCase): +class Test_aubio_mfcc(object): - def setUp(self): - self.o = mfcc() + members_args = 'name' - def test_default_creation(self): - pass - - def test_delete(self): - del self.o - - @params(*new_params) + @parametrize(members_args, new_params) def test_read_only_member(self, name): - o = self.o - with self.assertRaises((TypeError, AttributeError)): + o = mfcc() + with assert_raises((TypeError, AttributeError)): setattr(o, name, 0) - @params(*zip(new_params, new_deflts)) + @parametrize('name, expected', zip(new_params, new_deflts)) def test_default_param(self, name, expected): """ test mfcc.{:s} = {:d} """.format(name, expected) - o = self.o - self.assertEqual( getattr(o, name), expected) + o = mfcc() + assert getattr(o, name) == expected class aubio_mfcc_wrong_params(TestCase): @@ -82,9 +74,9 @@ class aubio_mfcc_compute(TestCase): #print coeffs -class aubio_mfcc_all_parameters(TestCase): +class Test_aubio_mfcc_all_parameters(object): - @params( + run_values = [ (2048, 40, 13, 44100), (1024, 40, 13, 44100), (512, 40, 13, 44100), @@ -100,7 +92,10 @@ class aubio_mfcc_all_parameters(TestCase): #(1024, 30, 20, 44100), (1024, 40, 40, 44100), (1024, 40, 3, 44100), - ) + ] + run_args = ['buf_size', 'n_filters', 'n_coeffs', 'samplerate'] + + @parametrize(run_args, run_values) def test_run_with_params(self, buf_size, n_filters, n_coeffs, samplerate): " check mfcc can run with reasonable parameters " o = mfcc(buf_size, n_filters, n_coeffs, samplerate) @@ -110,5 +105,43 @@ class aubio_mfcc_all_parameters(TestCase): o(spec) #print coeffs + +class aubio_mfcc_fb_params(TestCase): + + 
def test_set_scale(self): + buf_size, n_filters, n_coeffs, samplerate = 512, 20, 10, 16000 + m = mfcc(buf_size, n_filters, n_coeffs, samplerate) + m.set_scale(10.5) + assert m.get_scale() == 10.5 + m(cvec(buf_size)) + + def test_set_power(self): + buf_size, n_filters, n_coeffs, samplerate = 512, 20, 10, 16000 + m = mfcc(buf_size, n_filters, n_coeffs, samplerate) + m.set_power(2.5) + assert m.get_power() == 2.5 + m(cvec(buf_size)) + + def test_set_mel_coeffs(self): + buf_size, n_filters, n_coeffs, samplerate = 512, 20, 10, 16000 + m = mfcc(buf_size, n_filters, n_coeffs, samplerate) + m.set_mel_coeffs(0., samplerate/2.) + m(cvec(buf_size)) + + def test_set_mel_coeffs_htk(self): + buf_size, n_filters, n_coeffs, samplerate = 512, 20, 10, 16000 + m = mfcc(buf_size, n_filters, n_coeffs, samplerate) + m.set_mel_coeffs_htk(0., samplerate/2.) + m(cvec(buf_size)) + + def test_set_mel_coeffs_slaney(self): + buf_size, n_filters, n_coeffs, samplerate = 512, 40, 10, 16000 + m = mfcc(buf_size, n_filters, n_coeffs, samplerate) + m.set_mel_coeffs_slaney() + m(cvec(buf_size)) + assert m.get_power() == 1 + assert m.get_scale() == 1 + if __name__ == '__main__': - main() + from _tools import run_module_suite + run_module_suite() diff --git a/python/tests/test_midi2note.py b/python/tests/test_midi2note.py index 1c2ccf5..5451c58 100755 --- a/python/tests/test_midi2note.py +++ b/python/tests/test_midi2note.py @@ -2,7 +2,7 @@ # -*- coding: utf-8 -*- from aubio import midi2note -import unittest +from _tools import parametrize, assert_raises list_of_known_midis = ( ( 0, 'C-1' ), @@ -14,30 +14,31 @@ list_of_known_midis = ( ( 127, 'G9' ), ) -class midi2note_good_values(unittest.TestCase): +class Test_midi2note_good_values(object): - def test_midi2note_known_values(self): + @parametrize('midi, note', list_of_known_midis) + def test_midi2note_known_values(self, midi, note): " known values are correctly converted " - for midi, note in list_of_known_midis: - self.assertEqual ( midi2note(midi), 
note ) + assert midi2note(midi) == (note) -class midi2note_wrong_values(unittest.TestCase): +class Test_midi2note_wrong_values(object): def test_midi2note_negative_value(self): " fails when passed a negative value " - self.assertRaises(ValueError, midi2note, -2) + assert_raises(ValueError, midi2note, -2) def test_midi2note_large(self): " fails when passed a value greater than 127 " - self.assertRaises(ValueError, midi2note, 128) + assert_raises(ValueError, midi2note, 128) def test_midi2note_floating_value(self): " fails when passed a floating point " - self.assertRaises(TypeError, midi2note, 69.2) + assert_raises(TypeError, midi2note, 69.2) def test_midi2note_character_value(self): " fails when passed a value that can not be transformed to integer " - self.assertRaises(TypeError, midi2note, "a") + assert_raises(TypeError, midi2note, "a") if __name__ == '__main__': - unittest.main() + from _tools import run_module_suite + run_module_suite() diff --git a/python/tests/test_musicutils.py b/python/tests/test_musicutils.py index dd54abb..eaa774f 100755 --- a/python/tests/test_musicutils.py +++ b/python/tests/test_musicutils.py @@ -1,9 +1,8 @@ #! 
/usr/bin/env python -from unittest import main import numpy as np from numpy.testing import TestCase -from numpy.testing.utils import assert_equal, assert_almost_equal +from numpy.testing import assert_equal, assert_almost_equal from aubio import window, level_lin, db_spl, silence_detection, level_detection from aubio import fvec, float_type @@ -85,4 +84,5 @@ class aubio_level_detection(TestCase): assert level_detection(ones(1024, dtype = float_type), -70) == 0 if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_note2midi.py b/python/tests/test_note2midi.py index 968c34a..0608195 100755 --- a/python/tests/test_note2midi.py +++ b/python/tests/test_note2midi.py @@ -3,8 +3,9 @@ from __future__ import unicode_literals -from aubio import note2midi, freq2note -import unittest +from aubio import note2midi, freq2note, note2freq, float_type +from numpy.testing import TestCase +from _tools import parametrize, assert_raises, skipTest list_of_known_notes = ( ( 'C-1', 0 ), @@ -13,26 +14,62 @@ list_of_known_notes = ( ( 'C3', 48 ), ( 'B3', 59 ), ( 'B#3', 60 ), + ( 'C♯4', 61 ), ( 'A4', 69 ), ( 'A#4', 70 ), + ( 'A♯4', 70 ), + ( 'A\u266f4', 70 ), ( 'Bb4', 70 ), ( 'B♭4', 70 ), + ( 'B\u266d4', 70 ), ( 'G8', 115 ), ( 'G♯8', 116 ), ( 'G9', 127 ), - ( 'G\udd2a2', 45 ), - ( 'B\ufffd2', 45 ), ( 'A♮2', 45 ), ) -class note2midi_good_values(unittest.TestCase): +list_of_known_notes_with_unicode_issues = ( + ('C𝄪4', 62 ), + ('E𝄫4', 62 ), + ) - def test_note2midi_known_values(self): - " known values are correctly converted " - for note, midi in list_of_known_notes: - self.assertEqual ( note2midi(note), midi ) +list_of_unknown_notes = ( + ( 'G\udd2a2' ), + ( 'B\ufffd2' ), + ( 'B\u266e\u266e2' ), + ( 'B\u266f\u266d3' ), + ( 'B33' ), + ( 'C.3' ), + ( 'A' ), + ( '2' ), + ) -class note2midi_wrong_values(unittest.TestCase): +class Test_note2midi_good_values(object): + + @parametrize('note, midi', list_of_known_notes) + def test_note2midi_known_values(self, 
note, midi): + " known values are correctly converted " + assert note2midi(note) == midi + + @parametrize('note, midi', list_of_known_notes_with_unicode_issues) + def test_note2midi_known_values_with_unicode_issues(self, note, midi): + " difficult values are correctly converted unless expected failure " + try: + assert note2midi(note) == midi + except UnicodeEncodeError as e: + # platforms with decoding failures include: + # - osx: python <= 2.7.10 + # - win: python <= 2.7.12 + import sys + strmsg = "len(u'\\U0001D12A') != 1, expected decoding failure" + strmsg += " | upgrade to Python 3 to fix" + strmsg += " | {:s} | {:s} {:s}" + if len('\U0001D12A') != 1 and sys.version[0] == '2': + skipTest(strmsg.format(repr(e), sys.platform, sys.version)) + else: + raise + +class note2midi_wrong_values(TestCase): def test_note2midi_missing_octave(self): " fails when passed only one character" @@ -66,12 +103,40 @@ class note2midi_wrong_values(unittest.TestCase): " fails when passed a non-string value " self.assertRaises(TypeError, note2midi, 123) + def test_note2midi_wrong_data_too_long(self): + " fails when passed a note with a note name longer than expected" + self.assertRaises(ValueError, note2midi, 'CB+-3') + +class Test_note2midi_unknown_values(object): -class freq2note_simple_test(unittest.TestCase): + @parametrize('note', list_of_unknown_notes) + def test_note2midi_unknown_values(self, note): + " unknown values throw out an error " + assert_raises(ValueError, note2midi, note) - def test_freq2note(self): +class freq2note_simple_test(TestCase): + + def test_freq2note_above(self): " make sure freq2note(441) == A4 " self.assertEqual("A4", freq2note(441)) + def test_freq2note_under(self): + " make sure freq2note(439) == A4 " + self.assertEqual("A4", freq2note(439)) + +class note2freq_simple_test(TestCase): + + def test_note2freq(self): + " make sure note2freq('A3') == 220" + self.assertEqual(220, note2freq("A3")) + + def test_note2freq_under(self): + " make sure note2freq(A4) 
== 440" + if float_type == 'float32': + self.assertEqual(440, note2freq("A4")) + else: + self.assertLess(abs(note2freq("A4")-440), 1.e-12) + if __name__ == '__main__': - unittest.main() + from _tools import run_module_suite + run_module_suite() diff --git a/python/tests/test_notes.py b/python/tests/test_notes.py new file mode 100755 index 0000000..a95d010 --- /dev/null +++ b/python/tests/test_notes.py @@ -0,0 +1,94 @@ +#! /usr/bin/env python + +from numpy.testing import TestCase, assert_equal, assert_almost_equal +from aubio import notes, source +import numpy as np +from utils import list_all_sounds + +list_of_sounds = list_all_sounds('sounds') + +AUBIO_DEFAULT_NOTES_SILENCE = -70. +AUBIO_DEFAULT_NOTES_RELEASE_DROP = 10. +AUBIO_DEFAULT_NOTES_MINIOI_MS = 30. + +class aubio_notes_default(TestCase): + + def test_members(self): + o = notes() + assert_equal ([o.buf_size, o.hop_size, o.method, o.samplerate], + [1024,512,'default',44100]) + + +class aubio_notes_params(TestCase): + + samplerate = 44100 + + def setUp(self): + self.o = notes(samplerate = self.samplerate) + + def test_get_minioi_ms(self): + assert_equal (self.o.get_minioi_ms(), AUBIO_DEFAULT_NOTES_MINIOI_MS) + + def test_set_minioi_ms(self): + val = 40. 
+        self.o.set_minioi_ms(val)
+        assert_almost_equal (self.o.get_minioi_ms(), val)
+
+    def test_get_silence(self):
+        assert_equal (self.o.get_silence(), AUBIO_DEFAULT_NOTES_SILENCE)
+
+    def test_set_silence(self):
+        val = -50
+        self.o.set_silence(val)
+        assert_equal (self.o.get_silence(), val)
+
+    def test_get_release_drop(self):
+        assert_equal (self.o.get_release_drop(), AUBIO_DEFAULT_NOTES_RELEASE_DROP)
+
+    def test_set_release_drop(self):
+        val = 50
+        self.o.set_release_drop(val)
+        assert_equal (self.o.get_release_drop(), val)
+
+    def test_set_release_drop_wrong(self):
+        val = -10
+        with self.assertRaises(ValueError):
+            self.o.set_release_drop(val)
+
+class aubio_notes_sinewave(TestCase):
+
+    def analyze_file(self, filepath, samplerate=0):
+        win_s = 512 # fft size
+        hop_s = 256 # hop size
+
+        s = source(filepath, samplerate, hop_s)
+        samplerate = s.samplerate
+
+        tolerance = 0.8
+
+        notes_o = notes("default", win_s, hop_s, samplerate)
+        total_frames = 0
+
+        results = []
+        while True:
+            samples, read = s()
+            new_note = notes_o(samples)
+            if (new_note[0] != 0):
+                note_str = ' '.join(["%.2f" % i for i in new_note])
+                results.append( [total_frames, np.copy(new_note)] )
+            total_frames += read
+            if read < hop_s: break
+        return results
+
+    def test_sinewave(self):
+        for filepath in list_of_sounds:
+            if '44100Hz_44100f_sine441.wav' in filepath:
+                results = self.analyze_file(filepath)
+                assert_equal (len(results), 1)
+                assert_equal (len(results[0]), 2)
+                assert_equal (results[0][0], 1280)
+                assert_equal (results[0][1], [69, 123, -1])
+
+if __name__ == '__main__':
+    from unittest import main
+    main()
diff --git a/python/tests/test_onset.py b/python/tests/test_onset.py
index dcb6dab..08edbee 100755
--- a/python/tests/test_onset.py
+++ b/python/tests/test_onset.py
@@ -1,8 +1,7 @@
 #! /usr/bin/env python
 
-from unittest import main
 from numpy.testing import TestCase, assert_equal, assert_almost_equal
-from aubio import onset
+from aubio import onset, fvec
 
 class aubio_onset_default(TestCase):
 
@@ -19,25 +18,25 @@
         self.o = onset(samplerate = self.samplerate)
 
     def test_get_delay(self):
-        assert_equal (self.o.get_delay(), int(4.3 * self.o.hop_size))
+        self.assertGreater(self.o.get_delay(), 0)
 
     def test_get_delay_s(self):
-        assert_almost_equal (self.o.get_delay_s(), self.o.get_delay() / float(self.samplerate))
+        self.assertGreater(self.o.get_delay_s(), 0.)
 
     def test_get_delay_ms(self):
-        assert_almost_equal (self.o.get_delay_ms(), self.o.get_delay() * 1000. / self.samplerate, 5)
+        self.assertGreater(self.o.get_delay_ms(), 0.)
 
     def test_get_minioi(self):
-        assert_almost_equal (self.o.get_minioi(), 0.02 * self.samplerate)
+        self.assertGreater(self.o.get_minioi(), 0)
 
     def test_get_minioi_s(self):
-        assert_almost_equal (self.o.get_minioi_s(), 0.02)
+        self.assertGreater(self.o.get_minioi_s(), 0.)
 
     def test_get_minioi_ms(self):
-        assert_equal (self.o.get_minioi_ms(), 20.)
+        self.assertGreater(self.o.get_minioi_ms(), 0.)
 
     def test_get_threshold(self):
-        assert_almost_equal (self.o.get_threshold(), 0.3)
+        self.assertGreater(self.o.get_threshold(), 0.)
     def test_set_delay(self):
         val = 256
@@ -83,5 +82,38 @@ class aubio_onset_32000(aubio_onset_params):
     samplerate = 32000
 
 class aubio_onset_8000(aubio_onset_params):
     samplerate = 8000
 
+class aubio_onset_coverage(TestCase):
+    # extra tests to execute the C routines and improve coverage
+
+    def test_all_methods(self):
+        for method in ['default', 'energy', 'hfc', 'complexdomain', 'complex',
+                'phase', 'wphase', 'mkl', 'kl', 'specflux', 'specdiff',
+                'old_default']:
+            o = onset(method=method, buf_size=512, hop_size=256)
+            o(fvec(256))
+
+    def test_get_methods(self):
+        o = onset(method='default', buf_size=512, hop_size=256)
+
+        assert o.get_silence() == -70
+        o.set_silence(-20)
+        assert_almost_equal(o.get_silence(), -20)
+
+        assert o.get_compression() == 1
+        o.set_compression(.99)
+        assert_almost_equal(o.get_compression(), .99)
+
+        assert o.get_awhitening() == 0
+        o.set_awhitening(1)
+        assert o.get_awhitening() == 1
+
+        o.get_last()
+        o.get_last_ms()
+        o.get_last_s()
+        o.get_descriptor()
+        o.get_thresholded_descriptor()
+
+
 if __name__ == '__main__':
+    from unittest import main
     main()
diff --git a/python/tests/test_phasevoc.py b/python/tests/test_phasevoc.py
index 957d3b1..b228269 100755
--- a/python/tests/test_phasevoc.py
+++ b/python/tests/test_phasevoc.py
@@ -1,9 +1,8 @@
 #! /usr/bin/env python
 
 from numpy.testing import TestCase, assert_equal, assert_array_less
+from _tools import parametrize, skipTest
 from aubio import fvec, cvec, pvoc, float_type
-from nose2 import main
-from nose2.tools import params
 import numpy as np
 
 if float_type == 'float32':
@@ -18,7 +17,7 @@ def create_sine(hop_s, freq, samplerate):
 def create_noise(hop_s):
     return np.random.rand(hop_s).astype(float_type) * 2. - 1.
-class aubio_pvoc_test_case(TestCase): +class Test_aubio_pvoc_test_case(object): """ pvoc object test case """ def test_members_automatic_sizes_default(self): @@ -52,11 +51,21 @@ class aubio_pvoc_test_case(TestCase): assert_equal (s.phas[s.phas > 0], +np.pi) assert_equal (s.phas[s.phas < 0], -np.pi) assert_equal (np.abs(s.phas[np.abs(s.phas) != np.pi]), 0) - self.skipTest('pvoc(fvec(%d)).phas != +0, ' % win_s \ + skipTest('pvoc(fvec(%d)).phas != +0, ' % win_s \ + 'This is expected when using fftw3 on powerpc.') assert_equal ( r, 0.) - @params( + def test_no_overlap(self): + win_s, hop_s = 1024, 1024 + f = pvoc (win_s, hop_s) + t = fvec (hop_s) + for _ in range(4): + s = f(t) + r = f.rdo(s) + assert_equal ( t, 0.) + + resynth_noise_args = "hop_s, ratio" + resynth_noise_values = [ ( 256, 8), ( 256, 4), ( 256, 2), @@ -78,13 +87,16 @@ class aubio_pvoc_test_case(TestCase): (8192, 8), (8192, 4), (8192, 2), - ) + ] + + @parametrize(resynth_noise_args, resynth_noise_values) def test_resynth_steps_noise(self, hop_s, ratio): """ check the resynthesis of a random signal is correct """ sigin = create_noise(hop_s) self.reconstruction(sigin, hop_s, ratio) - @params( + resynth_sine_args = "samplerate, hop_s, ratio, freq" + resynth_sine_values = [ (44100, 256, 8, 441), (44100, 256, 4, 1203), (44100, 256, 2, 3045), @@ -99,7 +111,9 @@ class aubio_pvoc_test_case(TestCase): (22050, 256, 8, 445), (96000, 1024, 8, 47000), (96000, 1024, 8, 20), - ) + ] + + @parametrize(resynth_sine_args, resynth_sine_values) def test_resynth_steps_sine(self, samplerate, hop_s, ratio, freq): """ check the resynthesis of a sine is correct """ sigin = create_sine(hop_s, freq, samplerate) @@ -190,5 +204,5 @@ class aubio_pvoc_wrong_params(TestCase): self.skipTest('creating aubio.pvoc with size %d did not fail' % win_s) if __name__ == '__main__': + from unittest import main main() - diff --git a/python/tests/test_pitch.py b/python/tests/test_pitch.py index 00c8eea..6305532 100755 --- 
a/python/tests/test_pitch.py +++ b/python/tests/test_pitch.py @@ -1,7 +1,6 @@ #! /usr/bin/env python -from unittest import TestCase, main -from numpy.testing import assert_equal +from numpy.testing import TestCase, assert_equal from numpy import sin, arange, mean, median, isnan, pi from aubio import fvec, pitch, freqtomidi, float_type @@ -70,8 +69,8 @@ class aubio_pitch_Sinusoid(TestCase): #print 'len(pitches), cut:', len(pitches), cut #print 'median errors: ', median(errors), 'median pitches: ', median(pitches) -pitch_algorithms = [ "default", "yinfft", "yin", "schmitt", "mcomb", "fcomb" , "specacf" ] -pitch_algorithms = [ "default", "yinfft", "yin", "schmitt", "mcomb", "fcomb" ] +pitch_algorithms = [ "default", "yinfft", "yin", "yinfast", "schmitt", "mcomb", "fcomb" , "specacf" ] +pitch_algorithms = [ "default", "yinfft", "yin", "yinfast", "schmitt", "mcomb", "fcomb" ] #freqs = [ 27.5, 55., 110., 220., 440., 880., 1760., 3520. ] freqs = [ 110., 220., 440., 880., 1760., 3520. ] @@ -116,10 +115,11 @@ def create_test (algo, mode): for algo in pitch_algorithms: for mode in signal_modes: - test_method = create_test (algo, mode) - test_method.__name__ = 'test_pitch_%s_%d_%d_%dHz_sin_%.0f' % ( algo, + _test_method = create_test (algo, mode) + _test_method.__name__ = 'test_pitch_%s_%d_%d_%dHz_sin_%.0f' % ( algo, mode[0], mode[1], mode[2], mode[3] ) - setattr (aubio_pitch_Sinusoid, test_method.__name__, test_method) + setattr (aubio_pitch_Sinusoid, _test_method.__name__, _test_method) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_sink.py b/python/tests/test_sink.py index 7b397ec..9516fe4 100755 --- a/python/tests/test_sink.py +++ b/python/tests/test_sink.py @@ -1,10 +1,10 @@ #! 
/usr/bin/env python -from nose2 import main -from nose2.tools import params from numpy.testing import TestCase from aubio import fvec, source, sink -from .utils import list_all_sounds, get_tmp_sink_path, del_tmp_sink_path +from utils import list_all_sounds, get_tmp_sink_path, del_tmp_sink_path +from utils import parse_file_samplerate +from _tools import parametrize, skipTest, assert_raises, assert_warns list_of_sounds = list_all_sounds('sounds') samplerates = [0, 44100, 8000, 32000] @@ -20,11 +20,27 @@ for soundfile in list_of_sounds: for samplerate in samplerates: all_params.append((hop_size, samplerate, soundfile)) -class aubio_sink_test_case(TestCase): +class Test_aubio_sink(object): - def setUp(self): - if not len(list_of_sounds): - self.skipTest('add some sound files in \'python/tests/sounds\'') + def test_wrong_filename(self): + with assert_raises(RuntimeError): + sink('') + + def test_wrong_samplerate(self): + with assert_raises(RuntimeError): + sink(get_tmp_sink_path(), -1) + + def test_wrong_samplerate_too_large(self): + with assert_raises(RuntimeError): + sink(get_tmp_sink_path(), 1536001, 2) + + def test_wrong_channels(self): + with assert_raises(RuntimeError): + sink(get_tmp_sink_path(), 44100, -1) + + def test_wrong_channels_too_large(self): + with assert_raises(RuntimeError): + sink(get_tmp_sink_path(), 44100, 202020) def test_many_sinks(self): from tempfile import mkdtemp @@ -43,13 +59,19 @@ class aubio_sink_test_case(TestCase): g.close() shutil.rmtree(tmpdir) - @params(*all_params) + @parametrize('hop_size, samplerate, path', all_params) def test_read_and_write(self, hop_size, samplerate, path): - + orig_samplerate = parse_file_samplerate(soundfile) try: - f = source(path, samplerate, hop_size) + if orig_samplerate is not None and orig_samplerate < samplerate: + # upsampling should emit a warning + with assert_warns(UserWarning): + f = source(soundfile, samplerate, hop_size) + else: + f = source(soundfile, samplerate, hop_size) except RuntimeError 
as e: - self.skipTest('failed opening with hop_s = {:d}, samplerate = {:d} ({:s})'.format(hop_size, samplerate, str(e))) + err_msg = '{:s} (hop_s = {:d}, samplerate = {:d})' + skipTest(err_msg.format(str(e), hop_size, samplerate)) if samplerate == 0: samplerate = f.samplerate sink_path = get_tmp_sink_path() g = sink(sink_path, samplerate) @@ -61,12 +83,19 @@ class aubio_sink_test_case(TestCase): if read < f.hop_size: break del_tmp_sink_path(sink_path) - @params(*all_params) + @parametrize('hop_size, samplerate, path', all_params) def test_read_and_write_multi(self, hop_size, samplerate, path): + orig_samplerate = parse_file_samplerate(soundfile) try: - f = source(path, samplerate, hop_size) + if orig_samplerate is not None and orig_samplerate < samplerate: + # upsampling should emit a warning + with assert_warns(UserWarning): + f = source(soundfile, samplerate, hop_size) + else: + f = source(soundfile, samplerate, hop_size) except RuntimeError as e: - self.skipTest('failed opening with hop_s = {:d}, samplerate = {:d} ({:s})'.format(hop_size, samplerate, str(e))) + err_msg = '{:s} (hop_s = {:d}, samplerate = {:d})' + skipTest(err_msg.format(str(e), hop_size, samplerate)) if samplerate == 0: samplerate = f.samplerate sink_path = get_tmp_sink_path() g = sink(sink_path, samplerate, channels = f.channels) @@ -93,5 +122,14 @@ class aubio_sink_test_case(TestCase): g.close() del_tmp_sink_path(sink_path) + def test_read_with(self): + samplerate = 44100 + sink_path = get_tmp_sink_path() + vec = fvec(128) + with sink(sink_path, samplerate) as g: + for _ in range(10): + g(vec, 128) + if __name__ == '__main__': - main() + from _tools import run_module_suite + run_module_suite() diff --git a/python/tests/test_slicing.py b/python/tests/test_slicing.py index c96ba52..18391e9 100755 --- a/python/tests/test_slicing.py +++ b/python/tests/test_slicing.py @@ -1,10 +1,9 @@ #! 
/usr/bin/env python -from unittest import main from numpy.testing import TestCase, assert_equal from aubio import slice_source_at_stamps -from .utils import count_files_in_directory, get_default_test_sound -from .utils import count_samples_in_directory, count_samples_in_file +from utils import count_files_in_directory, get_default_test_sound +from utils import count_samples_in_directory, count_samples_in_file import tempfile import shutil @@ -23,19 +22,27 @@ class aubio_slicing_test_case(TestCase): def test_slice_start_only_no_zero(self): regions_start = [i*1000 for i in range(1, n_slices)] - slice_source_at_stamps(self.source_file, regions_start, output_dir = self.output_dir) + slice_source_at_stamps(self.source_file, regions_start, + output_dir = self.output_dir, create_first=True) def test_slice_start_beyond_end(self): regions_start = [i*1000 for i in range(1, n_slices)] regions_start += [count_samples_in_file(self.source_file) + 1000] - slice_source_at_stamps(self.source_file, regions_start, output_dir = self.output_dir) + slice_source_at_stamps(self.source_file, regions_start, + output_dir = self.output_dir, create_first=True) def test_slice_start_every_blocksize(self): hopsize = 200 - regions_start = [i*hopsize for i in range(1, n_slices)] + regions_start = [i*hopsize for i in range(0, n_slices)] slice_source_at_stamps(self.source_file, regions_start, output_dir = self.output_dir, hopsize = 200) + def test_slice_start_every_half_blocksize(self): + hopsize = 200 + regions_start = [i*hopsize//2 for i in range(0, n_slices)] + slice_source_at_stamps(self.source_file, regions_start, + output_dir = self.output_dir, hopsize = 200) + def tearDown(self): original_samples = count_samples_in_file(self.source_file) written_samples = count_samples_in_directory(self.output_dir) @@ -91,6 +98,19 @@ class aubio_slicing_with_ends_test_case(TestCase): assert_equal(written_samples, expected_samples, "number of samples written different from number of original samples") + def 
test_slice_start_and_ends_with_missing_end(self): + regions_start = [i*1000 for i in range(n_slices)] + regions_ends = [r-1 for r in regions_start[1:]] + slice_source_at_stamps(self.source_file, regions_start, regions_ends, + output_dir = self.output_dir) + written_samples = count_samples_in_directory(self.output_dir) + original_samples = count_samples_in_file(self.source_file) + total_files = count_files_in_directory(self.output_dir) + assert_equal(n_slices, total_files, + "number of slices created different from expected") + assert_equal(written_samples, original_samples, + "number of samples written different from number of original samples") + def tearDown(self): shutil.rmtree(self.output_dir) @@ -133,7 +153,7 @@ class aubio_slicing_wrong_ends_test_case(TestCase): regions_start = [i*1000 for i in range(1, n_slices)] regions_end = None slice_source_at_stamps (self.source_file, regions_start, regions_end, - output_dir = self.output_dir) + output_dir = self.output_dir, create_first=True) total_files = count_files_in_directory(self.output_dir) assert_equal(n_slices, total_files, "number of slices created different from expected") @@ -146,4 +166,5 @@ class aubio_slicing_wrong_ends_test_case(TestCase): shutil.rmtree(self.output_dir) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_source.py b/python/tests/test_source.py index 4ec80fb..8b35e77 100755 --- a/python/tests/test_source.py +++ b/python/tests/test_source.py @@ -1,16 +1,18 @@ #! 
/usr/bin/env python -from nose2 import main -from nose2.tools import params -from numpy.testing import TestCase + +from numpy.testing import TestCase, assert_equal from aubio import source -from .utils import list_all_sounds +from utils import list_all_sounds, parse_file_samplerate +import unittest +from _tools import assert_raises, assert_equal, assert_warns +from _tools import parametrize, skipTest list_of_sounds = list_all_sounds('sounds') samplerates = [0, 44100, 8000, 32000] hop_sizes = [512, 1024, 64] -path = None +default_test_sound = len(list_of_sounds) and list_of_sounds[0] or None all_params = [] for soundfile in list_of_sounds: @@ -18,72 +20,103 @@ for soundfile in list_of_sounds: for samplerate in samplerates: all_params.append((hop_size, samplerate, soundfile)) +no_sounds_msg = "no test sounds, add some in 'python/tests/sounds/'!" -class aubio_source_test_case_base(TestCase): +_debug = False - def setUp(self): - if not len(list_of_sounds): self.skipTest('add some sound files in \'python/tests/sounds\'') - self.default_test_sound = list_of_sounds[0] -class aubio_source_test_case(aubio_source_test_case_base): +class Test_aubio_source_test_case(TestCase): + + def setUp(self): + if not default_test_sound: + skipTest(no_sounds_msg) def test_close_file(self): samplerate = 0 # use native samplerate hop_size = 256 - for p in list_of_sounds: - f = source(p, samplerate, hop_size) - f.close() + f = source(default_test_sound, samplerate, hop_size) + f.close() def test_close_file_twice(self): samplerate = 0 # use native samplerate hop_size = 256 - for p in list_of_sounds: - f = source(p, samplerate, hop_size) - f.close() - f.close() + f = source(default_test_sound, samplerate, hop_size) + f.close() + f.close() + + def test_read_after_close(self): + samplerate = 0 # use native samplerate + hop_size = 256 + f = source(default_test_sound, samplerate, hop_size) + read, frames = f() + f.close() + with assert_raises(RuntimeError): + read, frames = f() + with 
assert_raises(RuntimeError): + read, frames = f.do_multi() + -class aubio_source_read_test_case(aubio_source_test_case_base): +class Test_aubio_source_read(object): def read_from_source(self, f): total_frames = 0 while True: - _ , read = f() + samples , read = f() total_frames += read - if read < f.hop_size: break - #result_str = "read {:.2f}s ({:d} frames in {:d} blocks at {:d}Hz) from {:s}" - #result_params = total_frames / float(f.samplerate), total_frames, total_frames//f.hop_size, f.samplerate, f.uri - #print (result_str.format(*result_params)) + if read < f.hop_size: + assert_equal(samples[read:], 0) + break + if _debug: + result_str = "read {:.2f}s ({:d} frames" + result_str += " in {:d} blocks at {:d}Hz) from {:s}" + result_params = total_frames / float(f.samplerate), total_frames, \ + total_frames//f.hop_size, f.samplerate, f.uri + print (result_str.format(*result_params)) return total_frames - @params(*all_params) + @parametrize('hop_size, samplerate, soundfile', all_params) def test_samplerate_hopsize(self, hop_size, samplerate, soundfile): + orig_samplerate = parse_file_samplerate(soundfile) try: - f = source(soundfile, samplerate, hop_size) + if orig_samplerate is not None and orig_samplerate < samplerate: + # upsampling should emit a warning + with assert_warns(UserWarning): + f = source(soundfile, samplerate, hop_size) + else: + f = source(soundfile, samplerate, hop_size) except RuntimeError as e: - self.skipTest('failed opening with hop_s = {:d}, samplerate = {:d} ({:s})'.format(hop_size, samplerate, str(e))) + err_msg = 'failed opening with hop_s={:d}, samplerate={:d} ({:s})' + skipTest(err_msg.format(hop_size, samplerate, str(e))) assert f.samplerate != 0 - self.read_from_source(f) - - @params(*list_of_sounds) + read_frames = self.read_from_source(f) + if 'f_' in soundfile and samplerate == 0: + import re + f = re.compile(r'.*_\([0:9]*f\)_.*') + match_f = re.findall('([0-9]*)f_', soundfile) + if len(match_f) == 1: + expected_frames = 
int(match_f[0]) + assert_equal(expected_frames, read_frames) + + @parametrize('p', list_of_sounds) def test_samplerate_none(self, p): f = source(p) assert f.samplerate != 0 self.read_from_source(f) - @params(*list_of_sounds) + @parametrize('p', list_of_sounds) def test_samplerate_0(self, p): f = source(p, 0) assert f.samplerate != 0 self.read_from_source(f) - @params(*list_of_sounds) + @parametrize('p', list_of_sounds) def test_zero_hop_size(self, p): f = source(p, 0, 0) assert f.samplerate != 0 assert f.hop_size != 0 self.read_from_source(f) - @params(*list_of_sounds) + @parametrize('p', list_of_sounds) def test_seek_to_half(self, p): from random import randint f = source(p, 0, 0) @@ -95,7 +128,7 @@ class aubio_source_read_test_case(aubio_source_test_case_base): b = self.read_from_source(f) assert a == b + c - @params(*list_of_sounds) + @parametrize('p', list_of_sounds) def test_duration(self, p): total_frames = 0 f = source(p) @@ -104,54 +137,77 @@ class aubio_source_read_test_case(aubio_source_test_case_base): _, read = f() total_frames += read if read < f.hop_size: break - self.assertEqual(duration, total_frames) + assert_equal (duration, total_frames) -class aubio_source_test_wrong_params(TestCase): +class Test_aubio_source_wrong_params(object): def test_wrong_file(self): - with self.assertRaises(RuntimeError): + with assert_raises(RuntimeError): source('path_to/unexisting file.mp3') -class aubio_source_test_wrong_params_with_file(aubio_source_test_case_base): +@unittest.skipIf(default_test_sound is None, no_sounds_msg) +class Test_aubio_source_wrong_params_with_file(TestCase): def test_wrong_samplerate(self): - with self.assertRaises(ValueError): - source(self.default_test_sound, -1) + with assert_raises(ValueError): + source(default_test_sound, -1) def test_wrong_hop_size(self): - with self.assertRaises(ValueError): - source(self.default_test_sound, 0, -1) + with assert_raises(ValueError): + source(default_test_sound, 0, -1) def test_wrong_channels(self): - 
with self.assertRaises(ValueError): - source(self.default_test_sound, 0, 0, -1) + with assert_raises(ValueError): + source(default_test_sound, 0, 0, -1) def test_wrong_seek(self): - f = source(self.default_test_sound) - with self.assertRaises(ValueError): + f = source(default_test_sound) + with assert_raises(ValueError): f.seek(-1) def test_wrong_seek_too_large(self): - f = source(self.default_test_sound) + f = source(default_test_sound) try: - with self.assertRaises(ValueError): + with assert_raises(ValueError): f.seek(f.duration + f.samplerate * 10) - except AssertionError: - self.skipTest('seeking after end of stream failed raising ValueError') + except: + skipTest('seeking after end of stream failed raising ValueError') -class aubio_source_readmulti_test_case(aubio_source_read_test_case): +class Test_aubio_source_readmulti(Test_aubio_source_read): def read_from_source(self, f): total_frames = 0 while True: - _, read = f.do_multi() + samples, read = f.do_multi() total_frames += read - if read < f.hop_size: break - #result_str = "read {:.2f}s ({:d} frames in {:d} channels and {:d} blocks at {:d}Hz) from {:s}" - #result_params = total_frames / float(f.samplerate), total_frames, f.channels, int(total_frames/f.hop_size), f.samplerate, f.uri - #print (result_str.format(*result_params)) + if read < f.hop_size: + assert_equal(samples[:,read:], 0) + break + if _debug: + result_str = "read {:.2f}s ({:d} frames in {:d} channels" + result_str += " and {:d} blocks at {:d}Hz) from {:s}" + result_params = total_frames / float(f.samplerate), total_frames, \ + f.channels, int(total_frames/f.hop_size), \ + f.samplerate, f.uri + print (result_str.format(*result_params)) return total_frames +class Test_aubio_source_with(object): + + @parametrize('filename', list_of_sounds) + def test_read_from_mono(self, filename): + total_frames = 0 + hop_size = 2048 + with source(filename, 0, hop_size) as input_source: + assert_equal(input_source.hop_size, hop_size) + 
#assert_equal(input_source.samplerate, samplerate) + total_frames = 0 + for frames in input_source: + total_frames += frames.shape[-1] + # check we read as many samples as we expected + assert_equal(total_frames, input_source.duration) + if __name__ == '__main__': - main() + from _tools import run_module_suite + run_module_suite() diff --git a/python/tests/test_source_channels.py b/python/tests/test_source_channels.py new file mode 100755 index 0000000..eee696d --- /dev/null +++ b/python/tests/test_source_channels.py @@ -0,0 +1,93 @@ +#! /usr/bin/env python + +"""A brute force test using `sink` to create and write samples to a stereo +file, then `source` to check the correct content is read from the files.""" + +import os.path +import unittest +import aubio +import numpy as np +from numpy.testing import assert_equal +from utils import get_tmp_sink_path + +class aubio_source_test_case(unittest.TestCase): + + def test_read_from_mono(self): + out = get_tmp_sink_path() + samplerate = 44100 + hop_size = 256 + blocks = 10 + channels = 1 + write_samples = np.ones([channels, hop_size], dtype=aubio.float_type) + write_samples *= .5 + self.check_write_and_read(samplerate, channels, hop_size, blocks, + write_samples) + + def test_read_from_stereo(self): + out = get_tmp_sink_path() + samplerate = 44100 + hop_size = 256 + blocks = 10 + channels = 1 + write_samples = np.ones([channels, hop_size], dtype=aubio.float_type) + write_samples *= .5 + self.check_write_and_read(samplerate, channels, hop_size, blocks, + write_samples) + + def test_read_from_half_stereo(self): + samplerate = 16000 + channels = 2 + hop_size = 512 + blocks = 10 + write_samples = np.ones([channels, hop_size], dtype=aubio.float_type) + write_samples *= .5 + write_samples[1, :] = 0 + self.check_write_and_read(samplerate, channels, hop_size, blocks, + write_samples) + + def test_read_from_cancelling_channels(self): + samplerate = 16000 + channels = 2 + hop_size = 512 + blocks = 10 + write_samples = 
np.ones([channels, hop_size], dtype=aubio.float_type) + write_samples *= .5 + write_samples[1] *= -1 + self.check_write_and_read(samplerate, channels, hop_size, blocks, + write_samples) + + def test_read_from_strange_three_channels(self): + samplerate = 8000 + channels = 3 + hop_size = 123 + blocks = 10 + write_samples = np.ones([channels, hop_size], dtype=aubio.float_type) + write_samples *= .5 + write_samples[1, :] = 0 + self.check_write_and_read(samplerate, channels, hop_size, blocks, + write_samples) + + def check_write_and_read(self, samplerate, channels, + hop_size, blocks, write_samples): + expected_mono = np.sum(write_samples, axis=0)/write_samples.shape[0] + out = get_tmp_sink_path() + snk = aubio.sink(out, samplerate, channels=channels) + for i in range(blocks): + snk.do_multi(write_samples, hop_size) + # close the sink before reading from it + snk.close() + + src = aubio.source(out, samplerate, hop_size) + for i in range(blocks): + read_samples, read = src.do_multi() + assert_equal (read_samples, write_samples) + assert_equal (read, hop_size) + + src.seek(0) + for i in range(blocks): + read_samples, read = src() + assert_equal (read, hop_size) + assert_equal (read_samples, expected_mono) + +if __name__ == '__main__': + unittest.main() diff --git a/python/tests/test_specdesc.py b/python/tests/test_specdesc.py index 063d13d..32a46fe 100755 --- a/python/tests/test_specdesc.py +++ b/python/tests/test_specdesc.py @@ -1,6 +1,5 @@ #! /usr/bin/env python -from unittest import main from numpy.testing import TestCase, assert_equal, assert_almost_equal from numpy import random, arange, log, zeros from aubio import specdesc, cvec, float_type @@ -225,10 +224,9 @@ class aubio_specdesc_wrong(TestCase): specdesc("default", -10) def test_unknown(self): - # FIXME should fail? 
- with self.assertRaises(ValueError): + with self.assertRaises(RuntimeError): specdesc("unknown", 512) - self.skipTest('todo: new_specdesc should fail on wrong method') if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/test_tempo.py b/python/tests/test_tempo.py new file mode 100755 index 0000000..cf183c9 --- /dev/null +++ b/python/tests/test_tempo.py @@ -0,0 +1,91 @@ +#! /usr/bin/env python + +from unittest import main +from numpy.testing import TestCase, assert_equal, assert_almost_equal +import aubio + +class aubio_tempo_default(TestCase): + + def test_members(self): + o = aubio.tempo() + assert_equal ([o.buf_size, o.hop_size, o.method, o.samplerate], + [1024,512,'default',44100]) + +class aubio_tempo_params(TestCase): + + samplerate = 44100 + + def setUp(self): + self.o = aubio.tempo(samplerate = self.samplerate) + + def test_get_delay(self): + self.assertEqual(self.o.get_delay(), 0) + + def test_set_delay(self): + val = 256 + self.o.set_delay(val) + assert_equal (self.o.get_delay(), val) + + def test_get_delay_s(self): + self.assertEqual(self.o.get_delay_s(), 0.) + + def test_set_delay_s(self): + val = .05 + self.o.set_delay_s(val) + assert_almost_equal (self.o.get_delay_s(), val) + + def test_get_delay_ms(self): + self.assertEqual(self.o.get_delay_ms(), 0.) + + def test_set_delay_ms(self): + val = 50. + self.o.set_delay_ms(val) + assert_almost_equal (self.o.get_delay_ms(), val) + + def test_get_threshold(self): + assert_almost_equal(self.o.get_threshold(), 0.3) + + def test_set_threshold(self): + val = .1 + self.o.set_threshold(val) + assert_almost_equal (self.o.get_threshold(), val) + + def test_get_silence(self): + self.assertEqual(self.o.get_silence(), -90.) + + def test_set_silence(self): + val = -50. + self.o.set_silence(val) + assert_almost_equal (self.o.get_silence(), val) + + def test_get_last(self): + self.assertEqual(self.o.get_last(), 0.) + + def test_get_last_s(self): + self.assertEqual(self.o.get_last_s(), 0.) 
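The tempo delay tests above exercise three unit variants of the same quantity (samples, seconds, milliseconds), and the changelog notes a fix to the `delay_ms` methods. As a hedged sketch of the unit relationship those assertions rely on (aubio itself stores the delay in samples; these helper names are hypothetical, not part of the library):

```python
samplerate = 44100  # same samplerate as aubio_tempo_params above

def delay_to_s(delay_samples, samplerate):
    # seconds = samples / samplerate
    return delay_samples / float(samplerate)

def delay_to_ms(delay_samples, samplerate):
    # milliseconds = seconds * 1000
    return 1000. * delay_to_s(delay_samples, samplerate)

# a delay of 2205 samples at 44100 Hz is .05 s, i.e. 50 ms,
# matching the values used by test_set_delay_s and test_set_delay_ms
delay = 2205
print(delay_to_s(delay, samplerate))   # 0.05
print(delay_to_ms(delay, samplerate))  # 50.0
```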
+ + def test_get_last_ms(self): + self.assertEqual(self.o.get_last_ms(), 0.) + + def test_get_period(self): + self.assertEqual(self.o.get_period(), 0.) + + def test_get_period_s(self): + self.assertEqual(self.o.get_period_s(), 0.) + + def test_get_last_tatum(self): + self.assertEqual(self.o.get_last_tatum(), 0.) + + def test_set_tatum_signature(self): + self.o.set_tatum_signature(8) + self.o.set_tatum_signature(64) + self.o.set_tatum_signature(1) + + def test_set_wrong_tatum_signature(self): + with self.assertRaises(ValueError): + self.o.set_tatum_signature(101) + with self.assertRaises(ValueError): + self.o.set_tatum_signature(0) + +if __name__ == '__main__': + main() diff --git a/python/tests/test_zero_crossing_rate.py b/python/tests/test_zero_crossing_rate.py index 7f6d479..21f9e40 100755 --- a/python/tests/test_zero_crossing_rate.py +++ b/python/tests/test_zero_crossing_rate.py @@ -1,6 +1,5 @@ #! /usr/bin/env python -from unittest import main from numpy.testing import TestCase from aubio import fvec, zero_crossing_rate @@ -43,4 +42,5 @@ class zero_crossing_rate_test_case(TestCase): self.assertEqual(2./buf_size, zero_crossing_rate(self.vector)) if __name__ == '__main__': + from unittest import main main() diff --git a/python/tests/utils.py b/python/tests/utils.py index b0963fc..7606404 100644 --- a/python/tests/utils.py +++ b/python/tests/utils.py @@ -1,18 +1,20 @@ #! 
/usr/bin/env python import os +import re import glob +import struct import numpy as np from tempfile import mkstemp DEFAULT_SOUND = '22050Hz_5s_brownnoise.wav' +def is32bit(): + return struct.calcsize("P") * 8 == 32 + def array_from_text_file(filename, dtype = 'float'): - filename = os.path.join(os.path.dirname(__file__), filename) - with open(filename) as f: - lines = f.readlines() - return np.array([line.split() for line in lines], - dtype = dtype) + realpathname = os.path.join(os.path.dirname(__file__), filename) + return np.loadtxt(realpathname, dtype = dtype) def list_all_sounds(rel_dir): datadir = os.path.join(os.path.dirname(__file__), rel_dir) @@ -38,13 +40,10 @@ def del_tmp_sink_path(path): try: os.unlink(path) except WindowsError as e: - print("deleting {:s} failed ({:s}), reopening".format(path, repr(e))) - with open(path, 'wb') as f: - f.close() - try: - os.unlink(path) - except WindowsError as f: - print("deleting {:s} failed ({:s}), aborting".format(path, repr(e))) + # removing the temporary directory sometimes fails on windows + import warnings + errmsg = "failed deleting temporary file {:s} ({:s})" + warnings.warn(UserWarning(errmsg.format(path, repr(e)))) def array_from_yaml_file(filename): import yaml @@ -83,3 +82,16 @@ def count_files_in_directory(samples_dir): if file_path: total_files += 1 return total_files + +def parse_file_samplerate(soundfile): + samplerate = None + # parse samplerate + re_sr = re.compile(r'/([0-9]{4,})Hz_.*') + match_samplerate = re_sr.findall(soundfile) + if match_samplerate: + samplerate = int(match_samplerate[0]) + else: + import warnings + warnings.warn(UserWarning("could not parse samplerate for {:s}" + .format(soundfile))) + return samplerate diff --git a/requirements.txt b/requirements.txt index 99ce0ab..1cc18fc 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,2 +1,2 @@ numpy -nose2 +pytest diff --git a/scripts/build_android b/scripts/build_android new file mode 100755 index 0000000..e1e790c --- /dev/null 
+++ b/scripts/build_android @@ -0,0 +1,41 @@ +#! /bin/bash + +set -e +set -x + +# location of android NDK +NDK_PATH=$PWD/../contrib/android-ndk-r12 + +WAFOPTS="--disable-avcodec --disable-samplerate --disable-jack --disable-sndfile" + +# set these variables to change the default values +[ -z $PLATFORM ] && PLATFORM=android-19 +[ -z $ARCH ] && ARCH=arm + +# location nof the standalone toolchains, created with +# $NDK_PATH/build/tools/make-standalone-toolchains.sh +NDK_TOOLCHAINS=$PWD/contrib + +# location of the current toolchain +CURRENT_TOOLCHAIN=$NDK_TOOLCHAINS/toolchain-$PLATFORM-$ARCH + +# if it does not exist, create the toolchain +[ -d $CURRENT_TOOLCHAIN ] || \ + $NDK_PATH/build/tools/make-standalone-toolchain.sh \ + --platform=$PLATFORM --arch=$ARCH \ + --install-dir=$CURRENT_TOOLCHAIN + +# aubio install destination directory +DESTDIR=$PWD/dist-$PLATFORM-$ARCH + +# wipe it out if it exists +[ -d $DESTDIR ] && rm -rf $DESTDIR + +# get the link to gcc +CC=`ls $CURRENT_TOOLCHAIN/*-linux-android*/bin/gcc` + +CFLAGS="-Os" \ + CC=$CC \ + ./waf distclean configure build install --destdir=$DESTDIR \ + --verbose \ + --with-target-platform=android $WAFOPTS diff --git a/scripts/build_apple_frameworks b/scripts/build_apple_frameworks index 08df8f6..9cb6661 100755 --- a/scripts/build_apple_frameworks +++ b/scripts/build_apple_frameworks @@ -1,5 +1,8 @@ #! /bin/sh +# cd to aubio directory for consistency +cd `dirname $0`/.. 
+ AUBIO_TMPDIR=`mktemp -d /var/tmp/aubio-build-XXXX` PACKAGE=aubio source VERSION @@ -10,7 +13,7 @@ mkdir -p "$OUTPUTDIR" # add git abbreviated commit hash #VERSION+=+$(git log --pretty=format:"%h" -1) -CFLAGS="-Werror -Ofast" +CFLAGS="-Werror -Os" WAFCONF="--disable-sndfile --disable-avcodec --disable-samplerate --enable-fat" # --disable-memcpy --disable-accelerate" export VERSION diff --git a/scripts/build_emscripten b/scripts/build_emscripten index 9d4fc54..dd24cbc 100755 --- a/scripts/build_emscripten +++ b/scripts/build_emscripten @@ -1,4 +1,4 @@ -#! /bin/sh +#! /bin/bash function checkprog() { type $1 >/dev/null 2>&1 || { echo >&2 "$1 required but not found, aborting."; exit 1; } @@ -9,13 +9,10 @@ checkprog emconfigure checkprog emmake # clean -emmake ./waf distclean +./waf distclean # configure -emconfigure ./waf configure --prefix=$EMSCRIPTEN/system/local/ --with-target-platform emscripten +emconfigure ./waf configure --with-target-platform emscripten $* # build -emmake ./waf --testcmd="node %s" - -# intall -#emmake ./waf install +emmake ./waf build diff --git a/scripts/build_mingw b/scripts/build_mingw index 8c02894..b0a0d5a 100755 --- a/scripts/build_mingw +++ b/scripts/build_mingw @@ -1,28 +1,119 @@ #! /bin/bash -# This script cross compiles aubio for windows using mingw, both for 32 and 64 -# bits. Built binaries will be placed in ./dist-win32 and ./dist-win64. 
- +# This script cross compiles aubio for windows using mingw, four times: +# +# - 32 and 64 bits with no external dependencies +# - 32 and 64 bits against ffmpeg +# # On debian or ubuntu, you will want to 'apt-get install gcc-mingw-w64' set -e set -x -WAFOPTS="-v --disable-avcodec --disable-samplerate --disable-jack --disable-sndfile" +python this_version.py -v +VERSION=`python $PWD/this_version.py -v` + +FFMPEG_BUILDS_URL="https://ffmpeg.zeranoe.com/builds" +FFMPEG_DEFAULT="3.3.3" + +# define some default CFLAGS +DEF_CFLAGS="-Os -I/usr/share/mingw-w64" +DEF_LDFLAGS="" + +WAFOPTS="" +# disable external deps to make sure we don't try to use the host package +WAFOPTS+=" --disable-samplerate --disable-jack --disable-sndfile" +# enable ffmpeg build +WAFOPTS+=" --disable-avcodec" +# install without a prefix +WAFOPTS+=" --prefix=/" +# compile the tests, but fake running them +# passing this option WAFOPTS fails (escaping?), added in actual waf call below +#WAFOPTS+=" --testcmd='echo %s'" + +# debugging +#WAFOPTS+=" -v" +#WAFOPTS+=" -j1" +#WAFOPTS+=" --notests" + +function fetch_ffpmeg() { + ## manually add ffmpeg (no pkg-config .pc files in bins) + [ -n "$FFMPEG_VERSION" ] || FFMPEG_VERSION=$FFMPEG_DEFAULT + FFMPEG_TARBALL="$PWD/ffmpeg-$FFMPEG_VERSION-$TARGET-dev.zip" + FFMPEG_BINARIES="${FFMPEG_TARBALL%%.zip}" + if [ ! -d "$FFMPEG_BINARIES" ] + then + if [ ! 
-f "$FFMPEG_TARBALL" ] + then + curl -O $FFMPEG_BUILDS_URL/$TARGET/dev/ffmpeg-$FFMPEG_VERSION-$TARGET-dev.zip + else + echo "using $FFMPEG_TARBALL" + fi + unzip -x $FFMPEG_TARBALL + else + echo "using $FFMPEG_BINARIES" + fi +} + +function get_cflags() { + CFLAGS="$DEF_CFLAGS" + LDFLAGS="$DEF_LDFLAGS" + if [ -n "$WITH_FFMEG" ] + then + fetch_ffpmeg + CFLAGS+=" -DHAVE_LIBAV=1 -DHAVE_SWRESAMPLE=1" + CFLAGS+=" -I$FFMPEG_BINARIES/include" + LDFLAGS+=" -lavcodec -lavformat -lavutil -lswresample" + LDFLAGS+=" -L$FFMPEG_BINARIES/lib" + fi +} + +function build_mingw() { + DESTDIR="$PWD/aubio-$VERSION-$TARGET" + [ -n "$WITH_FFMEG" ] && DESTDIR+="-ffmpeg" + [ -f $DESTDIR.zip ] && echo "Remove existing $DESTDIR.zip first" && exit 1 + [ -d $DESTDIR ] && rm -rf $DESTDIR + WAFOPTS_TGT="$WAFOPTS --destdir=$DESTDIR" + WAFOPTS_TGT+=" --with-target-platform=$TARGET" + get_cflags + CFLAGS="$CFLAGS" LDFLAGS="$LDFLAGS" \ + ./waf distclean configure build install $WAFOPTS_TGT --testcmd='echo %s' + # fix dll location (see https://github.com/waf-project/waf/issues/1860) + mv $DESTDIR/lib/libaubio-5.dll $DESTDIR/bin + mv $DESTDIR/lib/libaubio-5.def $DESTDIR/bin + zip -r $DESTDIR.zip `basename $DESTDIR` + rm -rf $DESTDIR + sha256sum $DESTDIR.zip > $DESTDIR.zip.sha256 +} + +function build_mingw32() { + TARGET=win32 + export CC=i686-w64-mingw32-gcc + export NM=i686-w64-mingw32-nm + build_mingw +} + +function build_mingw64() { + TARGET=win64 + export CC=x86_64-w64-mingw32-gcc + export NM=x86_64-w64-mingw32-nm + build_mingw +} + +# fetch waf if needed +[ -f "waf" ] || make getwaf -[ -d dist-win32 ] && rm -rf dist-win32 -[ -d dist-win64 ] && rm -rf dist-win64 +# first build without ffmpeg +build_mingw32 +build_mingw64 -CFLAGS="-Os" \ - LDFLAGS="" \ - CC=x86_64-w64-mingw32-gcc \ - ./waf distclean configure build install --destdir=$PWD/dist-win64 \ - --testcmd="echo %s" \ - $WAFOPTS --with-target-platform=win64 +# then build against ffmpeg +WITH_FFMEG=1 +build_mingw32 +build_mingw64 -CFLAGS="-Os" \ 
- LDFLAGS="" \ - CC=i686-w64-mingw32-gcc \ - ./waf distclean configure build install --destdir=$PWD/dist-win32 \ - --testcmd="echo %s" \ - $WAFOPTS --with-target-platform=win32 +set +x +echo "" +echo "All done! The following files were generated:" +echo "" +ls -lart aubio*.zip* diff --git a/scripts/get_waf.sh b/scripts/get_waf.sh index 5b41d8d..c7d9217 100755 --- a/scripts/get_waf.sh +++ b/scripts/get_waf.sh @@ -1,10 +1,54 @@ -#! /bin/sh +#! /bin/bash set -e -set -x +#set -x -WAFURL=https://waf.io/waf-1.8.22 +WAFVERSION=2.0.14 +WAFTARBALL=waf-$WAFVERSION.tar.bz2 +WAFURL=https://waf.io/$WAFTARBALL +WAFUPSTREAMKEY=https://gitlab.com/ita1024/waf/raw/master/utils/pubkey.asc -( which wget > /dev/null && wget -qO waf $WAFURL ) || ( which curl > /dev/null && curl $WAFURL > waf ) +WAFBUILDDIR=`mktemp -d` +function cleanup () { + rm -rf $WAFBUILDDIR +} + +trap cleanup SIGINT SIGTERM + +function download () { + ( [[ -n `which wget` ]] && wget -qO $1 $2 ) || ( [[ -n `which curl` ]] && curl -so $1 $2 ) +} + +function checkwaf () { + download $WAFTARBALL.asc $WAFURL.asc + if [[ -z `which gpg` ]] + then + echo "Warning: gpg not found, not verifying signature for $WAFTARBALL" + else + download - $WAFUPSTREAMKEY | gpg --import + gpg --verify $WAFTARBALL.asc || exit 1 + fi +} + +function fetchwaf () { + download $WAFTARBALL $WAFURL + checkwaf +} + +function buildwaf () { + tar xf $WAFTARBALL + pushd waf-$WAFVERSION + NOCLIMB=1 python waf-light --tools=c_emscripten $* + popd +} + +pushd $WAFBUILDDIR +fetchwaf +buildwaf +popd + +cp -prv $WAFBUILDDIR/waf-$WAFVERSION/waf $PWD chmod +x waf + +cleanup diff --git a/scripts/setenv_local.sh b/scripts/setenv_local.sh index ac9a5cf..93fd9de 100644 --- a/scripts/setenv_local.sh +++ b/scripts/setenv_local.sh @@ -1,24 +1,15 @@ #! 
/usr/bin/env bash -# This script sets the environment to execute aubio binaries and python code -# directly from build/ python/build/ without installing libaubio on the system - -# Usage: $ source ./scripts/setenv_local.sh - -# WARNING: this script will *overwrite* existing (DY)LD_LIBRARY_PATH and -# PYTHONPATH variables. - -PYTHON_PLATFORM=`python -c "import pkg_resources, sys; print '%s-%s' % (pkg_resources.get_build_platform(), '.'.join(map(str, sys.version_info[0:2])))"` +# This script sets the LD_LIBRARY_PATH environment variable to ./build/src to +# execute aubio binaries without installing libaubio. +# +# Usage: $ source scripts/setenv_local.sh +# +# Note: on macOs, the variable is DYLD_LIBRARY_PATH AUBIODIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )/.." && pwd )" -AUBIOLIB=$AUBIODIR/build/src -AUBIOPYTHON=$AUBIODIR/build/lib.$PYTHON_PLATFORM -if [ "$(dirname $PWD)" == "scripts" ]; then - AUBIODIR=$(basename $PWD) -else - AUBIODIR=$(basename $PWD) -fi +AUBIOLIB=$AUBIODIR/build/src if [ "$(uname)" == "Darwin" ]; then export DYLD_LIBRARY_PATH=$AUBIOLIB @@ -27,6 +18,3 @@ else export LD_LIBRARY_PATH=$AUBIOLIB echo export LD_LIBRARY_PATH=$LD_LIBRARY_PATH fi - -export PYTHONPATH=$AUBIOPYTHON -echo export PYTHONPATH=$PYTHONPATH @@ -1,41 +1,35 @@ #! 
/usr/bin/env python -import sys, os.path, glob +import sys +import os.path +import glob from setuptools import setup, Extension -from python.lib.moresetuptools import * -# function to generate gen/*.{c,h} -from python.lib.gen_external import generate_external, header, output_path -# read from VERSION -for l in open('VERSION').readlines(): exec (l.strip()) +# add ./python/lib to current path +sys.path.append(os.path.join('python', 'lib')) # noqa +from moresetuptools import build_ext, CleanGenerated -if AUBIO_MAJOR_VERSION is None or AUBIO_MINOR_VERSION is None \ - or AUBIO_PATCH_VERSION is None: - raise SystemError("Failed parsing VERSION file.") +# function to generate gen/*.{c,h} +from this_version import get_aubio_version, get_aubio_pyversion -__version__ = '.'.join(map(str, [AUBIO_MAJOR_VERSION, - AUBIO_MINOR_VERSION, - AUBIO_PATCH_VERSION])) -if AUBIO_VERSION_STATUS is not None: - if AUBIO_VERSION_STATUS.startswith('~'): - AUBIO_VERSION_STATUS = AUBIO_VERSION_STATUS[1:] - __version__ += AUBIO_VERSION_STATUS +__version__ = get_aubio_pyversion() +__aubio_version__ = get_aubio_version() include_dirs = [] library_dirs = [] -define_macros = [] +define_macros = [('AUBIO_VERSION', '%s' % __aubio_version__)] extra_link_args = [] -include_dirs += [ 'python/ext' ] -include_dirs += [ output_path ] # aubio-generated.h +include_dirs += ['python/ext'] try: import numpy - include_dirs += [ numpy.get_include() ] + include_dirs += [numpy.get_include()] except ImportError: pass if sys.platform.startswith('darwin'): - extra_link_args += ['-framework','CoreFoundation', '-framework','AudioToolbox'] + extra_link_args += ['-framework', 'CoreFoundation', + '-framework', 'AudioToolbox'] sources = sorted(glob.glob(os.path.join('python', 'ext', '*.c'))) @@ -46,21 +40,11 @@ aubio_extension = Extension("aubio._aubio", extra_link_args = extra_link_args, define_macros = define_macros) -if os.path.isfile('src/aubio.h'): - # if aubio headers are found in this directory - 
add_local_aubio_header(aubio_extension) - # was waf used to build the shared lib? - if os.path.isdir(os.path.join('build','src')): - # link against build/src/libaubio, built with waf - add_local_aubio_lib(aubio_extension) - else: - # add libaubio sources and look for optional deps with pkg-config - add_local_aubio_sources(aubio_extension) - __version__ += '_libaubio' -else: - # look for aubio headers and lib using pkg-config - add_system_aubio(aubio_extension) - +# TODO: find a way to track if package is built against libaubio +# if os.path.isfile('src/aubio.h'): +# if not os.path.isdir(os.path.join('build','src')): +# pass +# #__version__ += 'a2' # python only version classifiers = [ 'Development Status :: 4 - Beta', @@ -74,38 +58,41 @@ classifiers = [ 'Operating System :: Microsoft :: Windows', 'Programming Language :: C', 'Programming Language :: Python', - 'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)', + 'License :: OSI Approved :: ' + 'GNU General Public License v3 or later (GPLv3+)', ] -from distutils.command.build_ext import build_ext as _build_ext -class build_ext(_build_ext): - - def build_extension(self, extension): - # generate files python/gen/*.c, python/gen/aubio-generated.h - extension.sources += generate_external(header, output_path, overwrite = False) - return _build_ext.build_extension(self, extension) +thisdir = os.path.abspath(os.path.dirname(__file__)) +py_readme_file = os.path.join(thisdir, 'python', 'README.md') +with open(py_readme_file, 'r') as fp: + long_description = ''.join(fp.readlines()[3:]) distrib = setup(name='aubio', version = __version__, packages = ['aubio'], - package_dir = {'aubio':'python/lib/aubio'}, - scripts = ['python/scripts/aubiocut'], + package_dir = {'aubio': 'python/lib/aubio'}, ext_modules = [aubio_extension], - description = 'interface to the aubio library', - long_description = 'interface to the aubio library', + description = 'a collection of tools for music analysis', + 
long_description = long_description, + long_description_content_type = 'text/markdown', license = 'GNU/GPL version 3', author = 'Paul Brossier', author_email = 'piem@aubio.org', maintainer = 'Paul Brossier', maintainer_email = 'piem@aubio.org', - url = 'http://aubio.org/', + url = 'https://aubio.org/', platforms = 'any', classifiers = classifiers, install_requires = ['numpy'], + setup_requires = ['numpy'], cmdclass = { 'clean': CleanGenerated, - 'generate': GenerateCommand, 'build_ext': build_ext, }, - test_suite = 'nose2.collector.collector', + entry_points = { + 'console_scripts': [ + 'aubio = aubio.cmd:main', + 'aubiocut = aubio.cut:main', + ], + }, ) diff --git a/src/aubio.h b/src/aubio.h index 0d2b416..73deb24 100644 --- a/src/aubio.h +++ b/src/aubio.h @@ -111,6 +111,15 @@ Several examples of C programs are available in the \p examples/ and \p tests/src directories of the source tree. + Some examples: + - @ref spectral/test-fft.c + - @ref spectral/test-phasevoc.c + - @ref onset/test-onset.c + - @ref pitch/test-pitch.c + - @ref tempo/test-tempo.c + - @ref test-fvec.c + - @ref test-cvec.c + \subsection unstable_api Unstable API Several more functions are available and used within aubio, but not @@ -130,7 +139,7 @@ \section download Download Latest versions, further documentation, examples, wiki, and mailing lists can - be found at http://aubio.org . + be found at https://aubio.org . 
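The setup.py hunk above replaces the old `scripts=` list with `console_scripts` entry points, so `aubio` and `aubiocut` are generated as wrappers around `aubio.cmd:main` and `aubio.cut:main`. As a hedged illustration of what each spec string encodes (the `parse_entry_point` helper below is hypothetical, for explanation only — it is not part of setuptools or aubio):

```python
def parse_entry_point(spec):
    # split a spec like 'aubiocut = aubio.cut:main' into
    # (command name, module path, callable name)
    name, _, target = spec.partition('=')
    module, _, func = target.strip().partition(':')
    return name.strip(), module, func

for spec in ('aubio = aubio.cmd:main', 'aubiocut = aubio.cut:main'):
    print(parse_entry_point(spec))
```

At install time, setuptools generates one executable per spec that imports the module and calls the named function.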
*/ @@ -173,11 +182,13 @@ extern "C" #include "temporal/a_weighting.h" #include "temporal/c_weighting.h" #include "spectral/fft.h" +#include "spectral/dct.h" #include "spectral/phasevoc.h" #include "spectral/filterbank.h" #include "spectral/filterbank_mel.h" #include "spectral/mfcc.h" #include "spectral/specdesc.h" +#include "spectral/awhitening.h" #include "spectral/tss.h" #include "pitch/pitch.h" #include "onset/onset.h" @@ -188,6 +199,7 @@ extern "C" #include "synth/sampler.h" #include "synth/wavetable.h" #include "utils/parameter.h" +#include "utils/log.h" #if AUBIO_UNSTABLE #include "mathutils.h" @@ -203,6 +215,7 @@ extern "C" #include "pitch/pitchmcomb.h" #include "pitch/pitchyin.h" #include "pitch/pitchyinfft.h" +#include "pitch/pitchyinfast.h" #include "pitch/pitchschmitt.h" #include "pitch/pitchfcomb.h" #include "pitch/pitchspecacf.h" diff --git a/src/aubio_priv.h b/src/aubio_priv.h index 530edc9..5795f1e 100644 --- a/src/aubio_priv.h +++ b/src/aubio_priv.h @@ -33,7 +33,9 @@ * */ +#ifdef HAVE_CONFIG_H #include "config.h" +#endif #ifdef HAVE_STDLIB_H #include <stdlib.h> @@ -60,25 +62,43 @@ #include <string.h> #endif +#ifdef HAVE_ERRNO_H +#include <errno.h> +#endif + #ifdef HAVE_LIMITS_H #include <limits.h> // for CHAR_BIT, in C99 standard #endif -#ifdef HAVE_ACCELERATE -#define HAVE_ATLAS 1 -#include <Accelerate/Accelerate.h> -#elif defined(HAVE_ATLAS_CBLAS_H) +#ifdef HAVE_STDARG_H +#include <stdarg.h> +#endif + +#if defined(HAVE_BLAS) // --enable-blas=true +// check which cblas header we found +#if defined(HAVE_ATLAS_CBLAS_H) #define HAVE_ATLAS 1 #include <atlas/cblas.h> -#else -#undef HAVE_ATLAS +#elif defined(HAVE_OPENBLAS_CBLAS_H) +#include <openblas/cblas.h> +#elif defined(HAVE_CBLAS_H) +#include <cblas.h> +#elif !defined(HAVE_ACCELERATE) +#error "HAVE_BLAS was defined, but no blas header was found" +#endif /* end of cblas includes */ #endif -#ifdef HAVE_ACCELERATE +#if defined(HAVE_ACCELERATE) +// include accelerate framework after blas +#define 
HAVE_ATLAS 1 +#define HAVE_BLAS 1 #include <Accelerate/Accelerate.h> + #ifndef HAVE_AUBIO_DOUBLE #define aubio_vDSP_mmov vDSP_mmov #define aubio_vDSP_vmul vDSP_vmul +#define aubio_vDSP_vsmul vDSP_vsmul +#define aubio_vDSP_vsadd vDSP_vsadd #define aubio_vDSP_vfill vDSP_vfill #define aubio_vDSP_meanv vDSP_meanv #define aubio_vDSP_sve vDSP_sve @@ -87,9 +107,12 @@ #define aubio_vDSP_minv vDSP_minv #define aubio_vDSP_minvi vDSP_minvi #define aubio_vDSP_dotpr vDSP_dotpr +#define aubio_vDSP_vclr vDSP_vclr #else /* HAVE_AUBIO_DOUBLE */ #define aubio_vDSP_mmov vDSP_mmovD #define aubio_vDSP_vmul vDSP_vmulD +#define aubio_vDSP_vsmul vDSP_vsmulD +#define aubio_vDSP_vsadd vDSP_vsaddD #define aubio_vDSP_vfill vDSP_vfillD #define aubio_vDSP_meanv vDSP_meanvD #define aubio_vDSP_sve vDSP_sveD @@ -98,27 +121,61 @@ #define aubio_vDSP_minv vDSP_minvD #define aubio_vDSP_minvi vDSP_minviD #define aubio_vDSP_dotpr vDSP_dotprD +#define aubio_vDSP_vclr vDSP_vclrD #endif /* HAVE_AUBIO_DOUBLE */ #endif /* HAVE_ACCELERATE */ -#ifdef HAVE_ATLAS +#if defined(HAVE_BLAS) #ifndef HAVE_AUBIO_DOUBLE +#ifdef HAVE_ATLAS #define aubio_catlas_set catlas_sset +#endif /* HAVE_ATLAS */ #define aubio_cblas_copy cblas_scopy #define aubio_cblas_swap cblas_sswap #define aubio_cblas_dot cblas_sdot #else /* HAVE_AUBIO_DOUBLE */ +#ifdef HAVE_ATLAS #define aubio_catlas_set catlas_dset +#endif /* HAVE_ATLAS */ #define aubio_cblas_copy cblas_dcopy #define aubio_cblas_swap cblas_dswap #define aubio_cblas_dot cblas_ddot #endif /* HAVE_AUBIO_DOUBLE */ -#endif /* HAVE_ATLAS */ +#endif /* HAVE_BLAS */ -#if !defined(HAVE_MEMCPY_HACKS) && !defined(HAVE_ACCELERATE) && !defined(HAVE_ATLAS) +#if defined HAVE_INTEL_IPP +#include <ippcore.h> +#include <ippvm.h> +#include <ipps.h> +#ifndef HAVE_AUBIO_DOUBLE +#define aubio_ippsSet ippsSet_32f +#define aubio_ippsZero ippsZero_32f +#define aubio_ippsCopy ippsCopy_32f +#define aubio_ippsMul ippsMul_32f +#define aubio_ippsMulC ippsMulC_32f +#define aubio_ippsAddC ippsAddC_32f 
+#define aubio_ippsLn ippsLn_32f_A21 +#define aubio_ippsMean(a,b,c) ippsMean_32f(a, b, c, ippAlgHintFast) +#define aubio_ippsSum(a,b,c) ippsSum_32f(a, b, c, ippAlgHintFast) +#define aubio_ippsMax ippsMax_32f +#define aubio_ippsMin ippsMin_32f +#else /* HAVE_AUBIO_DOUBLE */ +#define aubio_ippsSet ippsSet_64f +#define aubio_ippsZero ippsZero_64f +#define aubio_ippsCopy ippsCopy_64f +#define aubio_ippsMul ippsMul_64f +#define aubio_ippsMulC ippsMulC_64f +#define aubio_ippsAddC ippsAddC_64f +#define aubio_ippsLn ippsLn_64f_A26 +#define aubio_ippsMean ippsMean_64f +#define aubio_ippsSum ippsSum_64f +#define aubio_ippsMax ippsMax_64f +#define aubio_ippsMin ippsMin_64f +#endif /* HAVE_AUBIO_DOUBLE */ +#endif + +#if !defined(HAVE_MEMCPY_HACKS) && !defined(HAVE_ACCELERATE) && !defined(HAVE_ATLAS) && !defined(HAVE_INTEL_IPP) #define HAVE_NOOPT 1 -#else -#undef HAVE_NOOPT #endif #include "types.h" @@ -168,16 +225,25 @@ typedef enum { AUBIO_FAIL = 1 } aubio_status; +/* Logging */ + +#include "utils/log.h" + +/** internal logging function, defined in utils/log.c */ +uint_t aubio_log(sint_t level, const char_t *fmt, ...); + #ifdef HAVE_C99_VARARGS_MACROS -#define AUBIO_ERR(...) fprintf(stderr, "AUBIO ERROR: " __VA_ARGS__) -#define AUBIO_MSG(...) fprintf(stdout, __VA_ARGS__) -#define AUBIO_DBG(...) fprintf(stderr, __VA_ARGS__) -#define AUBIO_WRN(...) fprintf(stderr, "AUBIO WARNING: " __VA_ARGS__) +#define AUBIO_ERR(...) aubio_log(AUBIO_LOG_ERR, "AUBIO ERROR: " __VA_ARGS__) +#define AUBIO_INF(...) aubio_log(AUBIO_LOG_INF, "AUBIO INFO: " __VA_ARGS__) +#define AUBIO_MSG(...) aubio_log(AUBIO_LOG_MSG, __VA_ARGS__) +#define AUBIO_DBG(...) aubio_log(AUBIO_LOG_DBG, __VA_ARGS__) +#define AUBIO_WRN(...) aubio_log(AUBIO_LOG_WRN, "AUBIO WARNING: " __VA_ARGS__) #else -#define AUBIO_ERR(format, args...) fprintf(stderr, "AUBIO ERROR: " format , ##args) -#define AUBIO_MSG(format, args...) fprintf(stdout, format , ##args) -#define AUBIO_DBG(format, args...) 
fprintf(stderr, format , ##args) -#define AUBIO_WRN(format, args...) fprintf(stderr, "AUBIO WARNING: " format, ##args) +#define AUBIO_ERR(format, args...) aubio_log(AUBIO_LOG_ERR, "AUBIO ERROR: " format , ##args) +#define AUBIO_INF(format, args...) aubio_log(AUBIO_LOG_INF, "AUBIO INFO: " format , ##args) +#define AUBIO_MSG(format, args...) aubio_log(AUBIO_LOG_MSG, format , ##args) +#define AUBIO_DBG(format, args...) aubio_log(AUBIO_LOG_DBG, format , ##args) +#define AUBIO_WRN(format, args...) aubio_log(AUBIO_LOG_WRN, "AUBIO WARNING: " format, ##args) #endif #define AUBIO_ERROR AUBIO_ERR @@ -185,6 +251,9 @@ typedef enum { #define AUBIO_QUIT(_s) exit(_s) #define AUBIO_SPRINTF sprintf +#define AUBIO_MAX_SAMPLERATE (192000*8) +#define AUBIO_MAX_CHANNELS 1024 + /* pi and 2*pi */ #ifndef M_PI #define PI (3.14159265358979323846) @@ -209,6 +278,7 @@ typedef enum { #define LOG logf #define FLOOR floorf #define CEIL ceilf +#define ATAN atanf #define ATAN2 atan2f #else #define EXP exp @@ -221,6 +291,7 @@ typedef enum { #define LOG log #define FLOOR floor #define CEIL ceil +#define ATAN atan #define ATAN2 atan2 #endif #define ROUND(x) FLOOR(x+.5) @@ -259,6 +330,24 @@ typedef enum { #define isnan _isnan #endif +#if !defined(_MSC_VER) +#define AUBIO_STRERROR(errno,buf,len) strerror_r(errno, buf, len) +#else +#define AUBIO_STRERROR(errno,buf,len) strerror_s(buf, len, errno) +#endif + +#ifdef HAVE_C99_VARARGS_MACROS +#define AUBIO_STRERR(...) \ + char errorstr[256]; \ + AUBIO_STRERROR(errno, errorstr, sizeof(errorstr)); \ + AUBIO_ERR(__VA_ARGS__) +#else +#define AUBIO_STRERR(format, args...) 
\ + char errorstr[256]; \ + AUBIO_STRERROR(errno, errorstr, sizeof(errorstr)); \ + AUBIO_ERR(format, ##args) +#endif + /* handy shortcuts */ #define DB2LIN(g) (POW(10.0,(g)*0.05f)) #define LIN2DB(v) (20.0*LOG10(v)) @@ -302,4 +391,11 @@ typedef enum { #endif #endif /* __STRICT_ANSI__ */ +#if defined(DEBUG) +#include <assert.h> +#define AUBIO_ASSERT(x) assert(x) +#else +#define AUBIO_ASSERT(x) +#endif /* DEBUG */ + #endif /* AUBIO_PRIV_H */ @@ -85,31 +85,40 @@ void cvec_copy(const cvec_t *s, cvec_t *t) { s->length, t->length); return; } -#ifdef HAVE_MEMCPY_HACKS +#if defined(HAVE_INTEL_IPP) + aubio_ippsCopy(s->phas, t->phas, (int)s->length); + aubio_ippsCopy(s->norm, t->norm, (int)s->length); +#elif defined(HAVE_MEMCPY_HACKS) memcpy(t->norm, s->norm, t->length * sizeof(smpl_t)); memcpy(t->phas, s->phas, t->length * sizeof(smpl_t)); -#else /* HAVE_MEMCPY_HACKS */ +#else uint_t j; for (j=0; j< t->length; j++) { t->norm[j] = s->norm[j]; t->phas[j] = s->phas[j]; } -#endif /* HAVE_MEMCPY_HACKS */ +#endif } -void cvec_norm_set_all (cvec_t *s, smpl_t val) { +void cvec_norm_set_all(cvec_t *s, smpl_t val) { +#if defined(HAVE_INTEL_IPP) + aubio_ippsSet(val, s->norm, (int)s->length); +#else uint_t j; for (j=0; j< s->length; j++) { s->norm[j] = val; } +#endif } void cvec_norm_zeros(cvec_t *s) { -#ifdef HAVE_MEMCPY_HACKS +#if defined(HAVE_INTEL_IPP) + aubio_ippsZero(s->norm, (int)s->length); +#elif defined(HAVE_MEMCPY_HACKS) memset(s->norm, 0, s->length * sizeof(smpl_t)); -#else /* HAVE_MEMCPY_HACKS */ +#else cvec_norm_set_all (s, 0.); -#endif /* HAVE_MEMCPY_HACKS */ +#endif } void cvec_norm_ones(cvec_t *s) { @@ -117,14 +126,20 @@ void cvec_norm_ones(cvec_t *s) { } void cvec_phas_set_all (cvec_t *s, smpl_t val) { +#if defined(HAVE_INTEL_IPP) + aubio_ippsSet(val, s->phas, (int)s->length); +#else uint_t j; for (j=0; j< s->length; j++) { s->phas[j] = val; } +#endif } void cvec_phas_zeros(cvec_t *s) { -#ifdef HAVE_MEMCPY_HACKS +#if defined(HAVE_INTEL_IPP) + aubio_ippsZero(s->phas, 
(int)s->length); +#elif defined(HAVE_MEMCPY_HACKS) memset(s->phas, 0, s->length * sizeof(smpl_t)); #else cvec_phas_set_all (s, 0.); @@ -139,3 +154,16 @@ void cvec_zeros(cvec_t *s) { cvec_norm_zeros(s); cvec_phas_zeros(s); } + +void cvec_logmag(cvec_t *s, smpl_t lambda) { +#if defined(HAVE_INTEL_IPP) + aubio_ippsMulC(s->norm, lambda, s->norm, (int)s->length); + aubio_ippsAddC(s->norm, 1.0, s->norm, (int)s->length); + aubio_ippsLn(s->norm, s->norm, (int)s->length); +#else + uint_t j; + for (j=0; j< s->length; j++) { + s->norm[j] = LOG(lambda * s->norm[j] + 1); + } +#endif +} @@ -230,6 +230,16 @@ void cvec_phas_ones(cvec_t *s); */ void cvec_zeros(cvec_t *s); +/** take logarithmic magnitude + + \param s input cvec to compress + \param lambda value to use for normalisation + + \f$ S_k = log( \lambda * S_k + 1 ) \f$ + +*/ +void cvec_logmag(cvec_t *s, smpl_t lambda); + #ifdef __cplusplus } #endif @@ -110,7 +110,7 @@ void fmat_ones(fmat_t *s) { void fmat_rev(fmat_t *s) { uint_t i,j; for (i=0; i< s->height; i++) { - for (j=0; j< FLOOR(s->length/2); j++) { + for (j=0; j< FLOOR((smpl_t)s->length/2); j++) { ELEM_SWAP(s->data[i][j], s->data[i][s->length-1-j]); } } @@ -160,7 +160,7 @@ void fmat_vecmul(const fmat_t *s, const fvec_t *scale, fvec_t *output) { assert(s->height == output->length); assert(s->length == scale->length); #endif -#if !defined(HAVE_ACCELERATE) && !defined(HAVE_ATLAS) +#if !defined(HAVE_ACCELERATE) && !defined(HAVE_BLAS) uint_t j; fvec_zeros(output); for (j = 0; j < s->length; j++) { @@ -169,7 +169,7 @@ void fmat_vecmul(const fmat_t *s, const fvec_t *scale, fvec_t *output) { * s->data[k][j]; } } -#elif defined(HAVE_ATLAS) +#elif defined(HAVE_BLAS) for (k = 0; k < s->height; k++) { output->data[k] = aubio_cblas_dot( s->length, scale->data, 1, s->data[k], 1); } @@ -60,27 +60,30 @@ void fvec_print(const fvec_t *s) { } void fvec_set_all (fvec_t *s, smpl_t val) { -#if !defined(HAVE_ACCELERATE) && !defined(HAVE_ATLAS) - uint_t j; - for (j=0; j< s->length; j++) { - 
s->data[j] = val; - } +#if defined(HAVE_INTEL_IPP) + aubio_ippsSet(val, s->data, (int)s->length); #elif defined(HAVE_ATLAS) aubio_catlas_set(s->length, val, s->data, 1); #elif defined(HAVE_ACCELERATE) aubio_vDSP_vfill(&val, s->data, 1, s->length); +#else + uint_t j; + for ( j = 0; j< s->length; j++ ) + { + s->data[j] = val; + } #endif } void fvec_zeros(fvec_t *s) { -#if !defined(HAVE_MEMCPY_HACKS) && !defined(HAVE_ACCELERATE) - fvec_set_all (s, 0.); -#else -#if defined(HAVE_MEMCPY_HACKS) +#if defined(HAVE_INTEL_IPP) + aubio_ippsZero(s->data, (int)s->length); +#elif defined(HAVE_ACCELERATE) + aubio_vDSP_vclr(s->data, 1, s->length); +#elif defined(HAVE_MEMCPY_HACKS) memset(s->data, 0, s->length * sizeof(smpl_t)); #else - aubio_vDSP_vclr(s->data, 1, s->length); -#endif + fvec_set_all(s, 0.); #endif } @@ -90,33 +93,37 @@ void fvec_ones(fvec_t *s) { void fvec_rev(fvec_t *s) { uint_t j; - for (j=0; j< FLOOR(s->length/2); j++) { + for (j=0; j< FLOOR((smpl_t)s->length/2); j++) { ELEM_SWAP(s->data[j], s->data[s->length-1-j]); } } void fvec_weight(fvec_t *s, const fvec_t *weight) { -#ifndef HAVE_ACCELERATE - uint_t j; uint_t length = MIN(s->length, weight->length); - for (j=0; j< length; j++) { +#if defined(HAVE_INTEL_IPP) + aubio_ippsMul(s->data, weight->data, s->data, (int)length); +#elif defined(HAVE_ACCELERATE) + aubio_vDSP_vmul( s->data, 1, weight->data, 1, s->data, 1, length ); +#else + uint_t j; + for (j = 0; j < length; j++) { s->data[j] *= weight->data[j]; } -#else - aubio_vDSP_vmul(s->data, 1, weight->data, 1, s->data, 1, s->length); #endif /* HAVE_ACCELERATE */ } void fvec_weighted_copy(const fvec_t *in, const fvec_t *weight, fvec_t *out) { -#ifndef HAVE_ACCELERATE + uint_t length = MIN(in->length, MIN(out->length, weight->length)); +#if defined(HAVE_INTEL_IPP) + aubio_ippsMul(in->data, weight->data, out->data, (int)length); +#elif defined(HAVE_ACCELERATE) + aubio_vDSP_vmul(in->data, 1, weight->data, 1, out->data, 1, length); +#else uint_t j; - uint_t length = 
MIN(out->length, weight->length); - for (j=0; j< length; j++) { + for (j = 0; j < length; j++) { out->data[j] = in->data[j] * weight->data[j]; } -#else - aubio_vDSP_vmul(in->data, 1, weight->data, 1, out->data, 1, out->length); -#endif /* HAVE_ACCELERATE */ +#endif } void fvec_copy(const fvec_t *s, fvec_t *t) { @@ -125,16 +132,18 @@ void fvec_copy(const fvec_t *s, fvec_t *t) { s->length, t->length); return; } -#ifdef HAVE_NOOPT - uint_t j; - for (j=0; j< t->length; j++) { - t->data[j] = s->data[j]; - } -#elif defined(HAVE_MEMCPY_HACKS) - memcpy(t->data, s->data, t->length * sizeof(smpl_t)); -#elif defined(HAVE_ATLAS) +#if defined(HAVE_INTEL_IPP) + aubio_ippsCopy(s->data, t->data, (int)s->length); +#elif defined(HAVE_BLAS) aubio_cblas_copy(s->length, s->data, 1, t->data, 1); #elif defined(HAVE_ACCELERATE) aubio_vDSP_mmov(s->data, t->data, 1, s->length, 1, 1); +#elif defined(HAVE_MEMCPY_HACKS) + memcpy(t->data, s->data, t->length * sizeof(smpl_t)); +#else + uint_t j; + for (j = 0; j < t->length; j++) { + t->data[j] = s->data[j]; + } #endif } diff --git a/src/io/audio_unit.c b/src/io/audio_unit.c index a21906a..2674388 100644 --- a/src/io/audio_unit.c +++ b/src/io/audio_unit.c @@ -18,9 +18,8 @@ */ -#include "config.h" -#ifdef HAVE_AUDIO_UNIT #include "aubio_priv.h" +#ifdef HAVE_AUDIO_UNIT #include "fvec.h" #include "fmat.h" diff --git a/src/io/ioutils.c b/src/io/ioutils.c new file mode 100644 index 0000000..6657352 --- /dev/null +++ b/src/io/ioutils.c @@ -0,0 +1,159 @@ +/* + Copyright (C) 2016 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. 
+
+  aubio is distributed in the hope that it will be useful,
+  but WITHOUT ANY WARRANTY; without even the implied warranty of
+  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+  GNU General Public License for more details.
+
+  You should have received a copy of the GNU General Public License
+  along with aubio. If not, see <http://www.gnu.org/licenses/>.
+
+*/
+
+#include "aubio_priv.h"
+#include "fmat.h"
+
+uint_t
+aubio_io_validate_samplerate(const char_t *kind, const char_t *path, uint_t samplerate)
+{
+  if ((sint_t)(samplerate) <= 0) {
+    AUBIO_ERR("%s: failed creating %s, samplerate should be positive, not %d\n",
+        kind, path, samplerate);
+    return AUBIO_FAIL;
+  }
+  if ((sint_t)samplerate > AUBIO_MAX_SAMPLERATE) {
+    AUBIO_ERR("%s: failed creating %s, samplerate %dHz is too large\n",
+        kind, path, samplerate);
+    return AUBIO_FAIL;
+  }
+  return AUBIO_OK;
+}
+
+uint_t
+aubio_io_validate_channels(const char_t *kind, const char_t *path, uint_t channels)
+{
+  if ((sint_t)(channels) <= 0) {
+    AUBIO_ERR("sink_%s: failed creating %s, channels should be positive, not %d\n",
+        kind, path, channels);
+    return AUBIO_FAIL;
+  }
+  if (channels > AUBIO_MAX_CHANNELS) {
+    AUBIO_ERR("sink_%s: failed creating %s, too many channels (%d but %d available)\n",
+        kind, path, channels, AUBIO_MAX_CHANNELS);
+    return AUBIO_FAIL;
+  }
+  return AUBIO_OK;
+}
+
+uint_t
+aubio_source_validate_input_length(const char_t *kind, const char_t *path,
+    uint_t hop_size, uint_t read_data_length)
+{
+  uint_t length = hop_size;
+  if (hop_size < read_data_length) {
+    AUBIO_WRN("%s: partial read from %s, trying to read %d frames, but"
+        " hop_size is %d\n", kind, path, read_data_length, hop_size);
+  } else if (hop_size > read_data_length) {
+    AUBIO_WRN("%s: partial read from %s, trying to read %d frames into"
+        " a buffer of length %d\n", kind, path, hop_size, read_data_length);
+    length = read_data_length;
+  }
+  return length;
+}
+
+uint_t
+aubio_source_validate_input_channels(const char_t *kind, const char_t *path,
+    uint_t source_channels, uint_t read_data_height)
+{
+  uint_t channels = source_channels;
+  if (read_data_height < source_channels) {
+    AUBIO_WRN("%s: partial read from %s, trying to read %d channels,"
+        " but found output of height %d\n", kind, path, source_channels,
+        read_data_height);
+    channels = read_data_height;
+  } else if (read_data_height > source_channels) {
+    // do not show a warning when trying to read into more channels than
+    // the input source.
+#if 0
+    AUBIO_WRN("%s: partial read from %s, trying to read %d channels,"
+        " but found output of height %d\n", kind, path, source_channels,
+        read_data_height);
+#endif
+    channels = source_channels;
+  }
+  return channels;
+}
+
+void
+aubio_source_pad_output (fvec_t *read_data, uint_t source_read)
+{
+  if (source_read < read_data->length) {
+    AUBIO_MEMSET(read_data->data + source_read, 0,
+        (read_data->length - source_read) * sizeof(smpl_t));
+  }
+}
+
+void
+aubio_source_pad_multi_output (fmat_t *read_data,
+    uint_t source_channels, uint_t source_read) {
+  uint_t i;
+  if (source_read < read_data->length) {
+    for (i = 0; i < read_data->height; i++) {
+      AUBIO_MEMSET(read_data->data[i] + source_read, 0,
+          (read_data->length - source_read) * sizeof(smpl_t));
+    }
+  }
+
+  // destination matrix has more channels than the file
+  // copy channels from the source to extra output channels
+  if (read_data->height > source_channels) {
+    for (i = source_channels; i < read_data->height; i++) {
+      AUBIO_MEMCPY(read_data->data[i], read_data->data[i % source_channels],
+          sizeof(smpl_t) * read_data->length);
+    }
+  }
+}
+
+uint_t
+aubio_sink_validate_input_length(const char_t *kind, const char_t *path,
+    uint_t max_size, uint_t write_data_length, uint_t write)
+{
+  uint_t can_write = write;
+
+  if (write > max_size) {
+    AUBIO_WRN("%s: partial write to %s, trying to write %d frames,"
+        " at most %d can be written at once\n", kind, path, write, max_size);
+    can_write = max_size;
+  }
+
+  if (can_write > write_data_length) {
+    AUBIO_WRN("%s: partial write to %s, trying to write %d frames,"
+        " but found input of length %d\n", kind, path, write,
+        write_data_length);
+    can_write = write_data_length;
+  }
+
+  return can_write;
+}
+
+uint_t
+aubio_sink_validate_input_channels(const char_t *kind, const char_t *path,
+    uint_t sink_channels, uint_t write_data_height)
+{
+  uint_t channels = sink_channels;
+  if (write_data_height < sink_channels) {
+    AUBIO_WRN("%s: partial write to %s, trying to write %d channels,"
+        " but found input of height %d\n", kind, path, sink_channels,
+        write_data_height);
+    channels = write_data_height;
+  }
+  return channels;
+}
diff --git a/src/io/ioutils.h b/src/io/ioutils.h
new file mode 100644
index 0000000..f3cdf92
--- /dev/null
+++ b/src/io/ioutils.h
@@ -0,0 +1,133 @@
+/*
+  Copyright (C) 2016 Paul Brossier <piem@aubio.org>
+
+  This file is part of aubio.
+
+  aubio is free software: you can redistribute it and/or modify
+  it under the terms of the GNU General Public License as published by
+  the Free Software Foundation, either version 3 of the License, or
+  (at your option) any later version.
+
+  aubio is distributed in the hope that it will be useful,
+  but WITHOUT ANY WARRANTY; without even the implied warranty of
+  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+  GNU General Public License for more details.
+
+  You should have received a copy of the GNU General Public License
+  along with aubio. If not, see <http://www.gnu.org/licenses/>.
+
+*/
+
+#ifndef AUBIO_IOUTILS_H
+#define AUBIO_IOUTILS_H
+
+/** \file
+
+  Simple utility functions to validate input parameters.
+
+*/
+
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+/** validate samplerate
+
+  \param kind the object kind to report on
+  \param path the object properties to report on
+  \param samplerate the object properties to report on
+  \return 0 if ok, non-zero if validation failed
+
+ */
+uint_t aubio_io_validate_samplerate(const char_t *kind, const char_t *path,
+    uint_t samplerate);
+
+/** validate number of channels
+
+  \param kind the object kind to report on
+  \param path the object properties to report on
+  \param channels the object properties to report on
+  \return 0 if ok, non-zero if validation failed
+
+ */
+uint_t aubio_io_validate_channels(const char_t *kind, const char_t *path,
+    uint_t channels);
+
+/** validate length of source output
+
+  \param kind the object kind to report on
+  \param path the path to report on
+  \param hop_size number of frames to be read
+  \param read_data_length actual length of input
+
+  \return hop_size or the maximum number of frames that can be written
+*/
+uint_t
+aubio_source_validate_input_length(const char_t *kind, const char_t *path,
+    uint_t hop_size, uint_t read_data_length);
+
+/** validate height of source output
+
+  \param kind the object kind to report on
+  \param path the path to report on
+  \param source_channels maximum number of channels that can be written
+  \param read_data_height actual height of input
+
+  \return write_data_height or the maximum number of channels
+*/
+uint_t
+aubio_source_validate_input_channels(const char_t *kind, const char_t *path,
+    uint_t source_channels, uint_t read_data_height);
+
+/** pad end of source output vector with zeroes
+
+  \param read_data output vector to pad
+  \param source_read number of frames read
+
+*/
+void
+aubio_source_pad_output (fvec_t *read_data, uint_t source_read);
+
+/** pad end of source output matrix with zeroes
+
+  \param read_data output matrix to pad
+  \param source_channels number of channels in the source
+  \param source_read number of frames read
+
+*/
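The ioutils helpers introduced in this diff all follow one clamp-and-warn pattern: compare the requested frame count against what the destination buffer holds, warn on any mismatch, return the smaller value, and zero-pad whatever could not be read. A minimal standalone sketch of that pattern, with illustrative names (`validate_length` and `pad_output` are not part of the aubio API, and this sketch keeps only the truncating branch of the warning logic):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Clamp a requested read of hop_size frames to the destination buffer
 * size, warning on mismatch -- the same shape as
 * aubio_source_validate_input_length. */
unsigned validate_length(unsigned hop_size, unsigned buf_len)
{
  if (hop_size > buf_len) {
    fprintf(stderr, "partial read: %u frames into a buffer of %u\n",
        hop_size, buf_len);
    return buf_len;
  }
  return hop_size;
}

/* Zero the unwritten tail after a short read, as
 * aubio_source_pad_output does with AUBIO_MEMSET. */
void pad_output(float *buf, unsigned buf_len, unsigned frames_read)
{
  if (frames_read < buf_len)
    memset(buf + frames_read, 0, (buf_len - frames_read) * sizeof(float));
}
```

The same clamp is reused on the sink side (`aubio_sink_validate_input_length`), where the bound is the sink's `max_size` rather than the length of the output vector.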
+void +aubio_source_pad_multi_output (fmat_t *read_data, uint_t source_channels, + uint_t source_read); + +/** validate length of sink input + + \param kind the object kind to report on + \param path the path to report on + \param max_size maximum number of frames that can be written + \param write_data_length actual length of input + \param write number of samples asked + + \return write or the maximum number of frames that can be written +*/ +uint_t +aubio_sink_validate_input_length(const char_t *kind, const char_t *path, + uint_t max_size, uint_t write_data_length, uint_t write); + +/** validate height of sink input + + \param kind the object kind to report on + \param path the path to report on + \param sink_channels maximum number of channels that can be written + \param write_data_height actual height of input matrix + + \return write_data_height or the maximum number of channels +*/ +uint_t +aubio_sink_validate_input_channels(const char_t *kind, const char_t *path, + uint_t sink_channels, uint_t write_data_height); + +#ifdef __cplusplus +} +#endif + +#endif /* AUBIO_IOUTILS_H */ diff --git a/src/io/sink.c b/src/io/sink.c index a13316a..364a2c3 100644 --- a/src/io/sink.c +++ b/src/io/sink.c @@ -18,7 +18,6 @@ */ -#include "config.h" #include "aubio_priv.h" #include "fvec.h" #include "fmat.h" @@ -98,9 +97,12 @@ aubio_sink_t * new_aubio_sink(const char_t * uri, uint_t samplerate) { return s; } #endif /* HAVE_WAVWRITE */ - AUBIO_ERROR("sink: failed creating %s with samplerate %dHz\n", - uri, samplerate); - AUBIO_FREE(s); +#if !defined(HAVE_WAVWRITE) && \ + !defined(HAVE_SNDFILE) && \ + !defined(HAVE_SINK_APPLE_AUDIO) + AUBIO_ERROR("sink: failed creating '%s' at %dHz (no sink built-in)\n", uri, samplerate); +#endif + del_aubio_sink(s); return NULL; } @@ -133,8 +135,8 @@ uint_t aubio_sink_close(aubio_sink_t *s) { } void del_aubio_sink(aubio_sink_t * s) { - if (!s) return; - s->s_del((void *)s->sink); + //AUBIO_ASSERT(s); + if (s && s->s_del && s->sink) + 
s->s_del((void *)s->sink); AUBIO_FREE(s); - return; } diff --git a/src/io/sink_apple_audio.c b/src/io/sink_apple_audio.c index b6fd358..c58a52c 100644 --- a/src/io/sink_apple_audio.c +++ b/src/io/sink_apple_audio.c @@ -18,23 +18,20 @@ */ -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_SINK_APPLE_AUDIO - -#include "aubio_priv.h" #include "fvec.h" #include "fmat.h" #include "io/sink_apple_audio.h" +#include "io/ioutils.h" // CFURLRef, CFURLCreateWithFileSystemPath, ... #include <CoreFoundation/CoreFoundation.h> // ExtAudioFileRef, AudioStreamBasicDescription, AudioBufferList, ... #include <AudioToolbox/AudioToolbox.h> -#define FLOAT_TO_SHORT(x) (short)(x * 32768) - -extern int createAubioBufferList(AudioBufferList *bufferList, int channels, int segmentSize); +extern int createAudioBufferList(AudioBufferList *bufferList, int channels, int segmentSize); extern void freeAudioBufferList(AudioBufferList *bufferList); extern CFURLRef createURLFromPath(const char * path); char_t *getPrintableOSStatusError(char_t *str, OSStatus error); @@ -62,21 +59,26 @@ aubio_sink_apple_audio_t * new_aubio_sink_apple_audio(const char_t * uri, uint_t s->max_frames = MAX_SIZE; s->async = false; - if (uri == NULL) { + if ( (uri == NULL) || (strnlen(uri, PATH_MAX) < 1) ) { AUBIO_ERROR("sink_apple_audio: Aborted opening null path\n"); goto beach; } - if (s->path != NULL) AUBIO_FREE(s->path); + s->path = AUBIO_ARRAY(char_t, strnlen(uri, PATH_MAX) + 1); strncpy(s->path, uri, strnlen(uri, PATH_MAX) + 1); s->samplerate = 0; s->channels = 0; - // negative samplerate given, abort - if ((sint_t)samplerate < 0) goto beach; // zero samplerate given. 
do not open yet - if ((sint_t)samplerate == 0) return s; + if ((sint_t)samplerate == 0) { + return s; + } + + // invalid samplerate given, abort + if (aubio_io_validate_samplerate("sink_apple_audio", s->path, samplerate)) { + goto beach; + } s->samplerate = samplerate; s->channels = 1; @@ -88,16 +90,18 @@ aubio_sink_apple_audio_t * new_aubio_sink_apple_audio(const char_t * uri, uint_t return s; beach: - AUBIO_FREE(s); + del_aubio_sink_apple_audio(s); return NULL; } uint_t aubio_sink_apple_audio_preset_samplerate(aubio_sink_apple_audio_t *s, uint_t samplerate) { - if ((sint_t)(samplerate) <= 0) return AUBIO_FAIL; + if (aubio_io_validate_samplerate("sink_apple_audio", s->path, samplerate)) { + return AUBIO_FAIL; + } s->samplerate = samplerate; // automatically open when both samplerate and channels have been set - if (s->samplerate != 0 && s->channels != 0) { + if (/* s->samplerate != 0 && */ s->channels != 0) { return aubio_sink_apple_audio_open(s); } return AUBIO_OK; @@ -105,10 +109,12 @@ uint_t aubio_sink_apple_audio_preset_samplerate(aubio_sink_apple_audio_t *s, uin uint_t aubio_sink_apple_audio_preset_channels(aubio_sink_apple_audio_t *s, uint_t channels) { - if ((sint_t)(channels) <= 0) return AUBIO_FAIL; + if (aubio_io_validate_channels("sink_apple_audio", s->path, channels)) { + return AUBIO_FAIL; + } s->channels = channels; // automatically open when both samplerate and channels have been set - if (s->samplerate != 0 && s->channels != 0) { + if (s->samplerate != 0 /* && s->channels != 0 */) { return aubio_sink_apple_audio_open(s); } return AUBIO_OK; @@ -143,6 +149,18 @@ uint_t aubio_sink_apple_audio_open(aubio_sink_apple_audio_t *s) { AudioFileTypeID fileType = kAudioFileWAVEType; CFURLRef fileURL = createURLFromPath(s->path); bool overwrite = true; + + // set the in-memory format + AudioStreamBasicDescription inputFormat; + memset(&inputFormat, 0, sizeof(AudioStreamBasicDescription)); + inputFormat.mFormatID = kAudioFormatLinearPCM; + 
inputFormat.mSampleRate = (Float64)(s->samplerate); + inputFormat.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked; + inputFormat.mChannelsPerFrame = s->channels; + inputFormat.mBitsPerChannel = sizeof(smpl_t) * 8; + inputFormat.mFramesPerPacket = 1; + inputFormat.mBytesPerFrame = inputFormat.mBitsPerChannel * inputFormat.mChannelsPerFrame / 8; + inputFormat.mBytesPerPacket = inputFormat.mFramesPerPacket * inputFormat.mBytesPerFrame; OSStatus err = noErr; err = ExtAudioFileCreateWithURL(fileURL, fileType, &clientFormat, NULL, overwrite ? kAudioFileFlags_EraseFile : 0, &s->audioFile); @@ -154,7 +172,18 @@ uint_t aubio_sink_apple_audio_open(aubio_sink_apple_audio_t *s) { getPrintableOSStatusError(errorstr, err)); goto beach; } - if (createAubioBufferList(&s->bufferList, s->channels, s->max_frames * s->channels)) { + + err = ExtAudioFileSetProperty(s->audioFile, + kExtAudioFileProperty_ClientDataFormat, + sizeof(AudioStreamBasicDescription), &inputFormat); + if (err) { + char_t errorstr[20]; + AUBIO_ERR("sink_apple_audio: error when trying to set output format on %s " + "(%s)\n", s->path, getPrintableOSStatusError(errorstr, err)); + goto beach; + } + + if (createAudioBufferList(&s->bufferList, s->channels, s->max_frames * s->channels)) { AUBIO_ERR("sink_apple_audio: error when creating buffer list for %s, " "out of memory? 
\n", s->path); goto beach; @@ -167,46 +196,42 @@ beach: void aubio_sink_apple_audio_do(aubio_sink_apple_audio_t * s, fvec_t * write_data, uint_t write) { UInt32 c, v; - short *data = (short*)s->bufferList.mBuffers[0].mData; - if (write > s->max_frames) { - AUBIO_WRN("sink_apple_audio: trying to write %d frames, max %d\n", write, s->max_frames); - write = s->max_frames; - } - smpl_t *buf = write_data->data; - - if (buf) { - for (c = 0; c < s->channels; c++) { - for (v = 0; v < write; v++) { - data[v * s->channels + c] = - FLOAT_TO_SHORT(buf[ v * s->channels + c]); - } - } + smpl_t *data = (smpl_t*)s->bufferList.mBuffers[0].mData; + uint_t length = aubio_sink_validate_input_length("sink_apple_audio", s->path, + s->max_frames, write_data->length, write); + + for (c = 0; c < s->channels; c++) { + for (v = 0; v < length; v++) { + data[v * s->channels + c] = write_data->data[v]; + } } - aubio_sink_apple_audio_write(s, write); + + aubio_sink_apple_audio_write(s, length); } void aubio_sink_apple_audio_do_multi(aubio_sink_apple_audio_t * s, fmat_t * write_data, uint_t write) { UInt32 c, v; - short *data = (short*)s->bufferList.mBuffers[0].mData; - if (write > s->max_frames) { - AUBIO_WRN("sink_apple_audio: trying to write %d frames, max %d\n", write, s->max_frames); - write = s->max_frames; - } - smpl_t **buf = write_data->data; - - if (buf) { - for (c = 0; c < s->channels; c++) { - for (v = 0; v < write; v++) { - data[v * s->channels + c] = - FLOAT_TO_SHORT(buf[c][v]); - } - } + smpl_t *data = (smpl_t*)s->bufferList.mBuffers[0].mData; + uint_t channels = aubio_sink_validate_input_channels("sink_apple_audio", + s->path, s->channels, write_data->height); + uint_t length = aubio_sink_validate_input_length("sink_apple_audio", s->path, + s->max_frames, write_data->length, write); + + for (c = 0; c < channels; c++) { + for (v = 0; v < length; v++) { + data[v * s->channels + c] = write_data->data[c][v]; + } } - aubio_sink_apple_audio_write(s, write); + + 
aubio_sink_apple_audio_write(s, length); } void aubio_sink_apple_audio_write(aubio_sink_apple_audio_t *s, uint_t write) { OSStatus err = noErr; + // set mDataByteSize to match the number of frames to be written + // see https://www.mail-archive.com/coreaudio-api@lists.apple.com/msg01109.html + s->bufferList.mBuffers[0].mDataByteSize = write * s->channels + * sizeof(smpl_t); if (s->async) { err = ExtAudioFileWriteAsync(s->audioFile, write, &s->bufferList); if (err) { @@ -250,11 +275,13 @@ uint_t aubio_sink_apple_audio_close(aubio_sink_apple_audio_t * s) { } void del_aubio_sink_apple_audio(aubio_sink_apple_audio_t * s) { - if (s->audioFile) aubio_sink_apple_audio_close (s); - if (s->path) AUBIO_FREE(s->path); + AUBIO_ASSERT(s); + if (s->audioFile) + aubio_sink_apple_audio_close (s); + if (s->path) + AUBIO_FREE(s->path); freeAudioBufferList(&s->bufferList); AUBIO_FREE(s); - return; } #endif /* HAVE_SINK_APPLE_AUDIO */ diff --git a/src/io/sink_sndfile.c b/src/io/sink_sndfile.c index 99392c3..35e2215 100644 --- a/src/io/sink_sndfile.c +++ b/src/io/sink_sndfile.c @@ -19,18 +19,17 @@ */ -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_SNDFILE #include <sndfile.h> -#include "aubio_priv.h" #include "fvec.h" #include "fmat.h" #include "io/sink_sndfile.h" +#include "io/ioutils.h" -#define MAX_CHANNELS 6 #define MAX_SIZE 4096 #if !HAVE_AUBIO_DOUBLE @@ -59,20 +58,23 @@ aubio_sink_sndfile_t * new_aubio_sink_sndfile(const char_t * path, uint_t sample if (path == NULL) { AUBIO_ERR("sink_sndfile: Aborted opening null path\n"); - return NULL; + goto beach; } - if (s->path) AUBIO_FREE(s->path); s->path = AUBIO_ARRAY(char_t, strnlen(path, PATH_MAX) + 1); strncpy(s->path, path, strnlen(path, PATH_MAX) + 1); s->samplerate = 0; s->channels = 0; - // negative samplerate given, abort - if ((sint_t)samplerate < 0) goto beach; // zero samplerate given. 
do not open yet - if ((sint_t)samplerate == 0) return s; + if ((sint_t)samplerate == 0) { + return s; + } + // invalid samplerate given, abort + if (aubio_io_validate_samplerate("sink_sndfile", s->path, samplerate)) { + goto beach; + } s->samplerate = samplerate; s->channels = 1; @@ -89,10 +91,12 @@ beach: uint_t aubio_sink_sndfile_preset_samplerate(aubio_sink_sndfile_t *s, uint_t samplerate) { - if ((sint_t)(samplerate) <= 0) return AUBIO_FAIL; + if (aubio_io_validate_samplerate("sink_sndfile", s->path, samplerate)) { + return AUBIO_FAIL; + } s->samplerate = samplerate; // automatically open when both samplerate and channels have been set - if (s->samplerate != 0 && s->channels != 0) { + if (/* s->samplerate != 0 && */ s->channels != 0) { return aubio_sink_sndfile_open(s); } return AUBIO_OK; @@ -100,10 +104,12 @@ uint_t aubio_sink_sndfile_preset_samplerate(aubio_sink_sndfile_t *s, uint_t samp uint_t aubio_sink_sndfile_preset_channels(aubio_sink_sndfile_t *s, uint_t channels) { - if ((sint_t)(channels) <= 0) return AUBIO_FAIL; + if (aubio_io_validate_channels("sink_sndfile", s->path, channels)) { + return AUBIO_FAIL; + } s->channels = channels; // automatically open when both samplerate and channels have been set - if (s->samplerate != 0 && s->channels != 0) { + if (s->samplerate != 0 /* && s->channels != 0 */) { return aubio_sink_sndfile_open(s); } return AUBIO_OK; @@ -132,15 +138,16 @@ uint_t aubio_sink_sndfile_open(aubio_sink_sndfile_t *s) { if (s->handle == NULL) { /* show libsndfile err msg */ - AUBIO_ERR("sink_sndfile: Failed opening %s. %s\n", s->path, sf_strerror (NULL)); + AUBIO_ERR("sink_sndfile: Failed opening \"%s\" with %d channels, %dHz: %s\n", + s->path, s->channels, s->samplerate, sf_strerror (NULL)); return AUBIO_FAIL; } s->scratch_size = s->max_size*s->channels; /* allocate data for de/interleaving reallocated when needed. 
*/ - if (s->scratch_size >= MAX_SIZE * MAX_CHANNELS) { - AUBIO_ERR("sink_sndfile: %d x %d exceeds maximum aubio_sink_sndfile buffer size %d\n", - s->max_size, s->channels, MAX_CHANNELS * MAX_CHANNELS); + if (s->scratch_size >= MAX_SIZE * AUBIO_MAX_CHANNELS) { + AUBIO_ERR("sink_sndfile: %d x %d exceeds maximum buffer size %d\n", + s->max_size, s->channels, MAX_SIZE * AUBIO_MAX_CHANNELS); return AUBIO_FAIL; } s->scratch_data = AUBIO_ARRAY(smpl_t,s->scratch_size); @@ -149,24 +156,17 @@ uint_t aubio_sink_sndfile_open(aubio_sink_sndfile_t *s) { } void aubio_sink_sndfile_do(aubio_sink_sndfile_t *s, fvec_t * write_data, uint_t write){ - uint_t i, j, channels = s->channels; - int nsamples = 0; - smpl_t *pwrite; + uint_t i, j; sf_count_t written_frames; - - if (write > s->max_size) { - AUBIO_WRN("sink_sndfile: trying to write %d frames, but only %d can be written at a time\n", - write, s->max_size); - write = s->max_size; - } - - nsamples = channels * write; + uint_t channels = s->channels; + uint_t length = aubio_sink_validate_input_length("sink_sndfile", s->path, + s->max_size, write_data->length, write); + int nsamples = channels * length; /* interleaving data */ for ( i = 0; i < channels; i++) { - pwrite = (smpl_t *)write_data->data; - for (j = 0; j < write; j++) { - s->scratch_data[channels*j+i] = pwrite[j]; + for (j = 0; j < length; j++) { + s->scratch_data[channels*j+i] = write_data->data[j]; } } @@ -179,24 +179,18 @@ void aubio_sink_sndfile_do(aubio_sink_sndfile_t *s, fvec_t * write_data, uint_t } void aubio_sink_sndfile_do_multi(aubio_sink_sndfile_t *s, fmat_t * write_data, uint_t write){ - uint_t i, j, channels = s->channels; - int nsamples = 0; - smpl_t *pwrite; + uint_t i, j; sf_count_t written_frames; - - if (write > s->max_size) { - AUBIO_WRN("sink_sndfile: trying to write %d frames, but only %d can be written at a time\n", - write, s->max_size); - write = s->max_size; - } - - nsamples = channels * write; + uint_t channels = 
aubio_sink_validate_input_channels("sink_sndfile", s->path, + s->channels, write_data->height); + uint_t length = aubio_sink_validate_input_length("sink_sndfile", s->path, + s->max_size, write_data->length, write); + int nsamples = channels * length; /* interleaving data */ - for ( i = 0; i < write_data->height; i++) { - pwrite = (smpl_t *)write_data->data[i]; - for (j = 0; j < write; j++) { - s->scratch_data[channels*j+i] = pwrite[j]; + for ( i = 0; i < channels; i++) { + for (j = 0; j < length; j++) { + s->scratch_data[channels*j+i] = write_data->data[i][j]; } } @@ -221,10 +215,13 @@ uint_t aubio_sink_sndfile_close (aubio_sink_sndfile_t *s) { } void del_aubio_sink_sndfile(aubio_sink_sndfile_t * s){ - if (!s) return; - if (s->path) AUBIO_FREE(s->path); - aubio_sink_sndfile_close(s); - AUBIO_FREE(s->scratch_data); + AUBIO_ASSERT(s); + if (s->handle) + aubio_sink_sndfile_close(s); + if (s->path) + AUBIO_FREE(s->path); + if (s->scratch_data) + AUBIO_FREE(s->scratch_data); AUBIO_FREE(s); } diff --git a/src/io/sink_wavwrite.c b/src/io/sink_wavwrite.c index 761f6e5..8ea05b7 100644 --- a/src/io/sink_wavwrite.c +++ b/src/io/sink_wavwrite.c @@ -19,18 +19,15 @@ */ -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_WAVWRITE -#include "aubio_priv.h" #include "fvec.h" #include "fmat.h" #include "io/sink_wavwrite.h" +#include "io/ioutils.h" -#include <errno.h> - -#define MAX_CHANNELS 6 #define MAX_SIZE 4096 #define FLOAT_TO_SHORT(x) (short)(x * 32768) @@ -68,8 +65,12 @@ struct _aubio_sink_wavwrite_t { unsigned short *scratch_data; }; -unsigned char *write_little_endian (unsigned int s, unsigned char *str, unsigned int length); -unsigned char *write_little_endian (unsigned int s, unsigned char *str, unsigned int length) { +static unsigned char *write_little_endian (unsigned int s, unsigned char *str, + unsigned int length); + +static unsigned char *write_little_endian (unsigned int s, unsigned char *str, + unsigned int length) +{ uint_t i; for (i = 0; i < length; i++) { 
str[i] = s >> (i * 8); @@ -84,12 +85,7 @@ aubio_sink_wavwrite_t * new_aubio_sink_wavwrite(const char_t * path, uint_t samp AUBIO_ERR("sink_wavwrite: Aborted opening null path\n"); goto beach; } - if ((sint_t)samplerate < 0) { - AUBIO_ERR("sink_wavwrite: Can not create %s with samplerate %d\n", path, samplerate); - goto beach; - } - if (s->path) AUBIO_FREE(s->path); s->path = AUBIO_ARRAY(char_t, strnlen(path, PATH_MAX) + 1); strncpy(s->path, path, strnlen(path, PATH_MAX) + 1); @@ -100,12 +96,14 @@ aubio_sink_wavwrite_t * new_aubio_sink_wavwrite(const char_t * path, uint_t samp s->samplerate = 0; s->channels = 0; - // negative samplerate given, abort - if ((sint_t)samplerate < 0) goto beach; // zero samplerate given. do not open yet - if ((sint_t)samplerate == 0) return s; - // samplerate way too large, fail - if ((sint_t)samplerate > 192000 * 4) goto beach; + if ((sint_t)samplerate == 0) { + return s; + } + // invalid samplerate given, abort + if (aubio_io_validate_samplerate("sink_wavwrite", s->path, samplerate)) { + goto beach; + } s->samplerate = samplerate; s->channels = 1; @@ -125,10 +123,12 @@ beach: uint_t aubio_sink_wavwrite_preset_samplerate(aubio_sink_wavwrite_t *s, uint_t samplerate) { - if ((sint_t)(samplerate) <= 0) return AUBIO_FAIL; + if (aubio_io_validate_samplerate("sink_wavwrite", s->path, samplerate)) { + return AUBIO_FAIL; + } s->samplerate = samplerate; // automatically open when both samplerate and channels have been set - if (s->samplerate != 0 && s->channels != 0) { + if (/* s->samplerate != 0 && */ s->channels != 0) { return aubio_sink_wavwrite_open(s); } return AUBIO_OK; @@ -136,10 +136,12 @@ uint_t aubio_sink_wavwrite_preset_samplerate(aubio_sink_wavwrite_t *s, uint_t sa uint_t aubio_sink_wavwrite_preset_channels(aubio_sink_wavwrite_t *s, uint_t channels) { - if ((sint_t)(channels) <= 0) return AUBIO_FAIL; + if (aubio_io_validate_channels("sink_wavwrite", s->path, channels)) { + return AUBIO_FAIL; + } s->channels = channels; // 
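`write_little_endian` (made `static` in the hunk above) serializes the low `length` bytes of an unsigned value least-significant byte first, as required for the RIFF/WAVE header fields. A self-contained sketch of the same routine:

```c
/* Store the low `length` bytes of `s` into `str`, least-significant first.
 * This is the byte order RIFF/WAVE header fields use on all hosts. */
static unsigned char *write_le(unsigned int s, unsigned char *str,
    unsigned int length)
{
  unsigned int i;
  for (i = 0; i < length; i++)
    str[i] = (unsigned char)(s >> (i * 8));
  return str;
}
```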
automatically open when both samplerate and channels have been set - if (s->samplerate != 0 && s->channels != 0) { + if (s->samplerate != 0 /* && s->channels != 0 */) { return aubio_sink_wavwrite_open(s); } return AUBIO_OK; @@ -158,60 +160,70 @@ uint_t aubio_sink_wavwrite_get_channels(const aubio_sink_wavwrite_t *s) uint_t aubio_sink_wavwrite_open(aubio_sink_wavwrite_t *s) { unsigned char buf[5]; uint_t byterate, blockalign; + size_t written = 0; /* open output file */ s->fid = fopen((const char *)s->path, "wb"); if (!s->fid) { - AUBIO_ERR("sink_wavwrite: could not open %s (%s)\n", s->path, strerror(errno)); + AUBIO_STRERR("sink_wavwrite: could not open %s (%s)\n", s->path, errorstr); goto beach; } // ChunkID - fwrite("RIFF", 4, 1, s->fid); + written += fwrite("RIFF", 4, 1, s->fid); // ChunkSize (0 for now, actual size will be written in _close) - fwrite(write_little_endian(0, buf, 4), 4, 1, s->fid); + written += fwrite(write_little_endian(0, buf, 4), 4, 1, s->fid); // Format - fwrite("WAVE", 4, 1, s->fid); + written += fwrite("WAVE", 4, 1, s->fid); // Subchunk1ID - fwrite("fmt ", 4, 1, s->fid); + written += fwrite("fmt ", 4, 1, s->fid); // Subchunk1Size - fwrite(write_little_endian(16, buf, 4), 4, 1, s->fid); + written += fwrite(write_little_endian(16, buf, 4), 4, 1, s->fid); // AudioFormat - fwrite(write_little_endian(1, buf, 2), 2, 1, s->fid); + written += fwrite(write_little_endian(1, buf, 2), 2, 1, s->fid); // NumChannels - fwrite(write_little_endian(s->channels, buf, 2), 2, 1, s->fid); + written += fwrite(write_little_endian(s->channels, buf, 2), 2, 1, s->fid); // SampleRate - fwrite(write_little_endian(s->samplerate, buf, 4), 4, 1, s->fid); + written += fwrite(write_little_endian(s->samplerate, buf, 4), 4, 1, s->fid); // ByteRate byterate = s->samplerate * s->channels * s->bitspersample / 8; - fwrite(write_little_endian(byterate, buf, 4), 4, 1, s->fid); + written += fwrite(write_little_endian(byterate, buf, 4), 4, 1, s->fid); // BlockAlign blockalign = 
s->channels * s->bitspersample / 8; - fwrite(write_little_endian(blockalign, buf, 2), 2, 1, s->fid); + written += fwrite(write_little_endian(blockalign, buf, 2), 2, 1, s->fid); // BitsPerSample - fwrite(write_little_endian(s->bitspersample, buf, 2), 2, 1, s->fid); + written += fwrite(write_little_endian(s->bitspersample, buf, 2), 2, 1, s->fid); // Subchunk2ID - fwrite("data", 4, 1, s->fid); + written += fwrite("data", 4, 1, s->fid); // Subchunk1Size (0 for now, actual size will be written in _close) - fwrite(write_little_endian(0, buf, 4), 4, 1, s->fid); + written += fwrite(write_little_endian(0, buf, 4), 4, 1, s->fid); + + // fwrite(*, *, 1, s->fid) was called 13 times, check success + if (written != 13 || fflush(s->fid)) { + AUBIO_STRERR("sink_wavwrite: writing header to %s failed" + " (wrote %d/%d, %s)\n", s->path, written, 13, errorstr); + fclose(s->fid); + s->fid = NULL; + return AUBIO_FAIL; + } s->scratch_size = s->max_size * s->channels; /* allocate data for de/interleaving reallocated when needed. 
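The header written by `aubio_sink_wavwrite_open` follows the canonical 44-byte RIFF layout, with `ByteRate` and `BlockAlign` derived from the stream parameters exactly as the hunk shows. A sketch of those two derivations (field names follow the WAV format description):

```c
/* ByteRate = SampleRate * NumChannels * BitsPerSample / 8 */
static unsigned int wav_byterate(unsigned int samplerate,
    unsigned int channels, unsigned int bitspersample)
{
  return samplerate * channels * bitspersample / 8;
}

/* BlockAlign = NumChannels * BitsPerSample / 8 (bytes per frame) */
static unsigned int wav_blockalign(unsigned int channels,
    unsigned int bitspersample)
{
  return channels * bitspersample / 8;
}
```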
*/ - if (s->scratch_size >= MAX_SIZE * MAX_CHANNELS) { + if (s->scratch_size >= MAX_SIZE * AUBIO_MAX_CHANNELS) { AUBIO_ERR("sink_wavwrite: %d x %d exceeds SIZE maximum buffer size %d\n", - s->max_size, s->channels, MAX_SIZE * MAX_CHANNELS); + s->max_size, s->channels, MAX_SIZE * AUBIO_MAX_CHANNELS); goto beach; } s->scratch_data = AUBIO_ARRAY(unsigned short,s->scratch_size); @@ -222,76 +234,82 @@ beach: return AUBIO_FAIL; } +static +void aubio_sink_wavwrite_write_frames(aubio_sink_wavwrite_t *s, uint_t write) +{ + uint_t written_frames = 0; -void aubio_sink_wavwrite_do(aubio_sink_wavwrite_t *s, fvec_t * write_data, uint_t write){ - uint_t i = 0, written_frames = 0; + written_frames = fwrite(s->scratch_data, 2 * s->channels, write, s->fid); - if (write > s->max_size) { - AUBIO_WRN("sink_wavwrite: trying to write %d frames to %s, " - "but only %d can be written at a time\n", write, s->path, s->max_size); - write = s->max_size; + if (written_frames != write) { + AUBIO_STRERR("sink_wavwrite: trying to write %d frames to %s, but only %d" + " could be written (%s)\n", write, s->path, written_frames, errorstr); } + s->total_frames_written += written_frames; +} - for (i = 0; i < write; i++) { - s->scratch_data[i] = HTOLES(FLOAT_TO_SHORT(write_data->data[i])); - } - written_frames = fwrite(s->scratch_data, 2, write, s->fid); +void aubio_sink_wavwrite_do(aubio_sink_wavwrite_t *s, fvec_t * write_data, uint_t write){ + uint_t c = 0, i = 0; + uint_t length = aubio_sink_validate_input_length("sink_wavwrite", s->path, + s->max_size, write_data->length, write); - if (written_frames != write) { - AUBIO_WRN("sink_wavwrite: trying to write %d frames to %s, " - "but only %d could be written\n", write, s->path, written_frames); + for (c = 0; c < s->channels; c++) { + for (i = 0; i < length; i++) { + s->scratch_data[i * s->channels + c] = HTOLES(FLOAT_TO_SHORT(write_data->data[i])); + } } - s->total_frames_written += written_frames; - return; + + aubio_sink_wavwrite_write_frames(s, 
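Before being byte-swapped to little-endian by `HTOLES`, samples are scaled from `smpl_t` to 16-bit PCM with the `FLOAT_TO_SHORT` macro defined near the top of `sink_wavwrite.c`. A sketch of that scaling, isolated from the endianness swap:

```c
/* The conversion macro from sink_wavwrite.c: scale float samples in
 * [-1, 1) to 16-bit PCM. Note it does not clip: an input of exactly
 * +1.0 would overflow a 16-bit short. */
#define FLOAT_TO_SHORT(x) ((short)((x) * 32768))

static short float_to_short(float x) { return FLOAT_TO_SHORT(x); }
```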
length); } void aubio_sink_wavwrite_do_multi(aubio_sink_wavwrite_t *s, fmat_t * write_data, uint_t write){ - uint_t c = 0, i = 0, written_frames = 0; + uint_t c = 0, i = 0; - if (write > s->max_size) { - AUBIO_WRN("sink_wavwrite: trying to write %d frames to %s, " - "but only %d can be written at a time\n", write, s->path, s->max_size); - write = s->max_size; - } + uint_t channels = aubio_sink_validate_input_channels("sink_wavwrite", s->path, + s->channels, write_data->height); + uint_t length = aubio_sink_validate_input_length("sink_wavwrite", s->path, + s->max_size, write_data->length, write); - for (c = 0; c < s->channels; c++) { - for (i = 0; i < write; i++) { + for (c = 0; c < channels; c++) { + for (i = 0; i < length; i++) { s->scratch_data[i * s->channels + c] = HTOLES(FLOAT_TO_SHORT(write_data->data[c][i])); } } - written_frames = fwrite(s->scratch_data, 2, write * s->channels, s->fid); - if (written_frames != write * s->channels) { - AUBIO_WRN("sink_wavwrite: trying to write %d frames to %s, " - "but only %d could be written\n", write, s->path, written_frames / s->channels); - } - s->total_frames_written += written_frames; - return; + aubio_sink_wavwrite_write_frames(s, length); } uint_t aubio_sink_wavwrite_close(aubio_sink_wavwrite_t * s) { uint_t data_size = s->total_frames_written * s->bitspersample * s->channels / 8; unsigned char buf[5]; + size_t written = 0, err = 0; if (!s->fid) return AUBIO_FAIL; // ChunkSize - fseek(s->fid, 4, SEEK_SET); - fwrite(write_little_endian(data_size + 36, buf, 4), 4, 1, s->fid); + err += fseek(s->fid, 4, SEEK_SET); + written += fwrite(write_little_endian(data_size + 36, buf, 4), 4, 1, s->fid); // Subchunk2Size - fseek(s->fid, 40, SEEK_SET); - fwrite(write_little_endian(data_size, buf, 4), 4, 1, s->fid); + err += fseek(s->fid, 40, SEEK_SET); + written += fwrite(write_little_endian(data_size, buf, 4), 4, 1, s->fid); + if (written != 2 || err != 0) { + AUBIO_STRERR("sink_wavwrite: updating header of %s failed, expected %d" 
+ " write but got only %d (%s)\n", s->path, 2, written, errorstr); + } // close file if (fclose(s->fid)) { - AUBIO_ERR("sink_wavwrite: Error closing file %s (%s)\n", s->path, strerror(errno)); + AUBIO_STRERR("sink_wavwrite: Error closing file %s (%s)\n", s->path, errorstr); } s->fid = NULL; return AUBIO_OK; } void del_aubio_sink_wavwrite(aubio_sink_wavwrite_t * s){ - if (!s) return; - aubio_sink_wavwrite_close(s); - if (s->path) AUBIO_FREE(s->path); - AUBIO_FREE(s->scratch_data); + AUBIO_ASSERT(s); + if (s->fid) + aubio_sink_wavwrite_close(s); + if (s->path) + AUBIO_FREE(s->path); + if (s->scratch_data) + AUBIO_FREE(s->scratch_data); AUBIO_FREE(s); } diff --git a/src/io/source.c b/src/io/source.c index 246eb90..2fbbb6d 100644 --- a/src/io/source.c +++ b/src/io/source.c @@ -18,7 +18,6 @@ */ -#include "config.h" #include "aubio_priv.h" #include "fvec.h" #include "fmat.h" @@ -115,9 +114,14 @@ aubio_source_t * new_aubio_source(const char_t * uri, uint_t samplerate, uint_t return s; } #endif /* HAVE_WAVREAD */ - AUBIO_ERROR("source: failed creating aubio source with %s" - " at samplerate %d with hop_size %d\n", uri, samplerate, hop_size); - AUBIO_FREE(s); +#if !defined(HAVE_WAVREAD) && \ + !defined(HAVE_LIBAV) && \ + !defined(HAVE_SOURCE_APPLE_AUDIO) && \ + !defined(HAVE_SNDFILE) + AUBIO_ERROR("source: failed creating with %s at %dHz with hop size %d" + " (no source built-in)\n", uri, samplerate, hop_size); +#endif + del_aubio_source(s); return NULL; } @@ -134,8 +138,9 @@ uint_t aubio_source_close(aubio_source_t * s) { } void del_aubio_source(aubio_source_t * s) { - if (!s) return; - s->s_del((void *)s->source); + //AUBIO_ASSERT(s); + if (s && s->s_del && s->source) + s->s_del((void *)s->source); AUBIO_FREE(s); } diff --git a/src/io/source.h b/src/io/source.h index 2df2584..ff9b8be 100644 --- a/src/io/source.h +++ b/src/io/source.h @@ -59,7 +59,6 @@ A simple source to read from 16-bits PCM RIFF encoded WAV files. 
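As the `_close` hunk above shows, the writer seeks back to byte offsets 4 and 40 to patch `ChunkSize` and `Subchunk2Size` once the total frame count is known. A sketch of the two values it writes there:

```c
/* Subchunk2Size: raw PCM payload size in bytes. */
static unsigned int wav_data_size(unsigned int frames, unsigned int channels,
    unsigned int bitspersample)
{
  return frames * bitspersample * channels / 8;
}

/* ChunkSize: everything after the 8-byte RIFF chunk header, i.e. the
 * 36 remaining header bytes plus the data payload. */
static unsigned int wav_chunk_size(unsigned int data_size)
{
  return data_size + 36;
}
```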
\example io/test-source.c - \example io/test-source_multi.c */ diff --git a/src/io/source_apple_audio.c b/src/io/source_apple_audio.c index a86d4a3..733f7e0 100644 --- a/src/io/source_apple_audio.c +++ b/src/io/source_apple_audio.c @@ -18,13 +18,13 @@ */ -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_SOURCE_APPLE_AUDIO -#include "aubio_priv.h" #include "fvec.h" #include "fmat.h" +#include "ioutils.h" #include "io/source_apple_audio.h" // ExtAudioFileRef, AudioStreamBasicDescription, AudioBufferList, ... @@ -35,8 +35,6 @@ #define RT_BYTE3( a ) ( ((a) >> 16) & 0xff ) #define RT_BYTE4( a ) ( ((a) >> 24) & 0xff ) -#define SHORT_TO_FLOAT(x) (smpl_t)(x * 3.0517578125e-05) - struct _aubio_source_apple_audio_t { uint_t channels; uint_t samplerate; //< requested samplerate @@ -49,7 +47,7 @@ struct _aubio_source_apple_audio_t { AudioBufferList bufferList; }; -extern int createAubioBufferList(AudioBufferList *bufferList, int channels, int max_source_samples); +extern int createAudioBufferList(AudioBufferList *bufferList, int channels, int max_source_samples); extern void freeAudioBufferList(AudioBufferList *bufferList); extern CFURLRef createURLFromPath(const char * path); char_t *getPrintableOSStatusError(char_t *str, OSStatus error); @@ -60,7 +58,7 @@ aubio_source_apple_audio_t * new_aubio_source_apple_audio(const char_t * path, u { aubio_source_apple_audio_t * s = AUBIO_NEW(aubio_source_apple_audio_t); - if (path == NULL) { + if (path == NULL || strnlen(path, PATH_MAX) < 1) { AUBIO_ERROR("source_apple_audio: Aborted opening null path\n"); goto beach; } @@ -86,7 +84,7 @@ aubio_source_apple_audio_t * new_aubio_source_apple_audio(const char_t * path, u return s; beach: - AUBIO_FREE(s); + del_aubio_source_apple_audio(s); return NULL; } @@ -95,7 +93,6 @@ uint_t aubio_source_apple_audio_open (aubio_source_apple_audio_t *s, const char_ OSStatus err = noErr; UInt32 propSize; - if (s->path) AUBIO_FREE(s->path); s->path = AUBIO_ARRAY(char_t, strnlen(path, PATH_MAX) + 1); 
strncpy(s->path, path, strnlen(path, PATH_MAX) + 1); @@ -140,17 +137,16 @@ uint_t aubio_source_apple_audio_open (aubio_source_apple_audio_t *s, const char_ s->channels = fileFormat.mChannelsPerFrame; AudioStreamBasicDescription clientFormat; - propSize = sizeof(clientFormat); + propSize = sizeof(AudioStreamBasicDescription); memset(&clientFormat, 0, sizeof(AudioStreamBasicDescription)); clientFormat.mFormatID = kAudioFormatLinearPCM; clientFormat.mSampleRate = (Float64)(s->samplerate); - clientFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked; + clientFormat.mFormatFlags = kAudioFormatFlagIsFloat; clientFormat.mChannelsPerFrame = s->channels; - clientFormat.mBitsPerChannel = sizeof(short) * 8; + clientFormat.mBitsPerChannel = sizeof(smpl_t) * 8; clientFormat.mFramesPerPacket = 1; clientFormat.mBytesPerFrame = clientFormat.mBitsPerChannel * clientFormat.mChannelsPerFrame / 8; clientFormat.mBytesPerPacket = clientFormat.mFramesPerPacket * clientFormat.mBytesPerFrame; - clientFormat.mReserved = 0; // set the client format description err = ExtAudioFileSetProperty(s->audioFile, kExtAudioFileProperty_ClientDataFormat, @@ -188,7 +184,7 @@ uint_t aubio_source_apple_audio_open (aubio_source_apple_audio_t *s, const char_ // allocate the AudioBufferList freeAudioBufferList(&s->bufferList); - if (createAubioBufferList(&s->bufferList, s->channels, s->block_size * s->channels)) { + if (createAudioBufferList(&s->bufferList, s->channels, s->block_size * s->channels)) { AUBIO_ERR("source_apple_audio: failed creating bufferList\n"); goto beach; } @@ -197,90 +193,68 @@ beach: return err; } -void aubio_source_apple_audio_do(aubio_source_apple_audio_t *s, fvec_t * read_to, uint_t * read) { - UInt32 c, v, loadedPackets = s->block_size; +static UInt32 aubio_source_apple_audio_read_frame(aubio_source_apple_audio_t *s) +{ + UInt32 loadedPackets = s->block_size; OSStatus err = ExtAudioFileRead(s->audioFile, &loadedPackets, &s->bufferList); if (err) { char_t 
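The client-format hunk above switches from packed signed integers to native float (`kAudioFormatFlagIsFloat`, `sizeof(smpl_t) * 8` bits per channel) and derives the packing fields from that. A sketch of the byte math behind `mBytesPerFrame` for the linear-PCM layout being configured:

```c
/* For an interleaved linear-PCM client format with one frame per packet:
 * bytes per frame = bits per channel * channels / 8, and bytes per packet
 * equals bytes per frame. */
static unsigned int bytes_per_frame(unsigned int bits_per_channel,
    unsigned int channels)
{
  return bits_per_channel * channels / 8;
}
```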
errorstr[20]; AUBIO_ERROR("source_apple_audio: error while reading %s " "with ExtAudioFileRead (%s)\n", s->path, getPrintableOSStatusError(errorstr, err)); - goto beach; } + return loadedPackets; +} - short *data = (short*)s->bufferList.mBuffers[0].mData; +void aubio_source_apple_audio_do(aubio_source_apple_audio_t *s, fvec_t * read_to, + uint_t * read) { + uint_t c, v; + UInt32 loadedPackets = aubio_source_apple_audio_read_frame(s); + uint_t length = aubio_source_validate_input_length("source_apple_audio", + s->path, s->block_size, read_to->length); + smpl_t *data = (smpl_t*)s->bufferList.mBuffers[0].mData; - smpl_t *buf = read_to->data; + length = MIN(loadedPackets, length); - for (v = 0; v < loadedPackets; v++) { - buf[v] = 0.; + for (v = 0; v < length; v++) { + read_to->data[v] = 0.; for (c = 0; c < s->channels; c++) { - buf[v] += SHORT_TO_FLOAT(data[ v * s->channels + c]); + read_to->data[v] += data[ v * s->channels + c]; } - buf[v] /= (smpl_t)s->channels; + read_to->data[v] /= (smpl_t)s->channels; } // short read, fill with zeros - if (loadedPackets < s->block_size) { - for (v = loadedPackets; v < s->block_size; v++) { - buf[v] = 0.; - } - } + aubio_source_pad_output(read_to, length); - *read = (uint_t)loadedPackets; - return; -beach: - *read = 0; - return; + *read = (uint_t)length; } void aubio_source_apple_audio_do_multi(aubio_source_apple_audio_t *s, fmat_t * read_to, uint_t * read) { - UInt32 c, v, loadedPackets = s->block_size; - OSStatus err = ExtAudioFileRead(s->audioFile, &loadedPackets, &s->bufferList); - if (err) { - char_t errorstr[20]; - AUBIO_ERROR("source_apple_audio: error while reading %s " - "with ExtAudioFileRead (%s)\n", s->path, - getPrintableOSStatusError(errorstr, err)); - goto beach; + uint_t c, v; + uint_t length = aubio_source_validate_input_length("source_apple_audio", + s->path, s->block_size, read_to->length); + uint_t channels = aubio_source_validate_input_channels("source_apple_audio", + s->path, s->channels, read_to->height); + 
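With the client format now delivering native floats, the `_do` hunk above downmixes interleaved frames to mono by summing and averaging channels directly, instead of converting each short with `SHORT_TO_FLOAT` first. A standalone sketch of that downmix loop:

```c
/* Downmix `frames` interleaved multi-channel samples to mono by averaging
 * the channels of each frame, as in aubio_source_apple_audio_do. */
static void downmix(float *mono, const float *interleaved,
    unsigned int frames, unsigned int channels)
{
  unsigned int v, c;
  for (v = 0; v < frames; v++) {
    mono[v] = 0.f;
    for (c = 0; c < channels; c++)
      mono[v] += interleaved[v * channels + c];
    mono[v] /= (float)channels;
  }
}
```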
UInt32 loadedPackets = aubio_source_apple_audio_read_frame(s); + smpl_t *data = (smpl_t*)s->bufferList.mBuffers[0].mData; + + length = MIN(loadedPackets, length); + + for (v = 0; v < length; v++) { + for (c = 0; c < channels; c++) { + read_to->data[c][v] = data[ v * s->channels + c]; + } } - short *data = (short*)s->bufferList.mBuffers[0].mData; + aubio_source_pad_multi_output(read_to, s->channels, (uint_t)length); - smpl_t **buf = read_to->data; - - for (v = 0; v < loadedPackets; v++) { - for (c = 0; c < read_to->height; c++) { - buf[c][v] = SHORT_TO_FLOAT(data[ v * s->channels + c]); - } - } - // if read_data has more channels than the file - if (read_to->height > s->channels) { - // copy last channel to all additional channels - for (v = 0; v < loadedPackets; v++) { - for (c = s->channels; c < read_to->height; c++) { - buf[c][v] = SHORT_TO_FLOAT(data[ v * s->channels + (s->channels - 1)]); - } - } - } - // short read, fill with zeros - if (loadedPackets < s->block_size) { - for (v = loadedPackets; v < s->block_size; v++) { - for (c = 0; c < read_to->height; c++) { - buf[c][v] = 0.; - } - } - } - *read = (uint_t)loadedPackets; - return; -beach: - *read = 0; - return; + *read = (uint_t)length; } uint_t aubio_source_apple_audio_close (aubio_source_apple_audio_t *s) { OSStatus err = noErr; - if (!s->audioFile) { return AUBIO_FAIL; } + if (!s->audioFile) { return AUBIO_OK; } err = ExtAudioFileDispose(s->audioFile); s->audioFile = NULL; if (err) { @@ -294,11 +268,11 @@ uint_t aubio_source_apple_audio_close (aubio_source_apple_audio_t *s) } void del_aubio_source_apple_audio(aubio_source_apple_audio_t * s){ + AUBIO_ASSERT(s); aubio_source_apple_audio_close (s); if (s->path) AUBIO_FREE(s->path); freeAudioBufferList(&s->bufferList); AUBIO_FREE(s); - return; } uint_t aubio_source_apple_audio_seek (aubio_source_apple_audio_t * s, uint_t pos) { @@ -324,7 +298,7 @@ uint_t aubio_source_apple_audio_seek (aubio_source_apple_audio_t * s, uint_t pos } // after a short read, the 
bufferList size needs to resetted to prepare for a full read AudioBufferList *bufferList = &s->bufferList; - bufferList->mBuffers[0].mDataByteSize = s->block_size * s->channels * sizeof (short); + bufferList->mBuffers[0].mDataByteSize = s->block_size * s->channels * sizeof (smpl_t); // do the actual seek err = ExtAudioFileSeek(s->audioFile, resampled_pos); if (err) { @@ -370,7 +344,7 @@ uint_t aubio_source_apple_audio_get_duration(const aubio_source_apple_audio_t * AUBIO_ERROR("source_apple_audio: Failed getting %s duration, " "error in ExtAudioFileGetProperty (%s)\n", s->path, getPrintableOSStatusError(errorstr, err)); - return err; + return 0; } return (uint_t)fileLengthFrames; } diff --git a/src/io/source_avcodec.c b/src/io/source_avcodec.c index faf8015..5b25d85 100644 --- a/src/io/source_avcodec.c +++ b/src/io/source_avcodec.c @@ -18,32 +18,55 @@ */ - -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_LIBAV +#include <libavcodec/avcodec.h> +#include <libavformat/avformat.h> +#if defined(HAVE_SWRESAMPLE) +#include <libswresample/swresample.h> +#elif defined(HAVE_AVRESAMPLE) +#include <libavresample/avresample.h> +#endif +#include <libavutil/opt.h> + // determine whether we use libavformat from ffmpeg or from libav #define FFMPEG_LIBAVFORMAT (LIBAVFORMAT_VERSION_MICRO > 99 ) -// max_analyze_duration2 was used from ffmpeg libavformat 55.43.100 through 57.2.100 +// max_analyze_duration2 was used from ffmpeg libavformat 55.43.100 -> 57.2.100 #define FFMPEG_LIBAVFORMAT_MAX_DUR2 FFMPEG_LIBAVFORMAT && ( \ (LIBAVFORMAT_VERSION_MAJOR == 55 && LIBAVFORMAT_VERSION_MINOR >= 43) \ || (LIBAVFORMAT_VERSION_MAJOR == 56) \ || (LIBAVFORMAT_VERSION_MAJOR == 57 && LIBAVFORMAT_VERSION_MINOR < 2) \ ) -#include <libavcodec/avcodec.h> -#include <libavformat/avformat.h> -#include <libavresample/avresample.h> -#include <libavutil/opt.h> -#include <stdlib.h> +// backward compatibility with libavcodec55 +#if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(57,0,0) +#define 
HAVE_AUBIO_LIBAVCODEC_DEPRECATED 1 +#endif + +#if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(58,3,102) +#define HAVE_AUBIO_LIBAVCODEC_TIMEBASE_FIX 1 +#endif + +#if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(55,28,1) +#warning "libavcodec < 56 is deprecated" +#define av_frame_alloc avcodec_alloc_frame +#define av_frame_free avcodec_free_frame +#define av_packet_unref av_free_packet +#endif #include "aubio_priv.h" #include "fvec.h" #include "fmat.h" +#include "ioutils.h" #include "source_avcodec.h" +#if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(56, 56, 0) #define AUBIO_AVCODEC_MAX_BUFFER_SIZE FF_MIN_BUFFER_SIZE +#else +#define AUBIO_AVCODEC_MAX_BUFFER_SIZE AV_INPUT_BUFFER_MIN_SIZE +#endif struct _aubio_source_avcodec_t { uint_t hop_size; @@ -59,18 +82,24 @@ struct _aubio_source_avcodec_t { AVFormatContext *avFormatCtx; AVCodecContext *avCodecCtx; AVFrame *avFrame; + AVPacket avPacket; +#ifdef HAVE_AVRESAMPLE AVAudioResampleContext *avr; - float *output; +#elif defined(HAVE_SWRESAMPLE) + SwrContext *avr; +#endif + smpl_t *output; uint_t read_samples; uint_t read_index; sint_t selected_stream; uint_t eof; - uint_t multi; }; -// hack to create or re-create the context the first time _do or _do_multi is called -void aubio_source_avcodec_reset_resampler(aubio_source_avcodec_t * s, uint_t multi); -void aubio_source_avcodec_readframe(aubio_source_avcodec_t *s, uint_t * read_samples); +// create or re-create the context when _do or _do_multi is called +void aubio_source_avcodec_reset_resampler(aubio_source_avcodec_t * s); +// actually read a frame +void aubio_source_avcodec_readframe(aubio_source_avcodec_t *s, + uint_t * read_samples); uint_t aubio_source_avcodec_has_network_url(aubio_source_avcodec_t *s); @@ -87,38 +116,50 @@ uint_t aubio_source_avcodec_has_network_url(aubio_source_avcodec_t *s) { } -aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t samplerate, uint_t hop_size) { +aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, + 
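The compatibility guards added above compare `LIBAVCODEC_VERSION_INT` against `AV_VERSION_INT(major, minor, micro)`, which packs the three components into one integer so versions compare with ordinary `<`/`>`. A sketch of that packing, mirroring how FFmpeg's `AV_VERSION_INT` macro appears to work:

```c
/* Pack (major, minor, micro) into a single comparable integer:
 * 16 bits of major, 8 of minor, 8 of micro. */
#define VERSION_INT(a, b, c) (((a) << 16) | ((b) << 8) | (c))

static int version_int(int a, int b, int c) { return VERSION_INT(a, b, c); }
```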
uint_t samplerate, uint_t hop_size) { aubio_source_avcodec_t * s = AUBIO_NEW(aubio_source_avcodec_t); + AVFormatContext *avFormatCtx = NULL; + AVCodecContext *avCodecCtx = NULL; + AVFrame *avFrame = NULL; + sint_t selected_stream = -1; +#if FF_API_LAVF_AVCTX + AVCodecParameters *codecpar; +#endif + AVCodec *codec; + uint_t i; int err; if (path == NULL) { AUBIO_ERR("source_avcodec: Aborted opening null path\n"); goto beach; } if ((sint_t)samplerate < 0) { - AUBIO_ERR("source_avcodec: Can not open %s with samplerate %d\n", path, samplerate); + AUBIO_ERR("source_avcodec: Can not open %s with samplerate %d\n", + path, samplerate); goto beach; } if ((sint_t)hop_size <= 0) { - AUBIO_ERR("source_avcodec: Can not open %s with hop_size %d\n", path, hop_size); + AUBIO_ERR("source_avcodec: Can not open %s with hop_size %d\n", + path, hop_size); goto beach; } s->hop_size = hop_size; s->channels = 1; - if (s->path) AUBIO_FREE(s->path); s->path = AUBIO_ARRAY(char_t, strnlen(path, PATH_MAX) + 1); strncpy(s->path, path, strnlen(path, PATH_MAX) + 1); +#if LIBAVFORMAT_VERSION_INT < AV_VERSION_INT(58,0,0) // register all formats and codecs av_register_all(); +#endif if (aubio_source_avcodec_has_network_url(s)) { avformat_network_init(); } // try opening the file and get some info about it - AVFormatContext *avFormatCtx = s->avFormatCtx; avFormatCtx = NULL; if ( (err = avformat_open_input(&avFormatCtx, s->path, NULL, NULL) ) < 0 ) { char errorstr[256]; @@ -138,8 +179,8 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa if ( (err = avformat_find_stream_info(avFormatCtx, NULL)) < 0 ) { char errorstr[256]; av_strerror (err, errorstr, sizeof(errorstr)); - AUBIO_ERR("source_avcodec: Could not find stream information " "for %s (%s)\n", s->path, - errorstr); + AUBIO_ERR("source_avcodec: Could not find stream information " + "for %s (%s)\n", s->path, errorstr); goto beach; } @@ -147,8 +188,6 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * 
path, uint_t sa //av_dump_format(avFormatCtx, 0, s->path, 0); // look for the first audio stream - uint_t i; - sint_t selected_stream = -1; for (i = 0; i < avFormatCtx->nb_streams; i++) { #if FF_API_LAVF_AVCTX if (avFormatCtx->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_AUDIO) { @@ -170,25 +209,25 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa //AUBIO_DBG("Taking stream %d in file %s\n", selected_stream, s->path); s->selected_stream = selected_stream; - AVCodecContext *avCodecCtx = s->avCodecCtx; #if FF_API_LAVF_AVCTX - AVCodecParameters *codecpar = avFormatCtx->streams[selected_stream]->codecpar; + codecpar = avFormatCtx->streams[selected_stream]->codecpar; if (codecpar == NULL) { AUBIO_ERR("source_avcodec: Could not find decoder for %s", s->path); goto beach; } - AVCodec *codec = avcodec_find_decoder(codecpar->codec_id); + codec = avcodec_find_decoder(codecpar->codec_id); /* Allocate a codec context for the decoder */ avCodecCtx = avcodec_alloc_context3(codec); if (!avCodecCtx) { - AUBIO_ERR("source_avcodec: Failed to allocate the %s codec context for path %s\n", - av_get_media_type_string(AVMEDIA_TYPE_AUDIO), s->path); + AUBIO_ERR("source_avcodec: Failed to allocate the %s codec context " + "for path %s\n", av_get_media_type_string(AVMEDIA_TYPE_AUDIO), + s->path); goto beach; } #else avCodecCtx = avFormatCtx->streams[selected_stream]->codec; - AVCodec *codec = avcodec_find_decoder(avCodecCtx->codec_id); + codec = avcodec_find_decoder(avCodecCtx->codec_id); #endif if (codec == NULL) { AUBIO_ERR("source_avcodec: Could not find decoder for %s", s->path); @@ -198,16 +237,23 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa #if FF_API_LAVF_AVCTX /* Copy codec parameters from input stream to output codec context */ if ((err = avcodec_parameters_to_context(avCodecCtx, codecpar)) < 0) { - AUBIO_ERR("source_avcodec: Failed to copy %s codec parameters to decoder context for %s\n", - 
av_get_media_type_string(AVMEDIA_TYPE_AUDIO), s->path); + AUBIO_ERR("source_avcodec: Failed to copy %s codec parameters to " + "decoder context for %s\n", + av_get_media_type_string(AVMEDIA_TYPE_AUDIO), s->path); goto beach; } +#if HAVE_AUBIO_LIBAVCODEC_TIMEBASE_FIX + // avoids 'skipped frames warning' with avecodec < 58, deprecated after + av_codec_set_pkt_timebase(avCodecCtx, + avFormatCtx->streams[selected_stream]->time_base); +#endif #endif if ( ( err = avcodec_open2(avCodecCtx, codec, NULL) ) < 0) { char errorstr[256]; av_strerror (err, errorstr, sizeof(errorstr)); - AUBIO_ERR("source_avcodec: Could not load codec for %s (%s)\n", s->path, errorstr); + AUBIO_ERR("source_avcodec: Could not load codec for %s (%s)\n", s->path, + errorstr); goto beach; } @@ -228,14 +274,14 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa s->input_samplerate, s->samplerate); } - AVFrame *avFrame = s->avFrame; avFrame = av_frame_alloc(); if (!avFrame) { AUBIO_ERR("source_avcodec: Could not allocate frame for (%s)\n", s->path); } /* allocate output for avr */ - s->output = (float *)av_malloc(AUBIO_AVCODEC_MAX_BUFFER_SIZE * sizeof(float)); + s->output = (smpl_t *)av_malloc(AUBIO_AVCODEC_MAX_BUFFER_SIZE + * sizeof(smpl_t)); s->read_samples = 0; s->read_index = 0; @@ -244,11 +290,11 @@ aubio_source_avcodec_t * new_aubio_source_avcodec(const char_t * path, uint_t sa s->avCodecCtx = avCodecCtx; s->avFrame = avFrame; - // default to mono output - aubio_source_avcodec_reset_resampler(s, 0); + aubio_source_avcodec_reset_resampler(s); + + if (s->avr == NULL) goto beach; s->eof = 0; - s->multi = 0; //av_log_set_level(AV_LOG_QUIET); @@ -261,47 +307,78 @@ beach: return NULL; } -void aubio_source_avcodec_reset_resampler(aubio_source_avcodec_t * s, uint_t multi) { - if ( (multi != s->multi) || (s->avr == NULL) ) { +void aubio_source_avcodec_reset_resampler(aubio_source_avcodec_t * s) +{ + // create or reset resampler to/from mono/multi-channel + if ( s->avr == 
NULL ) { + int err; int64_t input_layout = av_get_default_channel_layout(s->input_channels); - uint_t output_channels = multi ? s->input_channels : 1; - int64_t output_layout = av_get_default_channel_layout(output_channels); - if (s->avr != NULL) { - avresample_close( s->avr ); - av_free ( s->avr ); - s->avr = NULL; - } - AVAudioResampleContext *avr = s->avr; - avr = avresample_alloc_context(); - - av_opt_set_int(avr, "in_channel_layout", input_layout, 0); - av_opt_set_int(avr, "out_channel_layout", output_layout, 0); - av_opt_set_int(avr, "in_sample_rate", s->input_samplerate, 0); - av_opt_set_int(avr, "out_sample_rate", s->samplerate, 0); + int64_t output_layout = av_get_default_channel_layout(s->input_channels); +#ifdef HAVE_AVRESAMPLE + AVAudioResampleContext *avr = avresample_alloc_context(); +#elif defined(HAVE_SWRESAMPLE) + SwrContext *avr = swr_alloc(); +#endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ + + av_opt_set_int(avr, "in_channel_layout", input_layout, 0); + av_opt_set_int(avr, "out_channel_layout", output_layout, 0); + av_opt_set_int(avr, "in_sample_rate", s->input_samplerate, 0); + av_opt_set_int(avr, "out_sample_rate", s->samplerate, 0); av_opt_set_int(avr, "in_sample_fmt", s->avCodecCtx->sample_fmt, 0); - av_opt_set_int(avr, "out_sample_fmt", AV_SAMPLE_FMT_FLT, 0); - int err; - if ( ( err = avresample_open(avr) ) < 0) { +#if HAVE_AUBIO_DOUBLE + av_opt_set_int(avr, "out_sample_fmt", AV_SAMPLE_FMT_DBL, 0); +#else + av_opt_set_int(avr, "out_sample_fmt", AV_SAMPLE_FMT_FLT, 0); +#endif + // TODO: use planar? 
+ //av_opt_set_int(avr, "out_sample_fmt", AV_SAMPLE_FMT_FLTP, 0); +#ifdef HAVE_AVRESAMPLE + if ( ( err = avresample_open(avr) ) < 0) +#elif defined(HAVE_SWRESAMPLE) + if ( ( err = swr_init(avr) ) < 0) +#endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ + { char errorstr[256]; av_strerror (err, errorstr, sizeof(errorstr)); - AUBIO_ERR("source_avcodec: Could not open AVAudioResampleContext for %s (%s)\n", - s->path, errorstr); - //goto beach; + AUBIO_ERR("source_avcodec: Could not open resampling context" + " for %s (%s)\n", s->path, errorstr); return; } s->avr = avr; - s->multi = multi; } } -void aubio_source_avcodec_readframe(aubio_source_avcodec_t *s, uint_t * read_samples) { +void aubio_source_avcodec_readframe(aubio_source_avcodec_t *s, + uint_t * read_samples) +{ AVFormatContext *avFormatCtx = s->avFormatCtx; AVCodecContext *avCodecCtx = s->avCodecCtx; AVFrame *avFrame = s->avFrame; - AVPacket avPacket; - av_init_packet (&avPacket); + AVPacket avPacket = s->avPacket; +#ifdef HAVE_AVRESAMPLE AVAudioResampleContext *avr = s->avr; - float *output = s->output; +#elif defined(HAVE_SWRESAMPLE) + SwrContext *avr = s->avr; +#endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ + int got_frame = 0; +#ifdef HAVE_AVRESAMPLE + int in_linesize = 0; + int in_samples = avFrame->nb_samples; + int out_linesize = 0; + int max_out_samples = AUBIO_AVCODEC_MAX_BUFFER_SIZE; + int out_samples = 0; +#elif defined(HAVE_SWRESAMPLE) + int in_samples = avFrame->nb_samples; + int max_out_samples = AUBIO_AVCODEC_MAX_BUFFER_SIZE / avCodecCtx->channels; + int out_samples = 0; +#endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ + smpl_t *output = s->output; +#ifndef FF_API_LAVF_AVCTX + int len = 0; +#else + int ret = 0; +#endif + av_init_packet (&avPacket); *read_samples = 0; do @@ -314,83 +391,115 @@ void aubio_source_avcodec_readframe(aubio_source_avcodec_t *s, uint_t * read_sam if (err != 0) { char errorstr[256]; av_strerror (err, errorstr, sizeof(errorstr)); - AUBIO_ERR("Could not read frame in %s 
(%s)\n", s->path, errorstr); + AUBIO_ERR("source_avcodec: could not read frame in %s (%s)\n", + s->path, errorstr); + s->eof = 1; goto beach; } } while (avPacket.stream_index != s->selected_stream); - int got_frame = 0; #if FF_API_LAVF_AVCTX - int ret = avcodec_send_packet(avCodecCtx, &avPacket); - if (ret < 0 && ret != AVERROR_EOF) { - AUBIO_ERR("source_avcodec: error when sending packet for %s\n", s->path); - goto beach; - } - ret = avcodec_receive_frame(avCodecCtx, avFrame); + ret = avcodec_send_packet(avCodecCtx, &avPacket); + if (ret < 0 && ret != AVERROR_EOF) { + AUBIO_ERR("source_avcodec: error when sending packet for %s\n", s->path); + goto beach; + } + ret = avcodec_receive_frame(avCodecCtx, avFrame); if (ret >= 0) { - got_frame = 1; - } - if (ret < 0) { + got_frame = 1; + } + if (ret < 0) { if (ret == AVERROR(EAGAIN)) { - AUBIO_WRN("source_avcodec: output is not available right now - user must try to send new input\n"); + //AUBIO_WRN("source_avcodec: output is not available right now - " + // "user must try to send new input\n"); + goto beach; } else if (ret == AVERROR_EOF) { - AUBIO_WRN("source_avcodec: the decoder has been fully flushed, and there will be no more output frames\n"); + AUBIO_WRN("source_avcodec: the decoder has been fully flushed, " + "and there will be no more output frames\n"); } else { AUBIO_ERR("source_avcodec: decoding errors on %s\n", s->path); - goto beach; + goto beach; } } #else - int len = avcodec_decode_audio4(avCodecCtx, avFrame, &got_frame, &avPacket); + len = avcodec_decode_audio4(avCodecCtx, avFrame, &got_frame, &avPacket); if (len < 0) { - AUBIO_ERR("Error while decoding %s\n", s->path); + AUBIO_ERR("source_avcodec: error while decoding %s\n", s->path); goto beach; } #endif if (got_frame == 0) { - //AUBIO_ERR("Could not get frame for (%s)\n", s->path); + AUBIO_WRN("source_avcodec: did not get a frame when reading %s\n", + s->path); goto beach; } - int in_linesize = 0; +#if LIBAVUTIL_VERSION_MAJOR > 52 + if 
(avFrame->channels != (sint_t)s->input_channels) { + AUBIO_WRN ("source_avcodec: trying to read from %d channel(s)," + "but configured for %d; is '%s' corrupt?\n", + avFrame->channels, s->input_channels, s->path); + goto beach; + } +#else +#warning "avutil < 53 is deprecated, crashes might occur on corrupt files" +#endif + +#ifdef HAVE_AVRESAMPLE + in_linesize = 0; av_samples_get_buffer_size(&in_linesize, avCodecCtx->channels, avFrame->nb_samples, avCodecCtx->sample_fmt, 1); - int in_samples = avFrame->nb_samples; - int out_linesize = 0; - int max_out_samples = AUBIO_AVCODEC_MAX_BUFFER_SIZE; - int out_samples = avresample_convert ( avr, + in_samples = avFrame->nb_samples; + out_linesize = 0; + max_out_samples = AUBIO_AVCODEC_MAX_BUFFER_SIZE; + out_samples = avresample_convert ( avr, (uint8_t **)&output, out_linesize, max_out_samples, (uint8_t **)avFrame->data, in_linesize, in_samples); - if (out_samples <= 0) { - //AUBIO_ERR("No sample found while converting frame (%s)\n", s->path); +#elif defined(HAVE_SWRESAMPLE) + in_samples = avFrame->nb_samples; + max_out_samples = AUBIO_AVCODEC_MAX_BUFFER_SIZE / avCodecCtx->channels; + out_samples = swr_convert( avr, + (uint8_t **)&output, max_out_samples, + (const uint8_t **)avFrame->data, in_samples); +#endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ + if (out_samples < 0) { + AUBIO_WRN("source_avcodec: error while resampling %s (%d)\n", + s->path, out_samples); goto beach; } *read_samples = out_samples; beach: - s->avFormatCtx = avFormatCtx; - s->avCodecCtx = avCodecCtx; - s->avFrame = avFrame; - s->avr = avr; - s->output = output; - av_packet_unref(&avPacket); } -void aubio_source_avcodec_do(aubio_source_avcodec_t * s, fvec_t * read_data, uint_t * read){ - if (s->multi == 1) aubio_source_avcodec_reset_resampler(s, 0); - uint_t i; +void aubio_source_avcodec_do(aubio_source_avcodec_t * s, fvec_t * read_data, + uint_t * read) { + uint_t i, j; uint_t end = 0; uint_t total_wrote = 0; - while (total_wrote < s->hop_size) { - end = 
MIN(s->read_samples - s->read_index, s->hop_size - total_wrote); + uint_t length = aubio_source_validate_input_length("source_avcodec", s->path, + s->hop_size, read_data->length); + if (!s->avr || !s->avFormatCtx || !s->avCodecCtx) { + AUBIO_ERR("source_avcodec: could not read from %s (file was closed)\n", + s->path); + *read= 0; + return; + } + while (total_wrote < length) { + end = MIN(s->read_samples - s->read_index, length - total_wrote); for (i = 0; i < end; i++) { - read_data->data[i + total_wrote] = s->output[i + s->read_index]; + read_data->data[i + total_wrote] = 0.; + for (j = 0; j < s->input_channels; j++) { + read_data->data[i + total_wrote] += + s->output[(i + s->read_index) * s->input_channels + j]; + } + read_data->data[i + total_wrote] *= 1./s->input_channels; } total_wrote += end; - if (total_wrote < s->hop_size) { + if (total_wrote < length) { uint_t avcodec_read = 0; aubio_source_avcodec_readframe(s, &avcodec_read); s->read_samples = avcodec_read; @@ -402,29 +511,37 @@ void aubio_source_avcodec_do(aubio_source_avcodec_t * s, fvec_t * read_data, uin s->read_index += end; } } - if (total_wrote < s->hop_size) { - for (i = end; i < s->hop_size; i++) { - read_data->data[i] = 0.; - } - } + + aubio_source_pad_output(read_data, total_wrote); + *read = total_wrote; } -void aubio_source_avcodec_do_multi(aubio_source_avcodec_t * s, fmat_t * read_data, uint_t * read){ - if (s->multi == 0) aubio_source_avcodec_reset_resampler(s, 1); +void aubio_source_avcodec_do_multi(aubio_source_avcodec_t * s, + fmat_t * read_data, uint_t * read) { uint_t i,j; uint_t end = 0; uint_t total_wrote = 0; - while (total_wrote < s->hop_size) { - end = MIN(s->read_samples - s->read_index, s->hop_size - total_wrote); - for (j = 0; j < read_data->height; j++) { + uint_t length = aubio_source_validate_input_length("source_avcodec", s->path, + s->hop_size, read_data->length); + uint_t channels = aubio_source_validate_input_channels("source_avcodec", + s->path, s->input_channels, 
read_data->height); + if (!s->avr || !s->avFormatCtx || !s->avCodecCtx) { + AUBIO_ERR("source_avcodec: could not read from %s (file was closed)\n", + s->path); + *read= 0; + return; + } + while (total_wrote < length) { + end = MIN(s->read_samples - s->read_index, length - total_wrote); + for (j = 0; j < channels; j++) { for (i = 0; i < end; i++) { read_data->data[j][i + total_wrote] = s->output[(i + s->read_index) * s->input_channels + j]; } } total_wrote += end; - if (total_wrote < s->hop_size) { + if (total_wrote < length) { uint_t avcodec_read = 0; aubio_source_avcodec_readframe(s, &avcodec_read); s->read_samples = avcodec_read; @@ -436,13 +553,9 @@ void aubio_source_avcodec_do_multi(aubio_source_avcodec_t * s, fmat_t * read_dat s->read_index += end; } } - if (total_wrote < s->hop_size) { - for (j = 0; j < read_data->height; j++) { - for (i = end; i < s->hop_size; i++) { - read_data->data[j][i] = 0.; - } - } - } + + aubio_source_pad_multi_output(read_data, s->input_channels, total_wrote); + *read = total_wrote; } @@ -455,22 +568,42 @@ uint_t aubio_source_avcodec_get_channels(const aubio_source_avcodec_t * s) { } uint_t aubio_source_avcodec_seek (aubio_source_avcodec_t * s, uint_t pos) { - int64_t resampled_pos = (uint_t)ROUND(pos * (s->input_samplerate * 1. / s->samplerate)); + int64_t resampled_pos = + (uint_t)ROUND(pos * (s->input_samplerate * 1. 
/ s->samplerate)); int64_t min_ts = MAX(resampled_pos - 2000, 0); int64_t max_ts = MIN(resampled_pos + 2000, INT64_MAX); int seek_flags = AVSEEK_FLAG_FRAME | AVSEEK_FLAG_ANY; - int ret = avformat_seek_file(s->avFormatCtx, s->selected_stream, + int ret = AUBIO_FAIL; + if (s->avFormatCtx != NULL && s->avr != NULL) { + ret = AUBIO_OK; + } else { + AUBIO_ERR("source_avcodec: failed seeking in %s (file not opened?)", + s->path); + return ret; + } + if ((sint_t)pos < 0) { + AUBIO_ERR("source_avcodec: could not seek %s at %d (seeking position" + " should be >= 0)\n", s->path, pos); + return AUBIO_FAIL; + } + ret = avformat_seek_file(s->avFormatCtx, s->selected_stream, min_ts, resampled_pos, max_ts, seek_flags); if (ret < 0) { - AUBIO_ERR("Failed seeking to %d in file %s", pos, s->path); + AUBIO_ERR("source_avcodec: failed seeking to %d in file %s", + pos, s->path); } // reset read status s->eof = 0; s->read_index = 0; s->read_samples = 0; +#ifdef HAVE_AVRESAMPLE // reset the AVAudioResampleContext avresample_close(s->avr); avresample_open(s->avr); +#elif defined(HAVE_SWRESAMPLE) + swr_close(s->avr); + swr_init(s->avr); +#endif return ret; } @@ -484,23 +617,33 @@ uint_t aubio_source_avcodec_get_duration (aubio_source_avcodec_t * s) { uint_t aubio_source_avcodec_close(aubio_source_avcodec_t * s) { if (s->avr != NULL) { +#ifdef HAVE_AVRESAMPLE avresample_close( s->avr ); av_free ( s->avr ); +#elif defined(HAVE_SWRESAMPLE) + swr_close ( s->avr ); + swr_free ( &s->avr ); +#endif } s->avr = NULL; if (s->avCodecCtx != NULL) { +#ifndef HAVE_AUBIO_LIBAVCODEC_DEPRECATED + avcodec_free_context( &s->avCodecCtx ); +#else avcodec_close ( s->avCodecCtx ); +#endif } s->avCodecCtx = NULL; if (s->avFormatCtx != NULL) { - avformat_close_input ( &(s->avFormatCtx) ); + avformat_close_input(&s->avFormatCtx); + s->avFormatCtx = NULL; } - s->avFormatCtx = NULL; + av_packet_unref(&s->avPacket); return AUBIO_OK; } void del_aubio_source_avcodec(aubio_source_avcodec_t * s){ - if (!s) return; + 
AUBIO_ASSERT(s); aubio_source_avcodec_close(s); if (s->output != NULL) { av_free(s->output); @@ -509,8 +652,11 @@ void del_aubio_source_avcodec(aubio_source_avcodec_t * s){ if (s->avFrame != NULL) { av_frame_free( &(s->avFrame) ); } - if (s->path) AUBIO_FREE(s->path); s->avFrame = NULL; + if (s->path) { + AUBIO_FREE(s->path); + } + s->path = NULL; AUBIO_FREE(s); } diff --git a/src/io/source_sndfile.c b/src/io/source_sndfile.c index 5c8d5a5..3984465 100644 --- a/src/io/source_sndfile.c +++ b/src/io/source_sndfile.c @@ -18,23 +18,21 @@ */ - -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_SNDFILE #include <sndfile.h> -#include "aubio_priv.h" #include "fvec.h" #include "fmat.h" +#include "ioutils.h" #include "source_sndfile.h" #include "temporal/resampler.h" -#define MAX_CHANNELS 6 #define MAX_SIZE 4096 -#define MAX_SAMPLES MAX_CHANNELS * MAX_SIZE +#define MAX_SAMPLES AUBIO_MAX_CHANNELS * MAX_SIZE #if !HAVE_AUBIO_DOUBLE #define aubio_sf_read_smpl sf_read_float @@ -59,8 +57,9 @@ struct _aubio_source_sndfile_t { smpl_t ratio; uint_t input_hop_size; #ifdef HAVE_SAMPLERATE - aubio_resampler_t *resampler; + aubio_resampler_t **resamplers; fvec_t *input_data; + fmat_t *input_mat; #endif /* HAVE_SAMPLERATE */ // some temporary memory for sndfile to write at @@ -88,7 +87,6 @@ aubio_source_sndfile_t * new_aubio_source_sndfile(const char_t * path, uint_t sa s->hop_size = hop_size; s->channels = 1; - if (s->path) AUBIO_FREE(s->path); s->path = AUBIO_ARRAY(char_t, strnlen(path, PATH_MAX) + 1); strncpy(s->path, path, strnlen(path, PATH_MAX) + 1); @@ -98,7 +96,8 @@ aubio_source_sndfile_t * new_aubio_source_sndfile(const char_t * path, uint_t sa if (s->handle == NULL) { /* show libsndfile err msg */ - AUBIO_ERR("source_sndfile: Failed opening %s: %s\n", s->path, sf_strerror (NULL)); + AUBIO_ERR("source_sndfile: Failed opening %s (%s)\n", s->path, + sf_strerror (NULL)); goto beach; } @@ -125,14 +124,20 @@ aubio_source_sndfile_t * new_aubio_source_sndfile(const char_t * path, 
uint_t sa } #ifdef HAVE_SAMPLERATE - s->resampler = NULL; s->input_data = NULL; + s->input_mat = NULL; + s->resamplers = NULL; if (s->ratio != 1) { + uint_t i; + s->resamplers = AUBIO_ARRAY(aubio_resampler_t*, s->input_channels); s->input_data = new_fvec(s->input_hop_size); - s->resampler = new_aubio_resampler(s->ratio, 4); + s->input_mat = new_fmat(s->input_channels, s->input_hop_size); + for (i = 0; i < (uint_t)s->input_channels; i++) { + s->resamplers[i] = new_aubio_resampler(s->ratio, 4); + } if (s->ratio > 1) { // we would need to add a ring buffer for these - if ( (uint_t)(s->input_hop_size * s->ratio + .5) != s->hop_size ) { + if ( (uint_t)FLOOR(s->input_hop_size * s->ratio + .5) != s->hop_size ) { AUBIO_ERR("source_sndfile: can not upsample %s from %d to %d\n", s->path, s->input_samplerate, s->samplerate); goto beach; @@ -165,21 +170,34 @@ beach: void aubio_source_sndfile_do(aubio_source_sndfile_t * s, fvec_t * read_data, uint_t * read){ uint_t i,j, input_channels = s->input_channels; /* read from file into scratch_data */ - sf_count_t read_samples = aubio_sf_read_smpl (s->handle, s->scratch_data, s->scratch_size); + uint_t length = aubio_source_validate_input_length("source_sndfile", s->path, + s->hop_size, read_data->length); + sf_count_t read_samples = aubio_sf_read_smpl (s->handle, s->scratch_data, + s->scratch_size); + uint_t read_length = read_samples / s->input_channels; /* where to store de-interleaved data */ smpl_t *ptr_data; + + if (!s->handle) { + AUBIO_ERR("source_sndfile: could not read from %s (file was closed)\n", + s->path); + *read = 0; + return; + } + #ifdef HAVE_SAMPLERATE if (s->ratio != 1) { ptr_data = s->input_data->data; } else #endif /* HAVE_SAMPLERATE */ { + read_length = MIN(length, read_length); ptr_data = read_data->data; } /* de-interleaving and down-mixing data */ - for (j = 0; j < read_samples / input_channels; j++) { + for (j = 0; j < read_length; j++) { ptr_data[j] = 0; for (i = 0; i < input_channels; i++) { ptr_data[j] += 
s->scratch_data[input_channels*j+i]; @@ -188,83 +206,70 @@ void aubio_source_sndfile_do(aubio_source_sndfile_t * s, fvec_t * read_data, uin } #ifdef HAVE_SAMPLERATE - if (s->resampler) { - aubio_resampler_do(s->resampler, s->input_data, read_data); + if (s->resamplers) { + aubio_resampler_do(s->resamplers[0], s->input_data, read_data); } #endif /* HAVE_SAMPLERATE */ - *read = (int)FLOOR(s->ratio * read_samples / input_channels + .5); + *read = MIN(length, (uint_t)FLOOR(s->ratio * read_length + .5)); - if (*read < s->hop_size) { - for (j = *read; j < s->hop_size; j++) { - read_data->data[j] = 0; - } - } + aubio_source_pad_output (read_data, *read); } void aubio_source_sndfile_do_multi(aubio_source_sndfile_t * s, fmat_t * read_data, uint_t * read){ uint_t i,j, input_channels = s->input_channels; /* do actual reading */ - sf_count_t read_samples = aubio_sf_read_smpl (s->handle, s->scratch_data, s->scratch_size); + uint_t length = aubio_source_validate_input_length("source_sndfile", s->path, + s->hop_size, read_data->length); + uint_t channels = aubio_source_validate_input_channels("source_sndfile", + s->path, s->input_channels, read_data->height); + sf_count_t read_samples = aubio_sf_read_smpl (s->handle, s->scratch_data, + s->scratch_size); + uint_t read_length = read_samples / s->input_channels; /* where to store de-interleaved data */ smpl_t **ptr_data; + + if (!s->handle) { + AUBIO_ERR("source_sndfile: could not read from %s (file was closed)\n", + s->path); + *read = 0; + return; + } + #ifdef HAVE_SAMPLERATE if (s->ratio != 1) { - AUBIO_ERR("source_sndfile: no multi channel resampling yet\n"); - return; - //ptr_data = s->input_data->data; + ptr_data = s->input_mat->data; } else #endif /* HAVE_SAMPLERATE */ { + read_length = MIN(read_length, length); ptr_data = read_data->data; } - if (read_data->height < input_channels) { - // destination matrix has less channels than the file; copy only first - // channels of the file, de-interleaving data - for (j = 0; j < 
read_samples / input_channels; j++) { - for (i = 0; i < read_data->height; i++) { - ptr_data[i][j] = s->scratch_data[j * input_channels + i]; - } - } - } else { - // destination matrix has as many or more channels than the file; copy each - // channel from the file to the destination matrix, de-interleaving data - for (j = 0; j < read_samples / input_channels; j++) { - for (i = 0; i < input_channels; i++) { - ptr_data[i][j] = s->scratch_data[j * input_channels + i]; - } - } - } - - if (read_data->height > input_channels) { - // destination matrix has more channels than the file; copy last channel - // of the file to each additional channels, de-interleaving data - for (j = 0; j < read_samples / input_channels; j++) { - for (i = input_channels; i < read_data->height; i++) { - ptr_data[i][j] = s->scratch_data[j * input_channels + (input_channels - 1)]; - } + for (j = 0; j < read_length; j++) { + for (i = 0; i < channels; i++) { + ptr_data[i][j] = s->scratch_data[j * input_channels + i]; } } #ifdef HAVE_SAMPLERATE - if (s->resampler) { - //aubio_resampler_do(s->resampler, s->input_data, read_data); + if (s->resamplers) { + for (i = 0; i < input_channels; i++) { + fvec_t input_chan, read_chan; + input_chan.data = s->input_mat->data[i]; + input_chan.length = s->input_mat->length; + read_chan.data = read_data->data[i]; + read_chan.length = read_data->length; + aubio_resampler_do(s->resamplers[i], &input_chan, &read_chan); + } } #endif /* HAVE_SAMPLERATE */ - *read = (int)FLOOR(s->ratio * read_samples / input_channels + .5); - - if (*read < s->hop_size) { - for (i = 0; i < read_data->height; i++) { - for (j = *read; j < s->hop_size; j++) { - read_data->data[i][j] = 0.; - } - } - } + *read = MIN(length, (uint_t)FLOOR(s->ratio * read_length + .5)); + aubio_source_pad_multi_output(read_data, input_channels, *read); } uint_t aubio_source_sndfile_get_samplerate(aubio_source_sndfile_t * s) { @@ -284,7 +289,18 @@ uint_t aubio_source_sndfile_get_duration (const 
aubio_source_sndfile_t * s) { uint_t aubio_source_sndfile_seek (aubio_source_sndfile_t * s, uint_t pos) { uint_t resampled_pos = (uint_t)ROUND(pos / s->ratio); - sf_count_t sf_ret = sf_seek (s->handle, resampled_pos, SEEK_SET); + sf_count_t sf_ret; + if (s->handle == NULL) { + AUBIO_ERR("source_sndfile: failed seeking in %s (file not opened?)\n", + s->path); + return AUBIO_FAIL; + } + if ((sint_t)pos < 0) { + AUBIO_ERR("source_sndfile: could not seek %s at %d (seeking position" + " should be >= 0)\n", s->path, pos); + return AUBIO_FAIL; + } + sf_ret = sf_seek (s->handle, resampled_pos, SEEK_SET); if (sf_ret == -1) { AUBIO_ERR("source_sndfile: Failed seeking %s at %d: %s\n", s->path, pos, sf_strerror (NULL)); return AUBIO_FAIL; @@ -299,25 +315,35 @@ uint_t aubio_source_sndfile_seek (aubio_source_sndfile_t * s, uint_t pos) { uint_t aubio_source_sndfile_close (aubio_source_sndfile_t *s) { if (!s->handle) { - return AUBIO_FAIL; + return AUBIO_OK; } if(sf_close(s->handle)) { AUBIO_ERR("source_sndfile: Error closing file %s: %s\n", s->path, sf_strerror (NULL)); return AUBIO_FAIL; } + s->handle = NULL; return AUBIO_OK; } void del_aubio_source_sndfile(aubio_source_sndfile_t * s){ - if (!s) return; + AUBIO_ASSERT(s); aubio_source_sndfile_close(s); #ifdef HAVE_SAMPLERATE - if (s->resampler != NULL) { - del_aubio_resampler(s->resampler); + if (s->resamplers != NULL) { + uint_t i = 0, input_channels = s->input_channels; + for (i = 0; i < input_channels; i ++) { + if (s->resamplers[i] != NULL) { + del_aubio_resampler(s->resamplers[i]); + } + } + AUBIO_FREE(s->resamplers); } if (s->input_data) { del_fvec(s->input_data); } + if (s->input_mat) { + del_fmat(s->input_mat); + } #endif /* HAVE_SAMPLERATE */ if (s->path) AUBIO_FREE(s->path); AUBIO_FREE(s->scratch_data); diff --git a/src/io/source_wavread.c b/src/io/source_wavread.c index d1b1f3a..22c6719 100644 --- a/src/io/source_wavread.c +++ b/src/io/source_wavread.c @@ -18,20 +18,18 @@ */ -#include "config.h" +#include 
"aubio_priv.h" #ifdef HAVE_WAVREAD -#include "aubio_priv.h" #include "fvec.h" #include "fmat.h" +#include "ioutils.h" #include "source_wavread.h" -#include <errno.h> - #define AUBIO_WAVREAD_BUFSIZE 1024 -#define SHORT_TO_FLOAT(x) (smpl_t)(x * 3.0517578125e-05) +//#define SHORT_TO_FLOAT(x) (smpl_t)(x * 3.0517578125e-05) struct _aubio_source_wavread_t { uint_t hop_size; @@ -60,8 +58,12 @@ struct _aubio_source_wavread_t { fmat_t *output; }; -unsigned int read_little_endian (unsigned char *buf, unsigned int length); -unsigned int read_little_endian (unsigned char *buf, unsigned int length) { +static unsigned int read_little_endian (unsigned char *buf, + unsigned int length); + +static unsigned int read_little_endian (unsigned char *buf, + unsigned int length) +{ uint_t i, ret = 0; for (i = 0; i < length; i++) { ret += buf[i] << (i * 8); @@ -71,8 +73,8 @@ unsigned int read_little_endian (unsigned char *buf, unsigned int length) { aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t samplerate, uint_t hop_size) { aubio_source_wavread_t * s = AUBIO_NEW(aubio_source_wavread_t); - size_t bytes_read = 0, bytes_expected = 44; - unsigned char buf[5]; + size_t bytes_read = 0, bytes_junk = 0, bytes_expected = 44; + unsigned char buf[5] = "\0"; unsigned int format, channels, sr, byterate, blockalign, duration, bitspersample;//, data_size; if (path == NULL) { @@ -88,7 +90,6 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa goto beach; } - if (s->path) AUBIO_FREE(s->path); s->path = AUBIO_ARRAY(char_t, strnlen(path, PATH_MAX) + 1); strncpy(s->path, path, strnlen(path, PATH_MAX) + 1); @@ -97,7 +98,7 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa s->fid = fopen((const char *)path, "rb"); if (!s->fid) { - AUBIO_ERR("source_wavread: Failed opening %s (System error: %s)\n", s->path, strerror(errno)); + AUBIO_STRERR("source_wavread: Failed opening %s (%s)\n", s->path, errorstr); goto beach; } 
@@ -105,7 +106,7 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa bytes_read += fread(buf, 1, 4, s->fid); buf[4] = '\0'; if ( strcmp((const char *)buf, "RIFF") != 0 ) { - AUBIO_ERR("source_wavread: could not find RIFF header in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (could not find RIFF header)\n", s->path); goto beach; } @@ -116,15 +117,34 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa bytes_read += fread(buf, 1, 4, s->fid); buf[4] = '\0'; if ( strcmp((const char *)buf, "WAVE") != 0 ) { - AUBIO_ERR("source_wavread: wrong format in RIFF header in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (wrong format in RIFF header)\n", s->path); goto beach; } // Subchunk1ID bytes_read += fread(buf, 1, 4, s->fid); buf[4] = '\0'; + + // check if we have a JUNK Chunk + if ( strcmp((const char *)buf, "JUNK") == 0 ) { + bytes_junk = fread(buf, 1, 4, s->fid); + buf[4] = '\0'; + bytes_junk += read_little_endian(buf, 4); + if (fseek(s->fid, bytes_read + bytes_junk, SEEK_SET) != 0) { + AUBIO_STRERR("source_wavread: Failed opening %s (could not seek past JUNK Chunk: %s)\n", + s->path, errorstr); + goto beach; + } + bytes_read += bytes_junk; + bytes_expected += bytes_junk + 4; + // now really read the fmt chunk + bytes_read += fread(buf, 1, 4, s->fid); + buf[4] = '\0'; + } + + // get the fmt chunk if ( strcmp((const char *)buf, "fmt ") != 0 ) { - AUBIO_ERR("source_wavread: fmt RIFF header in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (could not find 'fmt ' in RIFF header)\n", s->path); goto beach; } @@ -133,18 +153,18 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa format = read_little_endian(buf, 4); if ( format != 16 ) { // TODO accept format 18 - AUBIO_ERR("source_wavread: file %s is not encoded with PCM\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (not encoded with PCM)\n", s->path); goto beach; } 
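The JUNK-chunk handling above skips a chunk by reading its 32-bit little-endian size and seeking past it. The same scan can be sketched over an in-memory buffer (`find_data_chunk` and `le32` are hypothetical helpers for illustration, not aubio API):

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical helper: decode a 32-bit little-endian value. */
static unsigned int le32(const unsigned char *b)
{
  return (unsigned int)b[0] | ((unsigned int)b[1] << 8)
       | ((unsigned int)b[2] << 16) | ((unsigned int)b[3] << 24);
}

/* Scan RIFF chunks (4-byte id + 4-byte little-endian size + payload)
 * after the 12-byte RIFF/WAVE preamble, skipping unknown chunks until
 * "data" is found, as the fread/fseek loop above does on a file.
 * Returns the offset of the data payload, or (size_t)-1 if absent. */
size_t find_data_chunk(const unsigned char *buf, size_t len)
{
  size_t pos = 12;                           /* skip "RIFF"<size>"WAVE" */
  while (pos + 8 <= len) {
    unsigned int chunk_size = le32(buf + pos + 4);
    if (memcmp(buf + pos, "data", 4) == 0)
      return pos + 8;                        /* payload follows id+size */
    pos += 8 + chunk_size;                   /* skip unknown chunk */
  }
  return (size_t)-1;
}
```

Bounds-checking `pos + 8 <= len` before each read is the in-memory analogue of the `feof`/`ferror` checks in the file-based loop.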
if ( buf[1] || buf[2] | buf[3] ) { - AUBIO_ERR("source_wavread: Subchunk1Size should be 0, in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (Subchunk1Size should be 0)\n", s->path); goto beach; } // AudioFormat bytes_read += fread(buf, 1, 2, s->fid); if ( buf[0] != 1 || buf[1] != 0) { - AUBIO_ERR("source_wavread: AudioFormat should be PCM, in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (AudioFormat should be PCM)\n", s->path); goto beach; } @@ -167,6 +187,26 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa // BitsPerSample bytes_read += fread(buf, 1, 2, s->fid); bitspersample = read_little_endian(buf, 2); + + if ( channels == 0 ) { + AUBIO_ERR("source_wavread: Failed opening %s (number of channels can not be 0)\n", s->path); + goto beach; + } + + if ( (sint_t)sr <= 0 ) { + AUBIO_ERR("source_wavread: Failed opening %s (samplerate can not be <= 0)\n", s->path); + goto beach; + } + + if ( byterate == 0 ) { + AUBIO_ERR("source_wavread: Failed opening %s (byterate can not be 0)\n", s->path); + goto beach; + } + + if ( bitspersample == 0 ) { + AUBIO_ERR("source_wavread: Failed opening %s (bitspersample can not be 0)\n", s->path); + goto beach; + } #if 0 if ( bitspersample != 16 ) { AUBIO_ERR("source_wavread: can not process %dbit file %s\n", @@ -176,12 +216,12 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa #endif if ( byterate * 8 != sr * channels * bitspersample ) { - AUBIO_ERR("source_wavread: wrong byterate in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (wrong byterate)\n", s->path); goto beach; } if ( blockalign * 8 != channels * bitspersample ) { - AUBIO_ERR("source_wavread: wrong blockalign in %s\n", s->path); + AUBIO_ERR("source_wavread: Failed opening %s (wrong blockalign)\n", s->path); goto beach; } @@ -210,9 +250,23 @@ aubio_source_wavread_t * new_aubio_source_wavread(const char_t * path, uint_t sa // Subchunk2ID bytes_read += 
fread(buf, 1, 4, s->fid); buf[4] = '\0'; - if ( strcmp((const char *)buf, "data") != 0 ) { - AUBIO_ERR("source_wavread: data RIFF header not found in %s\n", s->path); - goto beach; + while ( strcmp((const char *)buf, "data") != 0 ) { + if (feof(s->fid) || ferror(s->fid)) { + AUBIO_ERR("source_wavread: no data RIFF header found in %s\n", s->path); + goto beach; + } + bytes_junk = fread(buf, 1, 4, s->fid); + buf[4] = '\0'; + bytes_junk += read_little_endian(buf, 4); + if (fseek(s->fid, bytes_read + bytes_junk, SEEK_SET) != 0) { + AUBIO_STRERR("source_wavread: could not seek past unknown chunk in %s (%s)\n", + s->path, errorstr); + goto beach; + } + bytes_read += bytes_junk; + bytes_expected += bytes_junk+ 4; + bytes_read += fread(buf, 1, 4, s->fid); + buf[4] = '\0'; } // Subchunk2Size @@ -292,8 +346,15 @@ void aubio_source_wavread_do(aubio_source_wavread_t * s, fvec_t * read_data, uin uint_t i, j; uint_t end = 0; uint_t total_wrote = 0; - while (total_wrote < s->hop_size) { - end = MIN(s->read_samples - s->read_index, s->hop_size - total_wrote); + uint_t length = aubio_source_validate_input_length("source_wavread", s->path, + s->hop_size, read_data->length); + if (s->fid == NULL) { + AUBIO_ERR("source_wavread: could not read from %s (file not opened)\n", + s->path); + return; + } + while (total_wrote < length) { + end = MIN(s->read_samples - s->read_index, length - total_wrote); for (i = 0; i < end; i++) { read_data->data[i + total_wrote] = 0; for (j = 0; j < s->input_channels; j++ ) { @@ -302,7 +363,7 @@ void aubio_source_wavread_do(aubio_source_wavread_t * s, fvec_t * read_data, uin read_data->data[i + total_wrote] /= (smpl_t)(s->input_channels); } total_wrote += end; - if (total_wrote < s->hop_size) { + if (total_wrote < length) { uint_t wavread_read = 0; aubio_source_wavread_readframe(s, &wavread_read); s->read_samples = wavread_read; @@ -314,11 +375,9 @@ void aubio_source_wavread_do(aubio_source_wavread_t * s, fvec_t * read_data, uin s->read_index += end; } } - 
if (total_wrote < s->hop_size) { - for (i = end; i < s->hop_size; i++) { - read_data->data[i] = 0.; - } - } + + aubio_source_pad_output (read_data, total_wrote); + *read = total_wrote; } @@ -326,15 +385,24 @@ void aubio_source_wavread_do_multi(aubio_source_wavread_t * s, fmat_t * read_dat uint_t i,j; uint_t end = 0; uint_t total_wrote = 0; - while (total_wrote < s->hop_size) { - end = MIN(s->read_samples - s->read_index, s->hop_size - total_wrote); - for (j = 0; j < read_data->height; j++) { + uint_t length = aubio_source_validate_input_length("source_wavread", s->path, + s->hop_size, read_data->length); + uint_t channels = aubio_source_validate_input_channels("source_wavread", + s->path, s->input_channels, read_data->height); + if (s->fid == NULL) { + AUBIO_ERR("source_wavread: could not read from %s (file not opened)\n", + s->path); + return; + } + while (total_wrote < length) { + end = MIN(s->read_samples - s->read_index, length - total_wrote); + for (j = 0; j < channels; j++) { for (i = 0; i < end; i++) { read_data->data[j][i + total_wrote] = s->output->data[j][i]; } } total_wrote += end; - if (total_wrote < s->hop_size) { + if (total_wrote < length) { uint_t wavread_read = 0; aubio_source_wavread_readframe(s, &wavread_read); s->read_samples = wavread_read; @@ -346,13 +414,9 @@ void aubio_source_wavread_do_multi(aubio_source_wavread_t * s, fmat_t * read_dat s->read_index += end; } } - if (total_wrote < s->hop_size) { - for (j = 0; j < read_data->height; j++) { - for (i = end; i < s->hop_size; i++) { - read_data->data[j][i] = 0.; - } - } - } + + aubio_source_pad_multi_output(read_data, s->input_channels, total_wrote); + *read = total_wrote; } @@ -366,12 +430,17 @@ uint_t aubio_source_wavread_get_channels(aubio_source_wavread_t * s) { uint_t aubio_source_wavread_seek (aubio_source_wavread_t * s, uint_t pos) { uint_t ret = 0; + if (s->fid == NULL) { + AUBIO_ERR("source_wavread: could not seek %s (file not opened?)\n", s->path); + return AUBIO_FAIL; + } if
((sint_t)pos < 0) { + AUBIO_ERR("source_wavread: could not seek %s at %d (seeking position should be >= 0)\n", s->path, pos); return AUBIO_FAIL; } ret = fseek(s->fid, s->seek_start + pos * s->blockalign, SEEK_SET); if (ret != 0) { - AUBIO_ERR("source_wavread: could not seek %s at %d (%s)\n", s->path, pos, strerror(errno)); + AUBIO_STRERR("source_wavread: could not seek %s at %d (%s)\n", s->path, pos, errorstr); return AUBIO_FAIL; } // reset some values @@ -388,11 +457,11 @@ uint_t aubio_source_wavread_get_duration (const aubio_source_wavread_t * s) { } uint_t aubio_source_wavread_close (aubio_source_wavread_t * s) { - if (!s->fid) { - return AUBIO_FAIL; + if (s->fid == NULL) { + return AUBIO_OK; } if (fclose(s->fid)) { - AUBIO_ERR("source_wavread: could not close %s (%s)\n", s->path, strerror(errno)); + AUBIO_STRERR("source_wavread: could not close %s (%s)\n", s->path, errorstr); return AUBIO_FAIL; } s->fid = NULL; @@ -400,7 +469,7 @@ uint_t aubio_source_wavread_close (aubio_source_wavread_t * s) { } void del_aubio_source_wavread(aubio_source_wavread_t * s) { - if (!s) return; + AUBIO_ASSERT(s); aubio_source_wavread_close(s); if (s->short_output) AUBIO_FREE(s->short_output); if (s->output) del_fmat(s->output); diff --git a/src/io/utils_apple_audio.c b/src/io/utils_apple_audio.c index 2e651d5..a05d65d 100644 --- a/src/io/utils_apple_audio.c +++ b/src/io/utils_apple_audio.c @@ -1,4 +1,4 @@ -#include "config.h" +#include "aubio_priv.h" #if defined(HAVE_SOURCE_APPLE_AUDIO) || defined(HAVE_SINK_APPLE_AUDIO) @@ -6,18 +6,18 @@ #include <CoreFoundation/CoreFoundation.h> // ExtAudioFileRef, AudioStreamBasicDescription, AudioBufferList, ... 
#include <AudioToolbox/AudioToolbox.h> -#include "aubio_priv.h" int createAubioBufferList(AudioBufferList *bufferList, int channels, int segmentSize); void freeAudioBufferList(AudioBufferList *bufferList); CFURLRef getURLFromPath(const char * path); char_t *getPrintableOSStatusError(char_t *str, OSStatus error); -int createAubioBufferList(AudioBufferList * bufferList, int channels, int max_source_samples) { +int createAudioBufferList(AudioBufferList * bufferList, int channels, + int max_source_samples) { bufferList->mNumberBuffers = 1; bufferList->mBuffers[0].mNumberChannels = channels; - bufferList->mBuffers[0].mData = AUBIO_ARRAY(short, max_source_samples); - bufferList->mBuffers[0].mDataByteSize = max_source_samples * sizeof(short); + bufferList->mBuffers[0].mData = AUBIO_ARRAY(smpl_t, max_source_samples); + bufferList->mBuffers[0].mDataByteSize = max_source_samples * sizeof(smpl_t); return 0; } diff --git a/src/mathutils.c b/src/mathutils.c index 01b3519..35755fe 100644 --- a/src/mathutils.c +++ b/src/mathutils.c @@ -24,11 +24,11 @@ #include "fvec.h" #include "mathutils.h" #include "musicutils.h" -#include "config.h" /** Window types */ typedef enum { + aubio_win_ones, aubio_win_rectangle, aubio_win_hamming, aubio_win_hanning, @@ -64,7 +64,9 @@ uint_t fvec_set_window (fvec_t *win, char_t *window_type) { if (window_type == NULL) { AUBIO_ERR ("window type can not be null.\n"); return 1; - } else if (strcmp (window_type, "rectangle") == 0) + } else if (strcmp (window_type, "ones") == 0) + wintype = aubio_win_ones; + else if (strcmp (window_type, "rectangle") == 0) wintype = aubio_win_rectangle; else if (strcmp (window_type, "hamming") == 0) wintype = aubio_win_hamming; @@ -89,9 +91,11 @@ uint_t fvec_set_window (fvec_t *win, char_t *window_type) { return 1; } switch(wintype) { + case aubio_win_ones: + fvec_ones(win); + break; case aubio_win_rectangle: - for (i=0;i<size;i++) - w[i] = 0.5; + fvec_set_all(win, .5); break; case aubio_win_hamming: for (i=0;i<size;i++) 
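The new `"ones"` window type and the `fvec_set_all`-based rectangle case above both reduce to filling the window with a constant. A reduced sketch of that dispatch on a plain array (only the two constant windows are covered here; the real `fvec_set_window` also handles hamming, hanning, and others):

```c
#include <string.h>

typedef float smpl_t;

/* Reduced sketch of fvec_set_window's dispatch for the two
 * constant-valued windows shown above; returns 0 on success,
 * 1 on an unknown type, mirroring the original's convention. */
int set_constant_window(smpl_t *w, unsigned int n, const char *type)
{
  smpl_t v;
  unsigned int i;
  if (strcmp(type, "ones") == 0)
    v = 1.f;                      /* new "ones" window */
  else if (strcmp(type, "rectangle") == 0)
    v = .5f;                      /* aubio's rectangle uses 0.5 */
  else
    return 1;
  for (i = 0; i < n; i++)         /* like fvec_ones / fvec_set_all */
    w[i] = v;
  return 0;
}
```

Replacing the open-coded loop with a fill helper is exactly the cleanup the diff applies to the rectangle case.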
@@ -155,45 +159,53 @@ smpl_t fvec_mean (fvec_t * s) { smpl_t tmp = 0.0; -#ifndef HAVE_ACCELERATE +#if defined(HAVE_INTEL_IPP) + aubio_ippsMean(s->data, (int)s->length, &tmp); + return tmp; +#elif defined(HAVE_ACCELERATE) + aubio_vDSP_meanv(s->data, 1, &tmp, s->length); + return tmp; +#else uint_t j; for (j = 0; j < s->length; j++) { tmp += s->data[j]; } - return tmp / (smpl_t) (s->length); -#else - aubio_vDSP_meanv(s->data, 1, &tmp, s->length); - return tmp; -#endif /* HAVE_ACCELERATE */ + return tmp / (smpl_t)(s->length); +#endif } smpl_t fvec_sum (fvec_t * s) { smpl_t tmp = 0.0; -#ifndef HAVE_ACCELERATE +#if defined(HAVE_INTEL_IPP) + aubio_ippsSum(s->data, (int)s->length, &tmp); +#elif defined(HAVE_ACCELERATE) + aubio_vDSP_sve(s->data, 1, &tmp, s->length); +#else uint_t j; for (j = 0; j < s->length; j++) { tmp += s->data[j]; } -#else - aubio_vDSP_sve(s->data, 1, &tmp, s->length); -#endif /* HAVE_ACCELERATE */ +#endif return tmp; } smpl_t fvec_max (fvec_t * s) { -#ifndef HAVE_ACCELERATE +#if defined(HAVE_INTEL_IPP) + smpl_t tmp = 0.; + aubio_ippsMax( s->data, (int)s->length, &tmp); +#elif defined(HAVE_ACCELERATE) + smpl_t tmp = 0.; + aubio_vDSP_maxv( s->data, 1, &tmp, s->length ); +#else uint_t j; - smpl_t tmp = 0.0; - for (j = 0; j < s->length; j++) { + smpl_t tmp = s->data[0]; + for (j = 1; j < s->length; j++) { tmp = (tmp > s->data[j]) ? tmp : s->data[j]; } -#else - smpl_t tmp = 0.; - aubio_vDSP_maxv(s->data, 1, &tmp, s->length); #endif return tmp; } @@ -201,15 +213,18 @@ fvec_max (fvec_t * s) smpl_t fvec_min (fvec_t * s) { -#ifndef HAVE_ACCELERATE +#if defined(HAVE_INTEL_IPP) + smpl_t tmp = 0.; + aubio_ippsMin(s->data, (int)s->length, &tmp); +#elif defined(HAVE_ACCELERATE) + smpl_t tmp = 0.; + aubio_vDSP_minv(s->data, 1, &tmp, s->length); +#else uint_t j; smpl_t tmp = s->data[0]; - for (j = 0; j < s->length; j++) { + for (j = 1; j < s->length; j++) { tmp = (tmp < s->data[j]) ? 
tmp : s->data[j]; } -#else - smpl_t tmp = 0.; - aubio_vDSP_minv(s->data, 1, &tmp, s->length); #endif return tmp; } @@ -226,10 +241,10 @@ fvec_min_elem (fvec_t * s) } #else smpl_t tmp = 0.; - uint_t pos = 0.; - aubio_vDSP_minvi(s->data, 1, &tmp, (vDSP_Length *)&pos, s->length); + vDSP_Length pos = 0; + aubio_vDSP_minvi(s->data, 1, &tmp, &pos, s->length); #endif - return pos; + return (uint_t)pos; } uint_t @@ -244,10 +259,10 @@ fvec_max_elem (fvec_t * s) } #else smpl_t tmp = 0.; - uint_t pos = 0.; - aubio_vDSP_maxvi(s->data, 1, &tmp, (vDSP_Length *)&pos, s->length); + vDSP_Length pos = 0; + aubio_vDSP_maxvi(s->data, 1, &tmp, &pos, s->length); #endif - return pos; + return (uint_t)pos; } void @@ -256,7 +271,7 @@ fvec_shift (fvec_t * s) uint_t half = s->length / 2, start = half, j; // if length is odd, middle element is moved to the end if (2 * half < s->length) start ++; -#ifndef HAVE_ATLAS +#ifndef HAVE_BLAS for (j = 0; j < half; j++) { ELEM_SWAP (s->data[j], s->data[j + start]); } @@ -276,7 +291,7 @@ fvec_ishift (fvec_t * s) uint_t half = s->length / 2, start = half, j; // if length is odd, middle element is moved to the beginning if (2 * half < s->length) start ++; -#ifndef HAVE_ATLAS +#ifndef HAVE_BLAS for (j = 0; j < half; j++) { ELEM_SWAP (s->data[j], s->data[j + start]); } @@ -290,11 +305,30 @@ fvec_ishift (fvec_t * s) } } +void fvec_push(fvec_t *in, smpl_t new_elem) { + uint_t i; + for (i = 0; i < in->length - 1; i++) { + in->data[i] = in->data[i + 1]; + } + in->data[in->length - 1] = new_elem; +} + +void fvec_clamp(fvec_t *in, smpl_t absmax) { + uint_t i; + for (i = 0; i < in->length; i++) { + if (in->data[i] > 0 && in->data[i] > ABS(absmax)) { + in->data[i] = absmax; + } else if (in->data[i] < 0 && in->data[i] < -ABS(absmax)) { + in->data[i] = -absmax; + } + } +} + smpl_t aubio_level_lin (const fvec_t * f) { smpl_t energy = 0.; -#ifndef HAVE_ATLAS +#ifndef HAVE_BLAS uint_t j; for (j = 0; j < f->length; j++) { energy += SQR (f->data[j]); @@ -353,6 +387,15 @@ 
fvec_add (fvec_t * o, smpl_t val) } } +void +fvec_mul (fvec_t *o, smpl_t val) +{ + uint_t j; + for (j = 0; j < o->length; j++) { + o->data[j] *= val; + } +} + void fvec_adapt_thres(fvec_t * vec, fvec_t * tmp, uint_t post, uint_t pre) { uint_t length = vec->length, j; @@ -488,7 +531,7 @@ aubio_freqtomidi (smpl_t freq) if (freq < 2. || freq > 100000.) return 0.; // avoid nans and infs /* log(freq/A-2)/log(2) */ midi = freq / 6.875; - midi = LOG (midi) / 0.69314718055995; + midi = LOG (midi) / 0.6931471805599453; midi *= 12; midi -= 3; return midi; @@ -500,7 +543,7 @@ aubio_miditofreq (smpl_t midi) smpl_t freq; if (midi > 140.) return 0.; // avoid infs freq = (midi + 3.) / 12.; - freq = EXP (freq * 0.69314718055995); + freq = EXP (freq * 0.6931471805599453); freq *= 6.875; return freq; } @@ -551,6 +594,17 @@ aubio_next_power_of_two (uint_t a) return i; } +uint_t +aubio_power_of_two_order (uint_t a) +{ + int order = 0; + int temp = aubio_next_power_of_two(a); + while (temp >>= 1) { + ++order; + } + return order; +} + smpl_t aubio_db_spl (const fvec_t * o) { diff --git a/src/mathutils.h b/src/mathutils.h index 6638f69..4336d7e 100644 --- a/src/mathutils.h +++ b/src/mathutils.h @@ -117,6 +117,17 @@ of the resulting spectrum. See Amalia de Götzen's paper referred to above. 
*/ void fvec_ishift (fvec_t * v); +/** push a new element to the end of a vector, erasing the first element and + * sliding all others + + \param in vector to push to + \param new_elem new element to add at the end of the vector + + In numpy words, this is equivalent to: in = np.concatenate([in, [new_elem]])[1:] + +*/ +void fvec_push(fvec_t *in, smpl_t new_elem); + /** compute the sum of all elements of a vector \param v vector to compute the sum of @@ -182,6 +193,14 @@ void fvec_alpha_normalise (fvec_t * v, smpl_t p); */ void fvec_add (fvec_t * v, smpl_t c); +/** multiply each element of a vector by a scalar + + \param v vector to scale + \param s constant to scale v with + +*/ +void fvec_mul (fvec_t * v, smpl_t s); + /** remove the minimum value of the vector from each element \param v vector to remove minimum from @@ -301,6 +320,9 @@ uint_t aubio_is_power_of_two(uint_t a); /** return the next power of 2 greater than a */ uint_t aubio_next_power_of_two(uint_t a); +/** return the log2 factor of the given power of 2 value a */ +uint_t aubio_power_of_two_order(uint_t a); + /** compute normalised autocorrelation function \param input vector to compute autocorrelation from diff --git a/src/musicutils.c b/src/musicutils.c new file mode 100644 index 0000000..14ef849 --- /dev/null +++ b/src/musicutils.c @@ -0,0 +1,85 @@ +/* + Copyright (C) 2018 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details.
+ + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +#include "aubio_priv.h" +#include "musicutils.h" + +smpl_t +aubio_hztomel (smpl_t freq) +{ + const smpl_t lin_space = 3./200.; + const smpl_t split_hz = 1000.; + const smpl_t split_mel = split_hz * lin_space; + const smpl_t log_space = 27./LOG(6400/1000.); + if (freq < 0) { + AUBIO_WRN("hztomel: input frequency should be >= 0\n"); + return 0; + } + if (freq < split_hz) + { + return freq * lin_space; + } else { + return split_mel + log_space * LOG (freq / split_hz); + } + +} + +smpl_t +aubio_meltohz (smpl_t mel) +{ + const smpl_t lin_space = 200./3.; + const smpl_t split_hz = 1000.; + const smpl_t split_mel = split_hz / lin_space; + const smpl_t logSpacing = POW(6400/1000., 1/27.); + if (mel < 0) { + AUBIO_WRN("meltohz: input mel should be >= 0\n"); + return 0; + } + if (mel < split_mel) { + return lin_space * mel; + } else { + return split_hz * POW(logSpacing, mel - split_mel); + } +} + +smpl_t +aubio_hztomel_htk (smpl_t freq) +{ + const smpl_t split_hz = 700.; + const smpl_t log_space = 1127.; + if (freq < 0) { + AUBIO_WRN("hztomel_htk: input frequency should be >= 0\n"); + return 0; + } + return log_space * LOG (1 + freq / split_hz); +} + +smpl_t +aubio_meltohz_htk (smpl_t mel) +{ + const smpl_t split_hz = 700.; + const smpl_t log_space = 1./1127.; + if (mel < 0) { + AUBIO_WRN("meltohz_htk: input mel should be >= 0\n"); + return 0; + } + return split_hz * ( EXP ( mel * log_space) - 1.); +} + diff --git a/src/musicutils.h b/src/musicutils.h index f71d20b..a659495 100644 --- a/src/musicutils.h +++ b/src/musicutils.h @@ -86,6 +86,105 @@ smpl_t aubio_bintofreq (smpl_t bin, smpl_t samplerate, smpl_t fftsize); /** convert frequency (Hz) to frequency bin */ smpl_t aubio_freqtobin (smpl_t freq, smpl_t samplerate, smpl_t fftsize); +/** convert frequency (Hz) to mel + + \param freq input frequency, in Hz + + \return output mel
+ + Converts a scalar from the frequency domain to the mel scale using Slaney + Auditory Toolbox's implementation: + + If \f$ f < 1000 \f$, \f$ m = 3 f / 200 \f$. + + If \f$ f \geq 1000 \f$, \f$ m = 15 + 27 \frac{\ln(f) - \ln(1000)} + {\ln(6400) - \ln(1000)} + \f$ + + See also + -------- + + aubio_meltohz(), aubio_hztomel_htk(). + +*/ +smpl_t aubio_hztomel (smpl_t freq); + +/** convert mel to frequency (Hz) + + \param mel input mel + + \return output frequency, in Hz + + Converts a scalar from the mel scale to the frequency domain using Slaney + Auditory Toolbox's implementation: + + If \f$ m < 15 \f$, \f$ f = 200 m / 3 \f$. + + If \f$ m \geq 15 \f$, \f$ f = 1000 \left(\frac{6400}{1000}\right) + ^{\frac{m - 15}{27}} \f$ + + See also + -------- + + aubio_hztomel(), aubio_meltohz_htk(). + + References + ---------- + + Malcolm Slaney, *Auditory Toolbox Version 2, Technical Report #1998-010* + https://engineering.purdue.edu/~malcolm/interval/1998-010/ + +*/ +smpl_t aubio_meltohz (smpl_t mel); + +/** convert frequency (Hz) to mel + + \param freq input frequency, in Hz + + \return output mel + + Converts a scalar from the frequency domain to the mel scale, using the + equation defined by O'Shaughnessy, as implemented in the HTK speech + recognition toolkit: + + \f$ m = 1127 \ln(1 + \frac{f}{700}) \f$ + + See also + -------- + + aubio_meltohz_htk(), aubio_hztomel(). + + References + ---------- + + Douglas O'Shaughnessy (1987). *Speech communication: human and machine*. + Addison-Wesley. p. 150. ISBN 978-0-201-16520-3.
+ + HTK Speech Recognition Toolkit: http://htk.eng.cam.ac.uk/ + + */ +smpl_t aubio_hztomel_htk (smpl_t freq); + +/** convert mel to frequency (Hz) + + \param mel input mel + + \return output frequency, in Hz + + Converts a scalar from the mel scale to the frequency domain, using the + equation defined by O'Shaughnessy, as implemented in the HTK speech + recognition toolkit: + + \f$ f = 700 \left( e^{\frac{m}{1127}} - 1 \right) \f$ + + See also + -------- + + aubio_hztomel_htk(), aubio_meltohz(). + +*/ +smpl_t aubio_meltohz_htk (smpl_t mel); + /** convert frequency (Hz) to midi value (0-128) */ smpl_t aubio_freqtomidi (smpl_t freq); @@ -156,6 +255,14 @@ uint_t aubio_silence_detection (const fvec_t * v, smpl_t threshold); */ smpl_t aubio_level_detection (const fvec_t * v, smpl_t threshold); +/** clamp the values of a vector within the range [-abs(max), abs(max)] + + \param in vector to clamp + \param absmax maximum value over which input vector elements should be clamped + +*/ +void fvec_clamp(fvec_t *in, smpl_t absmax); + #ifdef __cplusplus } #endif diff --git a/src/notes/notes.c b/src/notes/notes.c index 22a7cd8..fab3f83 100644 --- a/src/notes/notes.c +++ b/src/notes/notes.c @@ -1,5 +1,5 @@ /* - Copyright (C) 2014 Paul Brossier <piem@aubio.org> + Copyright (C) 2014-2018 Paul Brossier <piem@aubio.org> This file is part of aubio. @@ -24,6 +24,13 @@ #include "onset/onset.h" #include "notes/notes.h" +#define AUBIO_DEFAULT_NOTES_SILENCE -70. +#define AUBIO_DEFAULT_NOTES_RELEASE_DROP 10. +// increase to 10. for .1 cent precision +// or to 100. for .01 cent precision +#define AUBIO_DEFAULT_CENT_PRECISION 1. +#define AUBIO_DEFAULT_NOTES_MINIOI_MS 30.
+ struct _aubio_notes_t { uint_t onset_buf_size; @@ -50,6 +57,9 @@ struct _aubio_notes_t { smpl_t silence_threshold; uint_t isready; + + smpl_t last_onset_level; + smpl_t release_drop_level; }; aubio_notes_t * new_aubio_notes (const char_t * method, @@ -73,25 +83,34 @@ aubio_notes_t * new_aubio_notes (const char_t * method, o->isready = 0; o->onset = new_aubio_onset (onset_method, o->onset_buf_size, o->hop_size, o->samplerate); + if (o->onset == NULL) goto fail; if (o->onset_threshold != 0.) aubio_onset_set_threshold (o->onset, o->onset_threshold); o->onset_output = new_fvec (1); o->pitch = new_aubio_pitch (pitch_method, o->pitch_buf_size, o->hop_size, o->samplerate); + if (o->pitch == NULL) goto fail; if (o->pitch_tolerance != 0.) aubio_pitch_set_tolerance (o->pitch, o->pitch_tolerance); + aubio_pitch_set_unit (o->pitch, "midi"); o->pitch_output = new_fvec (1); if (strcmp(method, "default") != 0) { - AUBIO_ERR("unknown notes detection method %s, using default.\n", - method); + AUBIO_ERR("notes: unknown notes detection method \"%s\"\n", method); goto fail; } o->note_buffer = new_fvec(o->median); o->note_buffer2 = new_fvec(o->median); + if (!o->onset_output || !o->pitch_output || + !o->note_buffer || !o->note_buffer2) goto fail; + o->curnote = -1.; o->newnote = 0.; - o->silence_threshold = -90.; + aubio_notes_set_silence(o, AUBIO_DEFAULT_NOTES_SILENCE); + aubio_notes_set_minioi_ms (o, AUBIO_DEFAULT_NOTES_MINIOI_MS); + + o->last_onset_level = AUBIO_DEFAULT_NOTES_SILENCE; + o->release_drop_level = AUBIO_DEFAULT_NOTES_RELEASE_DROP; return o; @@ -100,6 +119,55 @@ fail: return NULL; } +uint_t aubio_notes_set_silence(aubio_notes_t *o, smpl_t silence) +{ + uint_t err = AUBIO_OK; + if (aubio_pitch_set_silence(o->pitch, silence) != AUBIO_OK) { + err = AUBIO_FAIL; + } + if (aubio_onset_set_silence(o->onset, silence) != AUBIO_OK) { + err = AUBIO_FAIL; + } + o->silence_threshold = silence; + return err; +} + +smpl_t aubio_notes_get_silence(const aubio_notes_t *o) +{ + return 
aubio_pitch_get_silence(o->pitch); +} + +uint_t aubio_notes_set_minioi_ms (aubio_notes_t *o, smpl_t minioi_ms) +{ + uint_t err = AUBIO_OK; + if (!o->onset || (aubio_onset_set_minioi_ms(o->onset, minioi_ms) != 0)) { + err = AUBIO_FAIL; + } + return err; +} + +smpl_t aubio_notes_get_minioi_ms(const aubio_notes_t *o) +{ + return aubio_onset_get_minioi_ms(o->onset); +} + +uint_t aubio_notes_set_release_drop(aubio_notes_t *o, smpl_t release_drop_level) +{ + uint_t err = AUBIO_OK; + if (release_drop_level <= 0.) { + AUBIO_ERR("notes: release_drop should be > 0, got %f\n", release_drop_level); + err = AUBIO_FAIL; + } else { + o->release_drop_level = release_drop_level; + } + return err; +} + +smpl_t aubio_notes_get_release_drop(const aubio_notes_t *o) +{ + return o->release_drop_level; +} + /** append new note candidate to the note_buffer and return filtered value. we * need to copy the input array as fvec_median destroys its input data.*/ static void @@ -109,18 +177,16 @@ note_append (fvec_t * note_buffer, smpl_t curnote) { uint_t i; for (i = 0; i < note_buffer->length - 1; i++) { note_buffer->data[i] = note_buffer->data[i + 1]; } - note_buffer->data[note_buffer->length - 1] = curnote; + //note_buffer->data[note_buffer->length - 1] = ROUND(10.*curnote)/10.; + note_buffer->data[note_buffer->length - 1] = ROUND(AUBIO_DEFAULT_CENT_PRECISION*curnote); return; } -static uint_t +static smpl_t aubio_notes_get_latest_note (aubio_notes_t *o) { - uint_t i; - for (i = 0; i < o->note_buffer->length; i++) { - o->note_buffer2->data[i] = o->note_buffer->data[i]; - } - return fvec_median (o->note_buffer2); + fvec_copy(o->note_buffer, o->note_buffer2); + return fvec_median (o->note_buffer2) / AUBIO_DEFAULT_CENT_PRECISION; } @@ -146,6 +212,7 @@ void aubio_notes_do (aubio_notes_t *o, const fvec_t * input, fvec_t * notes) //send_noteon(o->curnote,0); //notes->data[0] = o->curnote; //notes->data[1] = 0.; + //AUBIO_WRN("notes: sending note-off at onset, not enough level\n"); notes->data[2] = o->curnote;
} else { if (o->median) { @@ -153,6 +220,7 @@ void aubio_notes_do (aubio_notes_t *o, const fvec_t * input, fvec_t * notes) } else { /* kill old note */ //send_noteon(o->curnote,0, o->samplerate); + //AUBIO_WRN("notes: sending note-off at onset, new onset detected\n"); notes->data[2] = o->curnote; /* get and send new one */ //send_noteon(new_pitch,127+(int)floor(curlevel), o->samplerate); @@ -160,17 +228,33 @@ void aubio_notes_do (aubio_notes_t *o, const fvec_t * input, fvec_t * notes) notes->data[1] = 127 + (int)floor(curlevel); o->curnote = new_pitch; } + o->last_onset_level = curlevel; } } else { - if (o->median) { + if (curlevel < o->last_onset_level - o->release_drop_level) + { + // send note off + //AUBIO_WRN("notes: sending note-off, release detected\n"); + notes->data[0] = 0; + notes->data[1] = 0; + notes->data[2] = o->curnote; + // reset last_onset_level to silence_threshold + o->last_onset_level = o->silence_threshold; + o->curnote = 0; + } + else if (o->median) + { if (o->isready > 0) o->isready++; if (o->isready == o->median) { /* kill old note */ //send_noteon(curnote,0); - notes->data[2] = o->curnote; - notes->data[3] = 0; + if (o->curnote != 0) + { + //AUBIO_WRN("notes: sending note-off, new note detected\n"); + notes->data[2] = o->curnote; + } o->newnote = aubio_notes_get_latest_note(o); o->curnote = o->newnote; /* get and send new one */ diff --git a/src/notes/notes.h b/src/notes/notes.h index fc12bad..f256be3 100644 --- a/src/notes/notes.h +++ b/src/notes/notes.h @@ -18,6 +18,12 @@ */ +/** \file + + Note detection object + +*/ + #ifndef _AUBIO_NOTES_H #define _AUBIO_NOTES_H @@ -51,12 +57,85 @@ void del_aubio_notes(aubio_notes_t * o); /** execute note detection on an input signal frame \param o note detection object as returned by new_aubio_notes() - \param in input signal of size [hop_size] - \param out output notes of size [3] ? 
FIXME + \param input input signal of size [hop_size] + \param output output notes, fvec of length 3 + + The notes output is a vector of length 3 containing: + - 0. the midi note value, or 0 if no note was found + - 1. the note velocity + - 2. the midi note to turn off */ void aubio_notes_do (aubio_notes_t *o, const fvec_t * input, fvec_t * output); +/** set notes detection silence threshold + + \param o notes detection object as returned by new_aubio_notes() + \param silence new silence detection threshold + + \return 0 on success, non-zero otherwise + +*/ +uint_t aubio_notes_set_silence(aubio_notes_t * o, smpl_t silence); + +/** get notes detection silence threshold + + \param o notes detection object as returned by new_aubio_notes() + + \return current silence threshold + +*/ +smpl_t aubio_notes_get_silence(const aubio_notes_t * o); + +/** get notes detection minimum inter-onset interval, in millisecond + + \param o notes detection object as returned by new_aubio_notes() + + \return current minimum inter onset interval + + */ +smpl_t aubio_notes_get_minioi_ms(const aubio_notes_t *o); + +/** set notes detection minimum inter-onset interval, in millisecond + + \param o notes detection object as returned by new_aubio_notes() + \param minioi_ms new inter-onset interval + + \return 0 on success, non-zero otherwise + +*/ +uint_t aubio_notes_set_minioi_ms (aubio_notes_t *o, smpl_t minioi_ms); + +/** get notes object release drop level, in dB + + \param o notes detection object as returned by new_aubio_notes() + + \return current release drop level, in dB + + */ +smpl_t aubio_notes_get_release_drop (const aubio_notes_t *o); + +/** set note release drop level, in dB + + This function sets the release_drop_level parameter, in dB. When a new note + is found, the current level in dB is measured. If the measured level drops + under that initial level - release_drop_level, then a note-off will be + emitted. + + Defaults to `10`, in dB. 
+ + \note This parameter was added in version `0.4.8`. Results obtained with + earlier versions can be reproduced by setting this value to `100`, so that + note-off will not be played until the next note. + + \param o notes detection object as returned by new_aubio_notes() + \param release_drop new release drop level, in dB + + \return 0 on success, non-zero otherwise + +*/ +uint_t aubio_notes_set_release_drop (aubio_notes_t *o, smpl_t release_drop); + #ifdef __cplusplus } #endif diff --git a/src/onset/onset.c b/src/onset/onset.c index 9d241df..6123d51 100644 --- a/src/onset/onset.c +++ b/src/onset/onset.c @@ -23,10 +23,13 @@ #include "cvec.h" #include "spectral/specdesc.h" #include "spectral/phasevoc.h" +#include "spectral/awhitening.h" #include "onset/peakpicker.h" #include "mathutils.h" #include "onset/onset.h" +uint_t aubio_onset_set_default_parameters (aubio_onset_t *o, const char_t * onset_mode); + /** structure to store object state */ struct _aubio_onset_t { aubio_pvoc_t * pv; /**< phase vocoder */ @@ -42,6 +45,11 @@ struct _aubio_onset_t { uint_t total_frames; /**< total number of frames processed since the beginning */ uint_t last_onset; /**< last detected onset location, in frames */ + + uint_t apply_compression; + smpl_t lambda_compression; + uint_t apply_awhitening; /**< apply adaptive spectral whitening */ + aubio_spectral_whitening_t *spectral_whitening; }; /* execute onset detection function on input buffer */ void aubio_onset_do (aubio_onset_t *o, const fvec_t * input, fvec_t * onset) { smpl_t isonset = 0; aubio_pvoc_do (o->pv,input, o->fftgrain); + /* + if (apply_filtering) { + } + */ + if (o->apply_awhitening) { + aubio_spectral_whitening_do(o->spectral_whitening, o->fftgrain); + } + if (o->apply_compression) { + cvec_logmag(o->fftgrain, o->lambda_compression); + } aubio_specdesc_do (o->od, o->fftgrain, o->desc); aubio_peakpicker_do(o->pp, o->desc, onset); isonset = onset->data[0]; @@ -57,10 +75,17 @@ void aubio_onset_do (aubio_onset_t
*o, const fvec_t * input, fvec_t * onset) //AUBIO_DBG ("silent onset, not marking as onset\n"); isonset = 0; } else { + // we have an onset uint_t new_onset = o->total_frames + (uint_t)ROUND(isonset * o->hop_size); + // check if last onset time was more than minioi ago if (o->last_onset + o->minioi < new_onset) { - //AUBIO_DBG ("accepted detection, marking as onset\n"); - o->last_onset = new_onset; + // start of file: make sure (new_onset - delay) >= 0 + if (o->last_onset > 0 && o->delay > new_onset) { + isonset = 0; + } else { + //AUBIO_DBG ("accepted detection, marking as onset\n"); + o->last_onset = MAX(o->delay, new_onset); + } } else { //AUBIO_DBG ("doubled onset, not marking as onset\n"); isonset = 0; @@ -99,6 +124,32 @@ smpl_t aubio_onset_get_last_ms (const aubio_onset_t *o) return aubio_onset_get_last_s (o) * 1000.; } +uint_t aubio_onset_set_awhitening (aubio_onset_t *o, uint_t enable) +{ + o->apply_awhitening = enable == 1 ? 1 : 0; + return AUBIO_OK; +} + +smpl_t aubio_onset_get_awhitening (aubio_onset_t *o) +{ + return o->apply_awhitening; +} + +uint_t aubio_onset_set_compression (aubio_onset_t *o, smpl_t lambda) +{ + if (lambda < 0.) { + return AUBIO_FAIL; + } + o->lambda_compression = lambda; + o->apply_compression = (o->lambda_compression > 0.) ? 1 : 0; + return AUBIO_OK; +} + +smpl_t aubio_onset_get_compression (aubio_onset_t *o) +{ + return o->apply_compression ? 
o->lambda_compression : 0; +} + uint_t aubio_onset_set_silence(aubio_onset_t * o, smpl_t silence) { o->silence = silence; return AUBIO_OK; @@ -190,7 +241,7 @@ aubio_onset_t * new_aubio_onset (const char_t * onset_mode, AUBIO_ERR("onset: got buffer_size %d, but can not be < 2\n", buf_size); goto beach; } else if (buf_size < hop_size) { - AUBIO_ERR("onset: hop size (%d) is larger than win size (%d)\n", buf_size, hop_size); + AUBIO_ERR("onset: hop size (%d) is larger than win size (%d)\n", hop_size, buf_size); goto beach; } else if ((sint_t)samplerate < 1) { AUBIO_ERR("onset: samplerate (%d) can not be < 1\n", samplerate); @@ -207,29 +258,98 @@ aubio_onset_t * new_aubio_onset (const char_t * onset_mode, o->od = new_aubio_specdesc(onset_mode,buf_size); o->fftgrain = new_cvec(buf_size); o->desc = new_fvec(1); + o->spectral_whitening = new_aubio_spectral_whitening(buf_size, hop_size, samplerate); - /* set some default parameter */ - aubio_onset_set_threshold (o, 0.3); - aubio_onset_set_delay(o, 4.3 * hop_size); - aubio_onset_set_minioi_ms(o, 20.); - aubio_onset_set_silence(o, -70.); + if (!o->pv || !o->pp || !o->od || !o->fftgrain + || !o->desc || !o->spectral_whitening) + goto beach; /* initialize internal variables */ - o->last_onset = 0; - o->total_frames = 0; + aubio_onset_set_default_parameters (o, onset_mode); + + aubio_onset_reset(o); return o; beach: - AUBIO_FREE(o); + del_aubio_onset(o); return NULL; } +void aubio_onset_reset (aubio_onset_t *o) { + o->last_onset = 0; + o->total_frames = 0; +} + +uint_t aubio_onset_set_default_parameters (aubio_onset_t * o, const char_t * onset_mode) +{ + uint_t ret = AUBIO_OK; + /* set some default parameter */ + aubio_onset_set_threshold (o, 0.3); + aubio_onset_set_delay (o, 4.3 * o->hop_size); + aubio_onset_set_minioi_ms (o, 50.); + aubio_onset_set_silence (o, -70.); + // disable spectral whitening + aubio_onset_set_awhitening (o, 0); + // disable logarithmic magnitude + aubio_onset_set_compression (o, 0.); + + /* method 
specific optimisations */ + if (strcmp (onset_mode, "energy") == 0) { + } else if (strcmp (onset_mode, "hfc") == 0 || strcmp (onset_mode, "default") == 0) { + aubio_onset_set_threshold (o, 0.058); + aubio_onset_set_compression (o, 1.); + } else if (strcmp (onset_mode, "complexdomain") == 0 + || strcmp (onset_mode, "complex") == 0) { + aubio_onset_set_delay (o, 4.6 * o->hop_size); + aubio_onset_set_threshold (o, 0.15); + aubio_onset_set_awhitening(o, 1); + aubio_onset_set_compression (o, 1.); + } else if (strcmp (onset_mode, "phase") == 0) { + o->apply_compression = 0; + aubio_onset_set_awhitening (o, 0); + } else if (strcmp (onset_mode, "wphase") == 0) { + // use defaults for now + } else if (strcmp (onset_mode, "mkl") == 0) { + aubio_onset_set_threshold (o, 0.05); + aubio_onset_set_awhitening(o, 1); + aubio_onset_set_compression (o, 0.02); + } else if (strcmp (onset_mode, "kl") == 0) { + aubio_onset_set_threshold (o, 0.35); + aubio_onset_set_awhitening(o, 1); + aubio_onset_set_compression (o, 0.02); + } else if (strcmp (onset_mode, "specflux") == 0) { + aubio_onset_set_threshold (o, 0.18); + aubio_onset_set_awhitening(o, 1); + aubio_spectral_whitening_set_relax_time(o->spectral_whitening, 100); + aubio_spectral_whitening_set_floor(o->spectral_whitening, 1.); + aubio_onset_set_compression (o, 10.); + } else if (strcmp (onset_mode, "specdiff") == 0) { + } else if (strcmp (onset_mode, "old_default") == 0) { + // used to reproduce results obtained with the previous version + aubio_onset_set_threshold (o, 0.3); + aubio_onset_set_minioi_ms (o, 20.); + aubio_onset_set_compression (o, 0.); + } else { + AUBIO_WRN("onset: unknown spectral descriptor type %s, " + "using default parameters.\n", onset_mode); + ret = AUBIO_FAIL; + } + return ret; +} + void del_aubio_onset (aubio_onset_t *o) { - del_aubio_specdesc(o->od); - del_aubio_peakpicker(o->pp); - del_aubio_pvoc(o->pv); - del_fvec(o->desc); - del_cvec(o->fftgrain); + if (o->spectral_whitening) + 
del_aubio_spectral_whitening(o->spectral_whitening); + if (o->od) + del_aubio_specdesc(o->od); + if (o->pp) + del_aubio_peakpicker(o->pp); + if (o->pv) + del_aubio_pvoc(o->pv); + if (o->desc) + del_fvec(o->desc); + if (o->fftgrain) + del_cvec(o->fftgrain); AUBIO_FREE(o); } diff --git a/src/onset/onset.h b/src/onset/onset.h index e788603..b115e6a 100644 --- a/src/onset/onset.h +++ b/src/onset/onset.h @@ -117,6 +117,44 @@ smpl_t aubio_onset_get_last_s (const aubio_onset_t *o); */ smpl_t aubio_onset_get_last_ms (const aubio_onset_t *o); +/** set onset detection adaptive whitening + + \param o onset detection object as returned by new_aubio_onset() + \param enable 1 to enable, 0 to disable + + \return 0 if successful, 1 otherwise + +*/ +uint_t aubio_onset_set_awhitening(aubio_onset_t * o, uint_t enable); + +/** get onset detection adaptive whitening + + \param o onset detection object as returned by new_aubio_onset() + + \return 1 if enabled, 0 otherwise + +*/ +smpl_t aubio_onset_get_awhitening(aubio_onset_t * o); + +/** set or disable log compression + + \param o onset detection object as returned by new_aubio_onset() + \param lambda logarithmic compression factor, 0 to disable + + \return 0 if successful, 1 otherwise + + */ +uint_t aubio_onset_set_compression(aubio_onset_t *o, smpl_t lambda); + +/** get onset detection log compression + + \param o onset detection object as returned by new_aubio_onset() + + \returns 0 if disabled, compression factor otherwise + + */ +smpl_t aubio_onset_get_compression(aubio_onset_t *o); + /** set onset detection silence threshold \param o onset detection object as returned by new_aubio_onset() @@ -274,6 +312,27 @@ smpl_t aubio_onset_get_delay_ms(const aubio_onset_t * o); */ smpl_t aubio_onset_get_threshold(const aubio_onset_t * o); +/** set default parameters + + \param o onset detection object as returned by new_aubio_onset() + \param onset_mode detection mode to adjust + + This function is called at the end of new_aubio_onset(). 
+ + */ +uint_t aubio_onset_set_default_parameters (aubio_onset_t * o, const char_t * onset_mode); + +/** reset onset detection + + \param o onset detection object as returned by new_aubio_onset() + + Reset current time and last onset to 0. + + This function is called at the end of new_aubio_onset(). + + */ +void aubio_onset_reset(aubio_onset_t * o); + /** delete onset detection object \param o onset detection object to delete diff --git a/src/onset/peakpicker.c b/src/onset/peakpicker.c index 0d83bec..fa1ee38 100644 --- a/src/onset/peakpicker.c +++ b/src/onset/peakpicker.c @@ -92,27 +92,21 @@ aubio_peakpicker_do (aubio_peakpicker_t * p, fvec_t * onset, fvec_t * out) fvec_t *thresholded = p->thresholded; fvec_t *scratch = p->scratch; smpl_t mean = 0., median = 0.; - uint_t length = p->win_post + p->win_pre + 1; uint_t j = 0; - /* store onset in onset_keep */ - /* shift all elements but last, then write last */ - for (j = 0; j < length - 1; j++) { - onset_keep->data[j] = onset_keep->data[j + 1]; - onset_proc->data[j] = onset_keep->data[j]; - } - onset_keep->data[length - 1] = onset->data[0]; - onset_proc->data[length - 1] = onset->data[0]; + /* push new novelty to the end */ + fvec_push(onset_keep, onset->data[0]); + /* store a copy */ + fvec_copy(onset_keep, onset_proc); - /* filter onset_proc */ - /** \bug filtfilt calculated post+pre times, should be only once !? 
*/ + /* filter this copy */ aubio_filter_do_filtfilt (p->biquad, onset_proc, scratch); /* calculate mean and median for onset_proc */ mean = fvec_mean (onset_proc); - /* copy to scratch */ - for (j = 0; j < length; j++) - scratch->data[j] = onset_proc->data[j]; + + /* copy to scratch and compute its median */ + fvec_copy(onset_proc, scratch); median = p->thresholdfn (scratch); /* shift peek array */ @@ -185,7 +179,9 @@ new_aubio_peakpicker (void) generated with octave butter function: [b,a] = butter(2, 0.34); */ t->biquad = new_aubio_filter_biquad (0.15998789, 0.31997577, 0.15998789, - -0.59488894, 0.23484048); + // FIXME: broken since c9e20ca, revert for now + //-0.59488894, 0.23484048); + 0.23484048, 0); return t; } diff --git a/src/pitch/pitch.c b/src/pitch/pitch.c index bee0ea3..40cd7fc 100644 --- a/src/pitch/pitch.c +++ b/src/pitch/pitch.c @@ -32,6 +32,7 @@ #include "pitch/pitchfcomb.h" #include "pitch/pitchschmitt.h" #include "pitch/pitchyinfft.h" +#include "pitch/pitchyinfast.h" #include "pitch/pitchspecacf.h" #include "pitch/pitch.h" @@ -45,6 +46,7 @@ typedef enum aubio_pitcht_schmitt, /**< `schmitt`, Schmitt trigger */ aubio_pitcht_fcomb, /**< `fcomb`, Fast comb filter */ aubio_pitcht_yinfft, /**< `yinfft`, Spectral YIN */ + aubio_pitcht_yinfast, /**< `yinfast`, YIN fast */ aubio_pitcht_specacf, /**< `specacf`, Spectral autocorrelation */ aubio_pitcht_default = aubio_pitcht_yinfft, /**< `default` */ @@ -94,12 +96,13 @@ static void aubio_pitch_do_yin (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * static void aubio_pitch_do_schmitt (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf); static void aubio_pitch_do_fcomb (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf); static void aubio_pitch_do_yinfft (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf); +static void aubio_pitch_do_yinfast (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf); static void aubio_pitch_do_specacf (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf); -/* 
conversion functions for frequency conversions */ -smpl_t freqconvbin (smpl_t f, uint_t samplerate, uint_t bufsize); -smpl_t freqconvmidi (smpl_t f, uint_t samplerate, uint_t bufsize); -smpl_t freqconvpass (smpl_t f, uint_t samplerate, uint_t bufsize); +/* internal functions for frequency conversion */ +static smpl_t freqconvbin (smpl_t f, uint_t samplerate, uint_t bufsize); +static smpl_t freqconvmidi (smpl_t f, uint_t samplerate, uint_t bufsize); +static smpl_t freqconvpass (smpl_t f, uint_t samplerate, uint_t bufsize); /* adapter to stack ibuf new samples at the end of buf, and trim `buf` to `bufsize` */ void aubio_pitch_slideblock (aubio_pitch_t * p, const fvec_t * ibuf); @@ -111,8 +114,14 @@ new_aubio_pitch (const char_t * pitch_mode, { aubio_pitch_t *p = AUBIO_NEW (aubio_pitch_t); aubio_pitch_type pitch_type; + if (pitch_mode == NULL) { + AUBIO_ERR ("pitch: can not use ‘NULL’ for pitch detection method\n"); + goto beach; + } if (strcmp (pitch_mode, "mcomb") == 0) pitch_type = aubio_pitcht_mcomb; + else if (strcmp (pitch_mode, "yinfast") == 0) + pitch_type = aubio_pitcht_yinfast; else if (strcmp (pitch_mode, "yinfft") == 0) pitch_type = aubio_pitcht_yinfft; else if (strcmp (pitch_mode, "yin") == 0) @@ -126,9 +135,8 @@ new_aubio_pitch (const char_t * pitch_mode, else if (strcmp (pitch_mode, "default") == 0) pitch_type = aubio_pitcht_default; else { - AUBIO_ERR ("unknown pitch detection method %s, using default.\n", - pitch_mode); - pitch_type = aubio_pitcht_default; + AUBIO_ERR ("pitch: unknown pitch detection method ‘%s’\n", pitch_mode); + goto beach; } // check parameters are valid @@ -139,7 +147,7 @@ new_aubio_pitch (const char_t * pitch_mode, AUBIO_ERR("pitch: got buffer_size %d, but can not be < 1\n", bufsize); goto beach; } else if (bufsize < hopsize) { - AUBIO_ERR("pitch: hop size (%d) is larger than win size (%d)\n", bufsize, hopsize); + AUBIO_ERR("pitch: hop size (%d) is larger than win size (%d)\n", hopsize, bufsize); goto beach; } else if
((sint_t)samplerate < 1) { AUBIO_ERR("pitch: samplerate (%d) can not be < 1\n", samplerate); @@ -156,6 +164,7 @@ new_aubio_pitch (const char_t * pitch_mode, case aubio_pitcht_yin: p->buf = new_fvec (bufsize); p->p_object = new_aubio_pitchyin (bufsize); + if (!p->p_object) goto beach; p->detect_cb = aubio_pitch_do_yin; p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchyin_get_confidence; aubio_pitchyin_set_tolerance (p->p_object, 0.15); @@ -163,6 +172,7 @@ new_aubio_pitch (const char_t * pitch_mode, case aubio_pitcht_mcomb: p->filtered = new_fvec (hopsize); p->pv = new_aubio_pvoc (bufsize, hopsize); + if (!p->pv) goto beach; p->fftgrain = new_cvec (bufsize); p->p_object = new_aubio_pitchmcomb (bufsize, hopsize); p->filter = new_aubio_filter_c_weighting (samplerate); @@ -171,6 +181,7 @@ new_aubio_pitch (const char_t * pitch_mode, case aubio_pitcht_fcomb: p->buf = new_fvec (bufsize); p->p_object = new_aubio_pitchfcomb (bufsize, hopsize); + if (!p->p_object) goto beach; p->detect_cb = aubio_pitch_do_fcomb; break; case aubio_pitcht_schmitt: @@ -181,13 +192,23 @@ new_aubio_pitch (const char_t * pitch_mode, case aubio_pitcht_yinfft: p->buf = new_fvec (bufsize); p->p_object = new_aubio_pitchyinfft (samplerate, bufsize); + if (!p->p_object) goto beach; p->detect_cb = aubio_pitch_do_yinfft; p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchyinfft_get_confidence; aubio_pitchyinfft_set_tolerance (p->p_object, 0.85); break; + case aubio_pitcht_yinfast: + p->buf = new_fvec (bufsize); + p->p_object = new_aubio_pitchyinfast (bufsize); + if (!p->p_object) goto beach; + p->detect_cb = aubio_pitch_do_yinfast; + p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchyinfast_get_confidence; + aubio_pitchyinfast_set_tolerance (p->p_object, 0.15); + break; case aubio_pitcht_specacf: p->buf = new_fvec (bufsize); p->p_object = new_aubio_pitchspecacf (bufsize); + if (!p->p_object) goto beach; p->detect_cb = aubio_pitch_do_specacf; p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchspecacf_get_tolerance; 
aubio_pitchspecacf_set_tolerance (p->p_object, 0.85); @@ -198,6 +219,8 @@ new_aubio_pitch (const char_t * pitch_mode, return p; beach: + if (p->filtered) del_fvec(p->filtered); + if (p->buf) del_fvec(p->buf); AUBIO_FREE(p); return NULL; } @@ -229,6 +252,10 @@ del_aubio_pitch (aubio_pitch_t * p) del_fvec (p->buf); del_aubio_pitchyinfft (p->p_object); break; + case aubio_pitcht_yinfast: + del_fvec (p->buf); + del_aubio_pitchyinfast (p->p_object); + break; case aubio_pitcht_specacf: del_fvec (p->buf); del_aubio_pitchspecacf (p->p_object); @@ -283,7 +310,8 @@ aubio_pitch_set_unit (aubio_pitch_t * p, const char_t * pitch_unit) else if (strcmp (pitch_unit, "default") == 0) pitch_mode = aubio_pitchm_default; else { - AUBIO_ERR ("unknown pitch detection unit %s, using default\n", pitch_unit); + AUBIO_WRN("pitch: unknown pitch detection unit ‘%s’, using default\n", + pitch_unit); pitch_mode = aubio_pitchm_default; err = AUBIO_FAIL; } @@ -318,12 +346,35 @@ aubio_pitch_set_tolerance (aubio_pitch_t * p, smpl_t tol) case aubio_pitcht_yinfft: aubio_pitchyinfft_set_tolerance (p->p_object, tol); break; + case aubio_pitcht_yinfast: + aubio_pitchyinfast_set_tolerance (p->p_object, tol); + break; default: break; } return AUBIO_OK; } +smpl_t +aubio_pitch_get_tolerance (aubio_pitch_t * p) +{ + smpl_t tolerance = 1.; + switch (p->type) { + case aubio_pitcht_yin: + tolerance = aubio_pitchyin_get_tolerance (p->p_object); + break; + case aubio_pitcht_yinfft: + tolerance = aubio_pitchyinfft_get_tolerance (p->p_object); + break; + case aubio_pitcht_yinfast: + tolerance = aubio_pitchyinfast_get_tolerance (p->p_object); + break; + default: + break; + } + return tolerance; +} + uint_t aubio_pitch_set_silence (aubio_pitch_t * p, smpl_t silence) { @@ -331,7 +382,7 @@ aubio_pitch_set_silence (aubio_pitch_t * p, smpl_t silence) p->silence = silence; return AUBIO_OK; } else { - AUBIO_ERR("pitch: could not set silence to %.2f", silence); + AUBIO_WRN("pitch: could not set silence to %.2f\n", silence); 
return AUBIO_FAIL; } } @@ -396,6 +447,21 @@ aubio_pitch_do_yinfft (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf) } void +aubio_pitch_do_yinfast (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * obuf) +{ + smpl_t pitch = 0.; + aubio_pitch_slideblock (p, ibuf); + aubio_pitchyinfast_do (p->p_object, p->buf, obuf); + pitch = obuf->data[0]; + if (pitch > 0) { + pitch = p->samplerate / (pitch + 0.); + } else { + pitch = 0.; + } + obuf->data[0] = pitch; +} + +void aubio_pitch_do_specacf (aubio_pitch_t * p, const fvec_t * ibuf, fvec_t * out) { smpl_t pitch = 0., period; diff --git a/src/pitch/pitch.h b/src/pitch/pitch.h index e0ae545..b807a6a 100644 --- a/src/pitch/pitch.h +++ b/src/pitch/pitch.h @@ -81,6 +81,11 @@ extern "C" { see http://recherche.ircam.fr/equipes/pcm/pub/people/cheveign.html + \b \p yinfast : Yinfast algorithm + + This algorithm is equivalent to the YIN algorithm, but computed in the + spectral domain for efficiency. See also `python/demos/demo_yin_compare.py`. + \b \p yinfft : Yinfft algorithm This algorithm was derived from the YIN algorithm. In this implementation, a @@ -117,6 +122,14 @@ void aubio_pitch_do (aubio_pitch_t * o, const fvec_t * in, fvec_t * out); */ uint_t aubio_pitch_set_tolerance (aubio_pitch_t * o, smpl_t tol); +/** get yin or yinfft tolerance threshold + + \param o pitch detection object as returned by new_aubio_pitch() + \return tolerance (default is 0.15 for yin and 0.85 for yinfft) + +*/ +smpl_t aubio_pitch_get_tolerance (aubio_pitch_t * o); + /** deletion of the pitch detection object \param o pitch detection object as returned by new_aubio_pitch() @@ -142,6 +155,8 @@ aubio_pitch_t *new_aubio_pitch (const char_t * method, \param o pitch detection object as returned by new_aubio_pitch() \param mode set pitch units for output + mode can be one of "Hz", "midi", "cent", or "bin". Defaults to "Hz". 
+ \return 0 if successful, non-zero otherwise */ diff --git a/src/pitch/pitchfcomb.c b/src/pitch/pitchfcomb.c index 2bdc509..5cc49b9 100644 --- a/src/pitch/pitchfcomb.c +++ b/src/pitch/pitchfcomb.c @@ -53,12 +53,17 @@ new_aubio_pitchfcomb (uint_t bufsize, uint_t hopsize) aubio_pitchfcomb_t *p = AUBIO_NEW (aubio_pitchfcomb_t); p->fftSize = bufsize; p->stepSize = hopsize; + p->fft = new_aubio_fft (bufsize); + if (!p->fft) goto beach; p->winput = new_fvec (bufsize); p->fftOut = new_cvec (bufsize); p->fftLastPhase = new_fvec (bufsize); - p->fft = new_aubio_fft (bufsize); p->win = new_aubio_window ("hanning", bufsize); return p; + +beach: + AUBIO_FREE(p); + return NULL; } /* input must be stepsize long */ diff --git a/src/pitch/pitchmcomb.c b/src/pitch/pitchmcomb.c index f9ec610..9ce160e 100644 --- a/src/pitch/pitchmcomb.c +++ b/src/pitch/pitchmcomb.c @@ -37,6 +37,7 @@ void aubio_pitchmcomb_combdet (aubio_pitchmcomb_t * p, const fvec_t * newmag); /* not used but useful : sort by amplitudes (or anything else) * sort_pitchpeak(peaks, length); */ +#if 0 /** spectral_peak comparison function (must return signed int) */ static sint_t aubio_pitchmcomb_sort_peak_comp (const void *x, const void *y); /** sort spectral_peak against their mag */ @@ -44,13 +45,16 @@ void aubio_pitchmcomb_sort_peak (aubio_spectralpeak_t * peaks, uint_t nbins); /** select the best candidates */ uint_t aubio_pitch_cands (aubio_pitchmcomb_t * p, const cvec_t * fftgrain, smpl_t * cands); +#endif /** sort spectral_candidate against their comb ene */ void aubio_pitchmcomb_sort_cand_ene (aubio_spectralcandidate_t ** candidates, uint_t nbins); +#if 0 /** sort spectral_candidate against their frequency */ void aubio_pitchmcomb_sort_cand_freq (aubio_spectralcandidate_t ** candidates, uint_t nbins); +#endif struct _aubio_pitchmcomb_t { @@ -133,6 +137,7 @@ aubio_pitchmcomb_do (aubio_pitchmcomb_t * p, const cvec_t * fftgrain, fvec_t * o } */ } +#if 0 uint_t aubio_pitch_cands (aubio_pitchmcomb_t * p, const
cvec_t * fftgrain, smpl_t * cands) { @@ -163,6 +168,7 @@ aubio_pitch_cands (aubio_pitchmcomb_t * p, const cvec_t * fftgrain, smpl_t * can return 0; } } +#endif void aubio_pitchmcomb_spectral_pp (aubio_pitchmcomb_t * p, const fvec_t * newmag) @@ -313,6 +319,7 @@ aubio_pitchmcomb_get_root_peak (aubio_spectralpeak_t * peaks, uint_t length) return pos; } +#if 0 void aubio_pitchmcomb_sort_peak (aubio_spectralpeak_t * peaks, uint_t nbins) { @@ -342,7 +349,6 @@ aubio_pitchmcomb_sort_cand_ene (aubio_spectralcandidate_t ** candidates, } } - void aubio_pitchmcomb_sort_cand_freq (aubio_spectralcandidate_t ** candidates, uint_t nbins) @@ -356,6 +362,7 @@ aubio_pitchmcomb_sort_cand_freq (aubio_spectralcandidate_t ** candidates, } } } +#endif aubio_pitchmcomb_t * new_aubio_pitchmcomb (uint_t bufsize, uint_t hopsize) diff --git a/src/pitch/pitchspecacf.c b/src/pitch/pitchspecacf.c index a010041..170cfa3 100644 --- a/src/pitch/pitchspecacf.c +++ b/src/pitch/pitchspecacf.c @@ -42,15 +42,20 @@ aubio_pitchspecacf_t * new_aubio_pitchspecacf (uint_t bufsize) { aubio_pitchspecacf_t *p = AUBIO_NEW (aubio_pitchspecacf_t); + p->fft = new_aubio_fft (bufsize); + if (!p->fft) goto beach; p->win = new_aubio_window ("hanningz", bufsize); p->winput = new_fvec (bufsize); - p->fft = new_aubio_fft (bufsize); p->fftout = new_fvec (bufsize); p->sqrmag = new_fvec (bufsize); p->acf = new_fvec (bufsize / 2 + 1); p->tol = 1.; p->confidence = 0.; return p; + +beach: + AUBIO_FREE(p); + return NULL; } void @@ -87,6 +92,7 @@ del_aubio_pitchspecacf (aubio_pitchspecacf_t * p) del_aubio_fft (p->fft); del_fvec (p->sqrmag); del_fvec (p->fftout); + del_fvec (p->acf); AUBIO_FREE (p); } diff --git a/src/pitch/pitchyin.c b/src/pitch/pitchyin.c index b65e3f7..85b4566 100644 --- a/src/pitch/pitchyin.c +++ b/src/pitch/pitchyin.c @@ -36,9 +36,10 @@ struct _aubio_pitchyin_t { fvec_t *yin; smpl_t tol; - smpl_t confidence; + uint_t peak_pos; }; +#if 0 /** compute difference function \param input input signal @@ -60,6 +61,7 
@@ void aubio_pitchyin_getcum (fvec_t * yinbuf); */ uint_t aubio_pitchyin_getpitch (const fvec_t * yinbuf); +#endif aubio_pitchyin_t * new_aubio_pitchyin (uint_t bufsize) @@ -67,6 +69,7 @@ new_aubio_pitchyin (uint_t bufsize) aubio_pitchyin_t *o = AUBIO_NEW (aubio_pitchyin_t); o->yin = new_fvec (bufsize / 2); o->tol = 0.15; + o->peak_pos = 0; return o; } @@ -77,6 +80,7 @@ del_aubio_pitchyin (aubio_pitchyin_t * o) AUBIO_FREE (o); } +#if 0 /* outputs the difference function */ void aubio_pitchyin_diff (fvec_t * input, fvec_t * yin) @@ -126,46 +130,49 @@ aubio_pitchyin_getpitch (const fvec_t * yin) //AUBIO_DBG("No pitch found"); return 0; } - +#endif /* all the above in one */ void aubio_pitchyin_do (aubio_pitchyin_t * o, const fvec_t * input, fvec_t * out) { - smpl_t tol = o->tol; - fvec_t *yin = o->yin; - uint_t j, tau = 0; + const smpl_t tol = o->tol; + fvec_t* yin = o->yin; + const smpl_t *input_data = input->data; + const uint_t length = yin->length; + smpl_t *yin_data = yin->data; + uint_t j, tau; sint_t period; - smpl_t tmp = 0., tmp2 = 0.; - yin->data[0] = 1.; - for (tau = 1; tau < yin->length; tau++) { - yin->data[tau] = 0.; - for (j = 0; j < yin->length; j++) { - tmp = input->data[j] - input->data[j + tau]; - yin->data[tau] += SQR (tmp); + smpl_t tmp, tmp2 = 0.; + + yin_data[0] = 1.; + for (tau = 1; tau < length; tau++) { + yin_data[tau] = 0.; + for (j = 0; j < length; j++) { + tmp = input_data[j] - input_data[j + tau]; + yin_data[tau] += SQR (tmp); } - tmp2 += yin->data[tau]; + tmp2 += yin_data[tau]; if (tmp2 != 0) { yin->data[tau] *= tau / tmp2; } else { yin->data[tau] = 1.; } period = tau - 3; - if (tau > 4 && (yin->data[period] < tol) && - (yin->data[period] < yin->data[period + 1])) { - out->data[0] = fvec_quadratic_peak_pos (yin, period); - goto beach; + if (tau > 4 && (yin_data[period] < tol) && + (yin_data[period] < yin_data[period + 1])) { + o->peak_pos = (uint_t)period; + out->data[0] = fvec_quadratic_peak_pos (yin, o->peak_pos); + return; } } - 
out->data[0] = fvec_quadratic_peak_pos (yin, fvec_min_elem (yin)); -beach: - return; + o->peak_pos = (uint_t)fvec_min_elem (yin); + out->data[0] = fvec_quadratic_peak_pos (yin, o->peak_pos); } smpl_t aubio_pitchyin_get_confidence (aubio_pitchyin_t * o) { - o->confidence = 1. - fvec_min (o->yin); - return o->confidence; + return 1. - o->yin->data[o->peak_pos]; } uint_t diff --git a/src/pitch/pitchyinfast.c b/src/pitch/pitchyinfast.c new file mode 100644 index 0000000..b2dcadc --- /dev/null +++ b/src/pitch/pitchyinfast.c @@ -0,0 +1,201 @@ +/* + Copyright (C) 2003-2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +/* This algorithm was developed by A. de Cheveigné and H. Kawahara and + * published in: + * + * de Cheveigné, A., Kawahara, H. (2002) "YIN, a fundamental frequency + * estimator for speech and music", J. Acoust. Soc. Am. 111, 1917-1930. 
+ * + * see http://recherche.ircam.fr/equipes/pcm/pub/people/cheveign.html + */ + +#include "aubio_priv.h" +#include "fvec.h" +#include "mathutils.h" +#include "cvec.h" +#include "spectral/fft.h" +#include "pitch/pitchyinfast.h" + +struct _aubio_pitchyinfast_t +{ + fvec_t *yin; + smpl_t tol; + uint_t peak_pos; + fvec_t *tmpdata; + fvec_t *sqdiff; + fvec_t *kernel; + fvec_t *samples_fft; + fvec_t *kernel_fft; + aubio_fft_t *fft; +}; + +aubio_pitchyinfast_t * +new_aubio_pitchyinfast (uint_t bufsize) +{ + aubio_pitchyinfast_t *o = AUBIO_NEW (aubio_pitchyinfast_t); + o->yin = new_fvec (bufsize / 2); + o->tmpdata = new_fvec (bufsize); + o->sqdiff = new_fvec (bufsize / 2); + o->kernel = new_fvec (bufsize); + o->samples_fft = new_fvec (bufsize); + o->kernel_fft = new_fvec (bufsize); + o->fft = new_aubio_fft (bufsize); + if (!o->yin || !o->tmpdata || !o->sqdiff + || !o->kernel || !o->samples_fft || !o->kernel_fft || !o->fft) + { + del_aubio_pitchyinfast(o); + return NULL; + } + o->tol = 0.15; + o->peak_pos = 0; + return o; +} + +void +del_aubio_pitchyinfast (aubio_pitchyinfast_t * o) +{ + if (o->yin) + del_fvec (o->yin); + if (o->tmpdata) + del_fvec (o->tmpdata); + if (o->sqdiff) + del_fvec (o->sqdiff); + if (o->kernel) + del_fvec (o->kernel); + if (o->samples_fft) + del_fvec (o->samples_fft); + if (o->kernel_fft) + del_fvec (o->kernel_fft); + if (o->fft) + del_aubio_fft (o->fft); + AUBIO_FREE (o); +} + +/* all the above in one */ +void +aubio_pitchyinfast_do (aubio_pitchyinfast_t * o, const fvec_t * input, fvec_t * out) +{ + const smpl_t tol = o->tol; + fvec_t* yin = o->yin; + const uint_t length = yin->length; + uint_t B = o->tmpdata->length; + uint_t W = o->yin->length; // B / 2 + fvec_t tmp_slice, kernel_ptr; + uint_t tau; + sint_t period; + smpl_t tmp2 = 0.; + + // compute r_t(0) + r_t+tau(0) + { + fvec_t *squares = o->tmpdata; + fvec_weighted_copy(input, input, squares); +#if 0 + for (tau = 0; tau < W; tau++) { + tmp_slice.data = squares->data + tau; + 
tmp_slice.length = W; + o->sqdiff->data[tau] = fvec_sum(&tmp_slice); + } +#else + tmp_slice.data = squares->data; + tmp_slice.length = W; + o->sqdiff->data[0] = fvec_sum(&tmp_slice); + for (tau = 1; tau < W; tau++) { + o->sqdiff->data[tau] = o->sqdiff->data[tau-1]; + o->sqdiff->data[tau] -= squares->data[tau-1]; + o->sqdiff->data[tau] += squares->data[W+tau-1]; + } +#endif + fvec_add(o->sqdiff, o->sqdiff->data[0]); + } + // compute r_t(tau) = -2.*ifft(fft(samples)*fft(samples[W-1::-1])) + { + fvec_t *compmul = o->tmpdata; + fvec_t *rt_of_tau = o->samples_fft; + aubio_fft_do_complex(o->fft, input, o->samples_fft); + // build kernel, take a copy of first half of samples + tmp_slice.data = input->data; + tmp_slice.length = W; + kernel_ptr.data = o->kernel->data + 1; + kernel_ptr.length = W; + fvec_copy(&tmp_slice, &kernel_ptr); + // reverse them + fvec_rev(&kernel_ptr); + // compute fft(kernel) + aubio_fft_do_complex(o->fft, o->kernel, o->kernel_fft); + // compute complex product + compmul->data[0] = o->kernel_fft->data[0] * o->samples_fft->data[0]; + for (tau = 1; tau < W; tau++) { + compmul->data[tau] = o->kernel_fft->data[tau] * o->samples_fft->data[tau]; + compmul->data[tau] -= o->kernel_fft->data[B-tau] * o->samples_fft->data[B-tau]; + } + compmul->data[W] = o->kernel_fft->data[W] * o->samples_fft->data[W]; + for (tau = 1; tau < W; tau++) { + compmul->data[B-tau] = o->kernel_fft->data[B-tau] * o->samples_fft->data[tau]; + compmul->data[B-tau] += o->kernel_fft->data[tau] * o->samples_fft->data[B-tau]; + } + // compute inverse fft + aubio_fft_rdo_complex(o->fft, compmul, rt_of_tau); + // compute square difference r_t(tau) = sqdiff - 2 * r_t_tau[W-1:-1] + for (tau = 0; tau < W; tau++) { + yin->data[tau] = o->sqdiff->data[tau] - 2. 
* rt_of_tau->data[tau+W]; + } + } + + // now build yin and look for first minimum + fvec_zeros(out); + yin->data[0] = 1.; + for (tau = 1; tau < length; tau++) { + tmp2 += yin->data[tau]; + if (tmp2 != 0) { + yin->data[tau] *= tau / tmp2; + } else { + yin->data[tau] = 1.; + } + period = tau - 3; + if (tau > 4 && (yin->data[period] < tol) && + (yin->data[period] < yin->data[period + 1])) { + o->peak_pos = (uint_t)period; + out->data[0] = fvec_quadratic_peak_pos (yin, o->peak_pos); + return; + } + } + // use global minimum + o->peak_pos = (uint_t)fvec_min_elem (yin); + out->data[0] = fvec_quadratic_peak_pos (yin, o->peak_pos); +} + +smpl_t +aubio_pitchyinfast_get_confidence (aubio_pitchyinfast_t * o) { + return 1. - o->yin->data[o->peak_pos]; +} + +uint_t +aubio_pitchyinfast_set_tolerance (aubio_pitchyinfast_t * o, smpl_t tol) +{ + o->tol = tol; + return 0; +} + +smpl_t +aubio_pitchyinfast_get_tolerance (aubio_pitchyinfast_t * o) +{ + return o->tol; +} diff --git a/src/pitch/pitchyinfast.h b/src/pitch/pitchyinfast.h new file mode 100644 index 0000000..abb8139 --- /dev/null +++ b/src/pitch/pitchyinfast.h @@ -0,0 +1,102 @@ +/* + Copyright (C) 2003-2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +/** \file + + Pitch detection using YIN algorithm (fast implementation) + + This algorithm was developed by A. de Cheveigne and H. 
Kawahara and + published in: + + De Cheveigné, A., Kawahara, H. (2002) "YIN, a fundamental frequency + estimator for speech and music", J. Acoust. Soc. Am. 111, 1917-1930. + + This implementation computes the autocorrelation function using time-domain + convolution evaluated in the spectral domain. + + see http://recherche.ircam.fr/equipes/pcm/pub/people/cheveign.html + http://recherche.ircam.fr/equipes/pcm/cheveign/ps/2002_JASA_YIN_proof.pdf + +*/ + +#ifndef AUBIO_PITCHYINFAST_H +#define AUBIO_PITCHYINFAST_H + +#ifdef __cplusplus +extern "C" { +#endif + +/** pitch detection object */ +typedef struct _aubio_pitchyinfast_t aubio_pitchyinfast_t; + +/** creation of the pitch detection object + + \param buf_size size of the input buffer to analyse + +*/ +aubio_pitchyinfast_t *new_aubio_pitchyinfast (uint_t buf_size); + +/** deletion of the pitch detection object + + \param o pitch detection object as returned by new_aubio_pitchyinfast() + +*/ +void del_aubio_pitchyinfast (aubio_pitchyinfast_t * o); + +/** execute pitch detection on an input buffer + + \param o pitch detection object as returned by new_aubio_pitchyinfast() + \param samples_in input signal vector (length as specified at creation time) + \param cands_out pitch period candidates, in samples + +*/ +void aubio_pitchyinfast_do (aubio_pitchyinfast_t * o, const fvec_t * samples_in, fvec_t * cands_out); + + +/** set tolerance parameter for YIN algorithm + + \param o YIN pitch detection object + \param tol tolerance parameter for minima selection [default 0.15] + +*/ +uint_t aubio_pitchyinfast_set_tolerance (aubio_pitchyinfast_t * o, smpl_t tol); + +/** get tolerance parameter for YIN algorithm + + \param o YIN pitch detection object + \return tolerance parameter for minima selection [default 0.15] + +*/ +smpl_t aubio_pitchyinfast_get_tolerance (aubio_pitchyinfast_t * o); + +/** get current confidence of YIN algorithm + + \param o YIN pitch detection object + \return confidence parameter + +*/ +smpl_t 
aubio_pitchyinfast_get_confidence (aubio_pitchyinfast_t * o); + +#ifdef __cplusplus +} +#endif + +#endif /* AUBIO_PITCHYINFAST_H */ + diff --git a/src/pitch/pitchyinfft.c b/src/pitch/pitchyinfft.c index 98de63c..b613f60 100644 --- a/src/pitch/pitchyinfft.c +++ b/src/pitch/pitchyinfft.c @@ -36,7 +36,7 @@ struct _aubio_pitchyinfft_t aubio_fft_t *fft; /**< fft object to compute square difference function */ fvec_t *yinfft; /**< Yin function */ smpl_t tol; /**< Yin tolerance */ - smpl_t confidence; /**< confidence */ + uint_t peak_pos; /**< currently selected peak pos*/ uint_t short_period; /** shortest period under which to check for octave error */ }; @@ -44,7 +44,7 @@ static const smpl_t freqs[] = { 0., 20., 25., 31.5, 40., 50., 63., 80., 100., 125., 160., 200., 250., 315., 400., 500., 630., 800., 1000., 1250., 1600., 2000., 2500., 3150., 4000., 5000., 6300., 8000., 9000., 10000., - 12500., 15000., 20000., 25100 + 12500., 15000., 20000., 25100., -1. }; static const smpl_t weight[] = { @@ -62,15 +62,20 @@ new_aubio_pitchyinfft (uint_t samplerate, uint_t bufsize) aubio_pitchyinfft_t *p = AUBIO_NEW (aubio_pitchyinfft_t); p->winput = new_fvec (bufsize); p->fft = new_aubio_fft (bufsize); + if (!p->fft) goto beach; p->fftout = new_fvec (bufsize); p->sqrmag = new_fvec (bufsize); p->yinfft = new_fvec (bufsize / 2 + 1); p->tol = 0.85; + p->peak_pos = 0; p->win = new_aubio_window ("hanningz", bufsize); p->weight = new_fvec (bufsize / 2 + 1); for (i = 0; i < p->weight->length; i++) { freq = (smpl_t) i / (smpl_t) bufsize *(smpl_t) samplerate; - while (freq > freqs[j]) { + while (freq > freqs[j] && freqs[j] > 0) { + //AUBIO_DBG("freq %3.5f > %3.5f \tsamplerate %d (Hz) \t" + // "(weight length %d, bufsize %d) %d %d\n", freq, freqs[j], + // samplerate, p->weight->length, bufsize, i, j); j += 1; } a0 = weight[j - 1]; @@ -95,6 +100,11 @@ new_aubio_pitchyinfft (uint_t samplerate, uint_t bufsize) // check for octave errors above 1300 Hz p->short_period = (uint_t)ROUND(samplerate / 
1300.); return p; + +beach: + if (p->winput) del_fvec(p->winput); + AUBIO_FREE(p); + return NULL; } void @@ -155,11 +165,13 @@ aubio_pitchyinfft_do (aubio_pitchyinfft_t * p, const fvec_t * input, fvec_t * ou /* should compare the minimum value of each interpolated peaks */ halfperiod = FLOOR (tau / 2 + .5); if (yin->data[halfperiod] < p->tol) - output->data[0] = fvec_quadratic_peak_pos (yin, halfperiod); + p->peak_pos = halfperiod; else - output->data[0] = fvec_quadratic_peak_pos (yin, tau); + p->peak_pos = tau; + output->data[0] = fvec_quadratic_peak_pos (yin, p->peak_pos); } } else { + p->peak_pos = 0; output->data[0] = 0.; } } @@ -179,8 +191,7 @@ del_aubio_pitchyinfft (aubio_pitchyinfft_t * p) smpl_t aubio_pitchyinfft_get_confidence (aubio_pitchyinfft_t * o) { - o->confidence = 1. - fvec_min (o->yinfft); - return o->confidence; + return 1. - o->yinfft->data[o->peak_pos]; } uint_t diff --git a/src/spectral/awhitening.c b/src/spectral/awhitening.c new file mode 100644 index 0000000..1543544 --- /dev/null +++ b/src/spectral/awhitening.c @@ -0,0 +1,121 @@ +/* + * Copyright (C) 2003-2015 Paul Brossier <piem@aubio.org> + * + * This file is part of aubio. + * + * aubio is free software: you can redistribute it and/or modify it under the + * terms of the GNU General Public License as published by the Free Software + * Foundation, either version 3 of the License, or (at your option) any later + * version. + * + * aubio is distributed in the hope that it will be useful, but WITHOUT ANY + * WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS + * FOR A PARTICULAR PURPOSE. See the GNU General Public License for more + * details. + * + * You should have received a copy of the GNU General Public License along with + * aubio. If not, see <http://www.gnu.org/licenses/>. 
+ * + */ + +#include "aubio_priv.h" +#include "fvec.h" +#include "cvec.h" +#include "mathutils.h" +#include "spectral/awhitening.h" + +#define aubio_spectral_whitening_default_relax_time 250 // in seconds, between 22 and 446 +#define aubio_spectral_whitening_default_decay 0.001 // -60dB attenuation +#define aubio_spectral_whitening_default_floor 1.e-4 // from 1.e-6 to .2 + +/** structure to store object state */ +struct _aubio_spectral_whitening_t { + uint_t buf_size; + uint_t hop_size; + uint_t samplerate; + smpl_t relax_time; + smpl_t r_decay; + smpl_t floor; + fvec_t *peak_values; +}; + +void +aubio_spectral_whitening_do (aubio_spectral_whitening_t * o, cvec_t * fftgrain) +{ + uint_t i = 0; + uint_t length = MIN(fftgrain->length, o->peak_values->length); + for (i = 0; i < length; i++) { + smpl_t tmp = MAX(o->r_decay * o->peak_values->data[i], o->floor); + o->peak_values->data[i] = MAX(fftgrain->norm[i], tmp); + fftgrain->norm[i] /= o->peak_values->data[i]; + } +} + +aubio_spectral_whitening_t * +new_aubio_spectral_whitening (uint_t buf_size, uint_t hop_size, uint_t samplerate) +{ + aubio_spectral_whitening_t *o = AUBIO_NEW (aubio_spectral_whitening_t); + if ((sint_t)buf_size < 1) { + AUBIO_ERR("spectral_whitening: got buffer_size %d, but can not be < 1\n", buf_size); + goto beach; + } else if ((sint_t)hop_size < 1) { + AUBIO_ERR("spectral_whitening: got hop_size %d, but can not be < 1\n", hop_size); + goto beach; + } else if ((sint_t)samplerate < 1) { + AUBIO_ERR("spectral_whitening: got samplerate %d, but can not be < 1\n", samplerate); + goto beach; + } + o->peak_values = new_fvec (buf_size / 2 + 1); + o->buf_size = buf_size; + o->hop_size = hop_size; + o->samplerate = samplerate; + o->floor = aubio_spectral_whitening_default_floor; + aubio_spectral_whitening_set_relax_time (o, aubio_spectral_whitening_default_relax_time); + aubio_spectral_whitening_reset (o); + return o; + +beach: + AUBIO_FREE(o); + return NULL; +} + +uint_t 
+aubio_spectral_whitening_set_relax_time (aubio_spectral_whitening_t * o, smpl_t relax_time) +{ + o->relax_time = relax_time; + o->r_decay = POW (aubio_spectral_whitening_default_decay, + (o->hop_size / (float) o->samplerate) / o->relax_time); + return AUBIO_OK; +} + +smpl_t +aubio_spectral_whitening_get_relax_time (aubio_spectral_whitening_t * o) +{ + return o->relax_time; +} + +uint_t +aubio_spectral_whitening_set_floor (aubio_spectral_whitening_t *o, smpl_t floor) +{ + o->floor = floor; + return AUBIO_OK; +} + +smpl_t aubio_spectral_whitening_get_floor (aubio_spectral_whitening_t *o) +{ + return o->floor; +} + +void +aubio_spectral_whitening_reset (aubio_spectral_whitening_t * o) +{ + /* cover the case n == 0. */ + fvec_set_all (o->peak_values, o->floor); +} + +void +del_aubio_spectral_whitening (aubio_spectral_whitening_t * o) +{ + del_fvec (o->peak_values); + AUBIO_FREE (o); +} diff --git a/src/spectral/awhitening.h b/src/spectral/awhitening.h new file mode 100644 index 0000000..64150e7 --- /dev/null +++ b/src/spectral/awhitening.h @@ -0,0 +1,125 @@ +/* + Copyright (C) 2003-2015 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +/** \file + + Spectral adaptive whitening + + References: + + D. Stowell and M. D. Plumbley. Adaptive whitening for improved real-time + audio onset detection. 
In Proceedings of the International Computer Music + Conference (ICMC), 2007, Copenhagen, Denmark. + + http://www.eecs.qmul.ac.uk/~markp/2007/StowellPlumbley07-icmc.pdf + + S. Böck, F. Krebs, and M. Schedl. Evaluating the Online Capabilities of + Onset Detection Methods. In Proceedings of the 13th International Society for + Music Information Retrieval Conference (ISMIR), 2012, Porto, Portugal. + + http://ismir2012.ismir.net/event/papers/049_ISMIR_2012.pdf + http://www.cp.jku.at/research/papers/Boeck_etal_ISMIR_2012.pdf + +*/ + + +#ifndef _AUBIO_SPECTRAL_WHITENING_H +#define _AUBIO_SPECTRAL_WHITENING_H + +#ifdef __cplusplus +extern "C" { +#endif + +/** spectral whitening structure */ +typedef struct _aubio_spectral_whitening_t aubio_spectral_whitening_t; + +/** execute spectral adaptive whitening, in-place + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + \param fftgrain input signal spectrum as computed by aubio_pvoc_do() or aubio_fft_do() + +*/ +void aubio_spectral_whitening_do (aubio_spectral_whitening_t * o, + cvec_t * fftgrain); + +/** creation of a spectral whitening object + + \param buf_size window size of input grains + \param hop_size number of samples between two consecutive input grains + \param samplerate sampling rate of the input signal + +*/ +aubio_spectral_whitening_t *new_aubio_spectral_whitening (uint_t buf_size, + uint_t hop_size, + uint_t samplerate); + +/** reset spectral whitening object + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + + */ +void aubio_spectral_whitening_reset (aubio_spectral_whitening_t * o); + +/** set relaxation time for spectral whitening + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + \param relax_time relaxation time in seconds between 20 and 500, defaults to 250 + + */ +uint_t aubio_spectral_whitening_set_relax_time (aubio_spectral_whitening_t * o, + smpl_t relax_time); + +/** get relaxation time of spectral 
whitening + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + \return relaxation time in seconds + +*/ +smpl_t aubio_spectral_whitening_get_relax_time (aubio_spectral_whitening_t * o); + +/** set floor for spectral whitening + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + \param floor value (typically between 1.e-6 and .2, defaults to 1.e-4) + + */ +uint_t aubio_spectral_whitening_set_floor (aubio_spectral_whitening_t * o, + smpl_t floor); + +/** get floor of spectral whitening + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + \return floor value + +*/ +smpl_t aubio_spectral_whitening_get_floor (aubio_spectral_whitening_t * o); + +/** deletion of a spectral whitening + + \param o spectral whitening object as returned by new_aubio_spectral_whitening() + +*/ +void del_aubio_spectral_whitening (aubio_spectral_whitening_t * o); + +#ifdef __cplusplus +} +#endif + +#endif /* _AUBIO_SPECTRAL_WHITENING_H */ diff --git a/src/spectral/dct.c b/src/spectral/dct.c new file mode 100644 index 0000000..16cd2a9 --- /dev/null +++ b/src/spectral/dct.c @@ -0,0 +1,174 @@ +/* + Copyright (C) 2018 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. 
+ +*/ + +/** \file + + Discrete Cosine Transform + + Functions aubio_dct_do() and aubio_dct_rdo() are equivalent to MATLAB/Octave + dct() and idct() functions, as well as scipy.fftpack.dct(x, norm='ortho') and + scipy.fftpack.idct(x, norm='ortho') + + \example spectral/test-dct.c + +*/ + +#include "aubio_priv.h" +#include "fvec.h" +#include "spectral/dct.h" + +// function pointers prototypes +typedef void (*aubio_dct_do_t)(aubio_dct_t * s, const fvec_t * input, fvec_t * output); +typedef void (*aubio_dct_rdo_t)(aubio_dct_t * s, const fvec_t * input, fvec_t * output); +typedef void (*del_aubio_dct_t)(aubio_dct_t * s); + +#if defined(HAVE_ACCELERATE) +typedef struct _aubio_dct_accelerate_t aubio_dct_accelerate_t; +extern aubio_dct_accelerate_t * new_aubio_dct_accelerate (uint_t size); +extern void aubio_dct_accelerate_do(aubio_dct_accelerate_t *s, const fvec_t *input, fvec_t *output); +extern void aubio_dct_accelerate_rdo(aubio_dct_accelerate_t *s, const fvec_t *input, fvec_t *output); +extern void del_aubio_dct_accelerate (aubio_dct_accelerate_t *s); +#elif defined(HAVE_FFTW3) +typedef struct _aubio_dct_fftw_t aubio_dct_fftw_t; +extern aubio_dct_fftw_t * new_aubio_dct_fftw (uint_t size); +extern void aubio_dct_fftw_do(aubio_dct_fftw_t *s, const fvec_t *input, fvec_t *output); +extern void aubio_dct_fftw_rdo(aubio_dct_fftw_t *s, const fvec_t *input, fvec_t *output); +extern void del_aubio_dct_fftw (aubio_dct_fftw_t *s); +#elif defined(HAVE_INTEL_IPP) +typedef struct _aubio_dct_ipp_t aubio_dct_ipp_t; +extern aubio_dct_ipp_t * new_aubio_dct_ipp (uint_t size); +extern void aubio_dct_ipp_do(aubio_dct_ipp_t *s, const fvec_t *input, fvec_t *output); +extern void aubio_dct_ipp_rdo(aubio_dct_ipp_t *s, const fvec_t *input, fvec_t *output); +extern void del_aubio_dct_ipp (aubio_dct_ipp_t *s); +#else +typedef struct _aubio_dct_ooura_t aubio_dct_ooura_t; +extern aubio_dct_ooura_t * new_aubio_dct_ooura (uint_t size); +extern void aubio_dct_ooura_do(aubio_dct_ooura_t *s, const 
fvec_t *input, fvec_t *output); +extern void aubio_dct_ooura_rdo(aubio_dct_ooura_t *s, const fvec_t *input, fvec_t *output); +extern void del_aubio_dct_ooura (aubio_dct_ooura_t *s); +#endif + +// plain mode +typedef struct _aubio_dct_plain_t aubio_dct_plain_t; +extern aubio_dct_plain_t * new_aubio_dct_plain (uint_t size); +extern void aubio_dct_plain_do(aubio_dct_plain_t *s, const fvec_t *input, fvec_t *output); +extern void aubio_dct_plain_rdo(aubio_dct_plain_t *s, const fvec_t *input, fvec_t *output); +extern void del_aubio_dct_plain (aubio_dct_plain_t *s); + +struct _aubio_dct_t { + void *dct; + aubio_dct_do_t dct_do; + aubio_dct_rdo_t dct_rdo; + del_aubio_dct_t del_dct; +}; + +aubio_dct_t* new_aubio_dct (uint_t size) { + aubio_dct_t * s = AUBIO_NEW(aubio_dct_t); +#if defined(HAVE_ACCELERATE) + // vDSP supports sizes = f * 2 ** n, where n >= 4 and f in [1, 3, 5, 15] + // see https://developer.apple.com/documentation/accelerate/1449930-vdsp_dct_createsetup + { + uint_t radix = size; + uint_t order = 0; + while ((radix >= 1) && ((radix / 2) * 2 == radix)) { + radix /= 2; + order++; + } + if (order < 4 || (radix != 1 && radix != 3 && radix != 5 && radix != 15)) { + goto plain; + } + } + s->dct = (void *)new_aubio_dct_accelerate (size); + if (s->dct) { + s->dct_do = (aubio_dct_do_t)aubio_dct_accelerate_do; + s->dct_rdo = (aubio_dct_rdo_t)aubio_dct_accelerate_rdo; + s->del_dct = (del_aubio_dct_t)del_aubio_dct_accelerate; + return s; + } +#elif defined(HAVE_FFTW3) + // fftw supports any positive integer size + s->dct = (void *)new_aubio_dct_fftw (size); + if (s->dct) { + s->dct_do = (aubio_dct_do_t)aubio_dct_fftw_do; + s->dct_rdo = (aubio_dct_rdo_t)aubio_dct_fftw_rdo; + s->del_dct = (del_aubio_dct_t)del_aubio_dct_fftw; + return s; + } else { + AUBIO_WRN("dct: unexpected error while creating dct_fftw with size %d\n", + size); + goto plain; + } +#elif defined(HAVE_INTEL_IPP) + // unclear from the docs, but intel ipp seems to support any size + s->dct = (void 
*)new_aubio_dct_ipp (size); + if (s->dct) { + s->dct_do = (aubio_dct_do_t)aubio_dct_ipp_do; + s->dct_rdo = (aubio_dct_rdo_t)aubio_dct_ipp_rdo; + s->del_dct = (del_aubio_dct_t)del_aubio_dct_ipp; + return s; + } else { + AUBIO_WRN("dct: unexpected error while creating dct_ipp with size %d\n", + size); + goto plain; + } +#else + // ooura support sizes that are power of 2 + if (aubio_is_power_of_two(size) != 1 || size == 1) { + goto plain; + } + s->dct = (void *)new_aubio_dct_ooura (size); + if (s->dct) { + s->dct_do = (aubio_dct_do_t)aubio_dct_ooura_do; + s->dct_rdo = (aubio_dct_rdo_t)aubio_dct_ooura_rdo; + s->del_dct = (del_aubio_dct_t)del_aubio_dct_ooura; + return s; + } +#endif + // falling back to plain mode + AUBIO_WRN("dct: no optimised implementation could be created for size %d\n", + size); +plain: + s->dct = (void *)new_aubio_dct_plain (size); + if (s->dct) { + s->dct_do = (aubio_dct_do_t)aubio_dct_plain_do; + s->dct_rdo = (aubio_dct_rdo_t)aubio_dct_plain_rdo; + s->del_dct = (del_aubio_dct_t)del_aubio_dct_plain; + return s; + } else { + goto beach; + } +beach: + AUBIO_ERROR("dct: failed creating with size %d, should be > 0\n", size); + del_aubio_dct(s); + return NULL; +} + +void del_aubio_dct(aubio_dct_t *s) { + if (s->dct && s->del_dct) s->del_dct (s->dct); + AUBIO_FREE (s); +} + +void aubio_dct_do(aubio_dct_t *s, const fvec_t *input, fvec_t *output) { + s->dct_do ((void *)s->dct, input, output); +} + +void aubio_dct_rdo(aubio_dct_t *s, const fvec_t *input, fvec_t *output) { + s->dct_rdo ((void *)s->dct, input, output); +} diff --git a/src/spectral/dct.h b/src/spectral/dct.h new file mode 100644 index 0000000..6ceb006 --- /dev/null +++ b/src/spectral/dct.h @@ -0,0 +1,85 @@ +/* + Copyright (C) 2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. 
+ + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +/** \file + + Discrete Cosine Transform + + Functions aubio_dct_do() and aubio_dct_rdo() are equivalent to MATLAB/Octave + dct() and idct() functions, as well as scipy.fftpack.dct(x, norm='ortho') and + scipy.fftpack.idct(x, norm='ortho') + + \example spectral/test-dct.c + +*/ + +#ifndef AUBIO_DCT_H +#define AUBIO_DCT_H + +#ifdef __cplusplus +extern "C" { +#endif + +/** DCT object + + This object computes forward and backward DCT type 2 with orthonormal + scaling. 
+ +*/ +typedef struct _aubio_dct_t aubio_dct_t; + +/** create new DCT computation object + + \param size length of the DCT + +*/ +aubio_dct_t * new_aubio_dct(uint_t size); + +/** compute forward DCT + + \param s dct object as returned by new_aubio_dct + \param input input signal + \param dct_output transformed input array + +*/ +void aubio_dct_do (aubio_dct_t *s, const fvec_t * input, fvec_t * dct_output); + +/** compute backward DCT + + \param s dct object as returned by new_aubio_dct + \param input input signal + \param idct_output transformed input array + +*/ +void aubio_dct_rdo (aubio_dct_t *s, const fvec_t * input, fvec_t * idct_output); + + +/** delete DCT object + + \param s dct object as returned by new_aubio_dct + +*/ +void del_aubio_dct (aubio_dct_t *s); + +#ifdef __cplusplus +} +#endif + +#endif /* AUBIO_DCT_H */ diff --git a/src/spectral/dct_accelerate.c b/src/spectral/dct_accelerate.c new file mode 100644 index 0000000..4765c0d --- /dev/null +++ b/src/spectral/dct_accelerate.c @@ -0,0 +1,102 @@ +/* + Copyright (C) 2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. 
+ +*/ + +#include "aubio_priv.h" +#include "fvec.h" +#include "spectral/dct.h" + +#if defined(HAVE_ACCELERATE) + +#if HAVE_AUBIO_DOUBLE +#warning "no double-precision dct with accelerate" +#endif + +struct _aubio_dct_accelerate_t { + uint_t size; + fvec_t *tmp; + vDSP_DFT_Setup setup; + vDSP_DFT_Setup setupInv; +}; + +typedef struct _aubio_dct_accelerate_t aubio_dct_accelerate_t; + +void del_aubio_dct_accelerate (aubio_dct_accelerate_t *s); + +aubio_dct_accelerate_t * new_aubio_dct_accelerate (uint_t size) { + aubio_dct_accelerate_t * s = AUBIO_NEW(aubio_dct_accelerate_t); + + if ((sint_t)size < 16 || !aubio_is_power_of_two(size)) { + AUBIO_ERR("dct: can only create with sizes greater than 16 and" + " that are powers of two, requested %d\n", size); + goto beach; + } + + s->setup = vDSP_DCT_CreateSetup(NULL, (vDSP_Length)size, vDSP_DCT_II); + s->setupInv = vDSP_DCT_CreateSetup(NULL, (vDSP_Length)size, vDSP_DCT_III); + if (s->setup == NULL || s->setupInv == NULL) { + goto beach; + } + + s->size = size; + + return s; + +beach: + del_aubio_dct_accelerate(s); + return NULL; +} + +void del_aubio_dct_accelerate(aubio_dct_accelerate_t *s) { + if (s->setup) vDSP_DFT_DestroySetup(s->setup); + if (s->setupInv) vDSP_DFT_DestroySetup(s->setupInv); + AUBIO_FREE(s); +} + +void aubio_dct_accelerate_do(aubio_dct_accelerate_t *s, const fvec_t *input, fvec_t *output) { + + vDSP_DCT_Execute(s->setup, (const float *)input->data, (float *)output->data); + + // apply orthonormal scaling + output->data[0] *= SQRT(1./s->size); + smpl_t scaler = SQRT(2./s->size); + + aubio_vDSP_vsmul(output->data + 1, 1, &scaler, output->data + 1, 1, + output->length - 1); + +} + +void aubio_dct_accelerate_rdo(aubio_dct_accelerate_t *s, const fvec_t *input, fvec_t *output) { + + output->data[0] = input->data[0] / SQRT(1./s->size); + smpl_t scaler = 1./SQRT(2./s->size); + + aubio_vDSP_vsmul(input->data + 1, 1, &scaler, output->data + 1, 1, + output->length - 1); + + vDSP_DCT_Execute(s->setupInv, (const float 
*)output->data, + (float *)output->data); + + scaler = 2./s->size; + + aubio_vDSP_vsmul(output->data, 1, &scaler, output->data, 1, output->length); + +} + +#endif //defined(HAVE_ACCELERATE) diff --git a/src/spectral/dct_fftw.c b/src/spectral/dct_fftw.c new file mode 100644 index 0000000..7d6c32c --- /dev/null +++ b/src/spectral/dct_fftw.c @@ -0,0 +1,130 @@ +/* + Copyright (C) 2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. 
+ +*/ + +#include "aubio_priv.h" +#include "fvec.h" +#include "spectral/dct.h" + +#ifdef HAVE_FFTW3 + +#include <fftw3.h> +#include <pthread.h> + +#ifdef HAVE_FFTW3F +#if HAVE_AUBIO_DOUBLE +#error "Using aubio in double precision with fftw3 in single precision" +#endif /* HAVE_AUBIO_DOUBLE */ +#else /* HAVE_FFTW3F */ +#if !HAVE_AUBIO_DOUBLE +#error "Using aubio in single precision with fftw3 in double precision" +#endif /* HAVE_AUBIO_DOUBLE */ +#endif /* HAVE_FFTW3F */ + +#ifdef HAVE_FFTW3F +#define fftw_malloc fftwf_malloc +#define fftw_free fftwf_free +#define fftw_execute fftwf_execute +#define fftw_plan_dft_r2c_1d fftwf_plan_dft_r2c_1d +#define fftw_plan_dft_c2r_1d fftwf_plan_dft_c2r_1d +#define fftw_plan_r2r_1d fftwf_plan_r2r_1d +#define fftw_plan fftwf_plan +#define fftw_destroy_plan fftwf_destroy_plan +#endif + +// defined in src/spectral/fft.c +extern pthread_mutex_t aubio_fftw_mutex; + +typedef struct _aubio_dct_fftw_t aubio_dct_fftw_t; + +struct _aubio_dct_fftw_t { + uint_t size; + fvec_t *in, *out; + smpl_t *data; + fftw_plan pfw, pbw; + smpl_t scalers[5]; +}; + +aubio_dct_fftw_t * new_aubio_dct_fftw (uint_t size) { + aubio_dct_fftw_t * s = AUBIO_NEW(aubio_dct_fftw_t); + if ((sint_t)size <= 0) { + AUBIO_ERR("dct_fftw: can only create with size > 0, requested %d\n", + size); + goto beach; + } + s->size = size; + s->in = new_fvec(size); + s->out = new_fvec(size); + pthread_mutex_lock(&aubio_fftw_mutex); + s->data = (smpl_t *)fftw_malloc(sizeof(smpl_t) * size); + s->pfw = fftw_plan_r2r_1d(size, s->in->data, s->data, FFTW_REDFT10, + FFTW_ESTIMATE); + s->pbw = fftw_plan_r2r_1d(size, s->data, s->out->data, FFTW_REDFT01, + FFTW_ESTIMATE); + pthread_mutex_unlock(&aubio_fftw_mutex); + s->scalers[0] = SQRT(1./(4.*s->size)); + s->scalers[1] = SQRT(1./(2.*s->size)); + s->scalers[2] = 1. / s->scalers[0]; + s->scalers[3] = 1. 
/ s->scalers[1]; + s->scalers[4] = .5 / s->size; + return s; +beach: + AUBIO_FREE(s); + return NULL; +} + +void del_aubio_dct_fftw(aubio_dct_fftw_t *s) { + pthread_mutex_lock(&aubio_fftw_mutex); + fftw_destroy_plan(s->pfw); + fftw_destroy_plan(s->pbw); + fftw_free(s->data); + pthread_mutex_unlock(&aubio_fftw_mutex); + del_fvec(s->in); + del_fvec(s->out); + AUBIO_FREE(s); +} + +void aubio_dct_fftw_do(aubio_dct_fftw_t *s, const fvec_t *input, fvec_t *output) { + uint_t i; + fvec_copy(input, s->in); + fftw_execute(s->pfw); + //fvec_copy(s->out, output); + s->data[0] *= s->scalers[0]; + for (i = 1; i < s->size; i++) { + s->data[i] *= s->scalers[1]; + } + memcpy(output->data, s->data, output->length * sizeof(smpl_t)); +} + +void aubio_dct_fftw_rdo(aubio_dct_fftw_t *s, const fvec_t *input, fvec_t *output) { + uint_t i; + memcpy(s->data, input->data, input->length * sizeof(smpl_t)); + //s->data[0] *= .5; + s->data[0] *= s->scalers[2]; + for (i = 1; i < s->size; i++) { + s->data[i] *= s->scalers[3]; + } + fftw_execute(s->pbw); + for (i = 0; i < s->size; i++) { + s->out->data[i] *= s->scalers[4]; + } + fvec_copy(s->out, output); +} + +#endif //HAVE_FFTW3 diff --git a/src/spectral/dct_ipp.c b/src/spectral/dct_ipp.c new file mode 100644 index 0000000..273aa00 --- /dev/null +++ b/src/spectral/dct_ipp.c @@ -0,0 +1,150 @@ +/* + Copyright (C) 2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. 
If not, see <http://www.gnu.org/licenses/>. + +*/ + +#include "aubio_priv.h" +#include "fvec.h" +#include "spectral/dct.h" + +#if defined(HAVE_INTEL_IPP) + +#if !HAVE_AUBIO_DOUBLE +#define aubio_IppFloat Ipp32f +#define aubio_ippsDCTFwdSpec IppsDCTFwdSpec_32f +#define aubio_ippsDCTInvSpec IppsDCTInvSpec_32f +#define aubio_ippsDCTFwdGetSize ippsDCTFwdGetSize_32f +#define aubio_ippsDCTInvGetSize ippsDCTInvGetSize_32f +#define aubio_ippsDCTFwdInit ippsDCTFwdInit_32f +#define aubio_ippsDCTInvInit ippsDCTInvInit_32f +#define aubio_ippsDCTFwd ippsDCTFwd_32f +#define aubio_ippsDCTInv ippsDCTInv_32f +#else /* HAVE_AUBIO_DOUBLE */ +#define aubio_IppFloat Ipp64f +#define aubio_ippsDCTFwdSpec IppsDCTFwdSpec_64f +#define aubio_ippsDCTInvSpec IppsDCTInvSpec_64f +#define aubio_ippsDCTFwdGetSize ippsDCTFwdGetSize_64f +#define aubio_ippsDCTInvGetSize ippsDCTInvGetSize_64f +#define aubio_ippsDCTFwdInit ippsDCTFwdInit_64f +#define aubio_ippsDCTInvInit ippsDCTInvInit_64f +#define aubio_ippsDCTFwd ippsDCTFwd_64f +#define aubio_ippsDCTInv ippsDCTInv_64f +#endif + +typedef struct _aubio_dct_ipp_t aubio_dct_ipp_t; + +struct _aubio_dct_ipp_t { + uint_t size; + Ipp8u* pSpecFwd; + Ipp8u* pSpecInv; + Ipp8u* pSpecBuffer; + Ipp8u* pBuffer; + aubio_ippsDCTFwdSpec* pFwdDCTSpec; + aubio_ippsDCTInvSpec* pInvDCTSpec; +}; + +void del_aubio_dct_ipp (aubio_dct_ipp_t *s); + +aubio_dct_ipp_t * new_aubio_dct_ipp (uint_t size) { + aubio_dct_ipp_t * s = AUBIO_NEW(aubio_dct_ipp_t); + + const IppHintAlgorithm qualityHint = ippAlgHintAccurate; // ippAlgHintFast; + int pSpecSize, pSpecBufferSize, pBufferSize; + IppStatus status; + + if ((sint_t)size <= 0) { + AUBIO_ERR("dct: can only create with sizes greater than 0, requested %d\n", + size); + goto beach; + } + + status = aubio_ippsDCTFwdGetSize(size, qualityHint, &pSpecSize, + &pSpecBufferSize, &pBufferSize); + if (status != ippStsNoErr) { + AUBIO_ERR("dct: failed to initialize dct. 
IPP error: %d\n", status); + goto beach; + } + + //AUBIO_INF("dct: fwd initialized with %d %d %d\n", pSpecSize, pSpecBufferSize, + // pBufferSize); + + s->pSpecFwd = ippsMalloc_8u(pSpecSize); + s->pSpecInv = ippsMalloc_8u(pSpecSize); + if (pSpecSize > 0) { + s->pSpecBuffer = ippsMalloc_8u(pSpecBufferSize); + } else { + s->pSpecBuffer = NULL; + } + s->pBuffer = ippsMalloc_8u(pBufferSize); + + status = aubio_ippsDCTInvGetSize(size, qualityHint, &pSpecSize, + &pSpecBufferSize, &pBufferSize); + if (status != ippStsNoErr) { + AUBIO_ERR("dct: failed to initialize dct. IPP error: %d\n", status); + goto beach; + } + + //AUBIO_INF("dct: inv initialized with %d %d %d\n", pSpecSize, pSpecBufferSize, + // pBufferSize); + + status = aubio_ippsDCTFwdInit(&(s->pFwdDCTSpec), size, qualityHint, s->pSpecFwd, + s->pSpecBuffer); + if (status != ippStsNoErr) { + AUBIO_ERR("dct: failed to initialize fwd dct. IPP error: %d\n", status); + goto beach; + } + + status = aubio_ippsDCTInvInit(&(s->pInvDCTSpec), size, qualityHint, s->pSpecInv, + s->pSpecBuffer); + if (status != ippStsNoErr) { + AUBIO_ERR("dct: failed to initialize inv dct. 
IPP error: %d\n", status); + goto beach; + } + + s->size = size; + + return s; + +beach: + del_aubio_dct_ipp(s); + return NULL; +} + +void del_aubio_dct_ipp(aubio_dct_ipp_t *s) { + ippFree(s->pSpecFwd); + ippFree(s->pSpecInv); + ippFree(s->pSpecBuffer); + ippFree(s->pBuffer); + AUBIO_FREE(s); +} + +void aubio_dct_ipp_do(aubio_dct_ipp_t *s, const fvec_t *input, fvec_t *output) { + + aubio_ippsDCTFwd((const aubio_IppFloat*)input->data, + (aubio_IppFloat*)output->data, s->pFwdDCTSpec, s->pBuffer); + +} + +void aubio_dct_ipp_rdo(aubio_dct_ipp_t *s, const fvec_t *input, fvec_t *output) { + + aubio_ippsDCTInv((const aubio_IppFloat*)input->data, + (aubio_IppFloat*)output->data, s->pInvDCTSpec, s->pBuffer); + +} + +#endif //defined(HAVE_INTEL_IPP) diff --git a/src/spectral/dct_ooura.c b/src/spectral/dct_ooura.c new file mode 100644 index 0000000..ba6a64b --- /dev/null +++ b/src/spectral/dct_ooura.c @@ -0,0 +1,96 @@ +/* + Copyright (C) 2017 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. 
+ +*/ + +#include "aubio_priv.h" +#include "fvec.h" +#include "spectral/dct.h" + +#if !defined(HAVE_ACCELERATE) && !defined(HAVE_FFTW3) && !defined(HAVE_INTEL_IPP) + +typedef struct _aubio_dct_ooura_t aubio_dct_ooura_t; + +extern void aubio_ooura_ddct(int, int, smpl_t *, int *, smpl_t *); + +struct _aubio_dct_ooura_t { + uint_t size; + fvec_t *input; + smpl_t *w; + int *ip; + smpl_t scalers[5]; +}; + +aubio_dct_ooura_t * new_aubio_dct_ooura (uint_t size) { + aubio_dct_ooura_t * s = AUBIO_NEW(aubio_dct_ooura_t); + if (aubio_is_power_of_two(size) != 1 || (sint_t)size <= 0) { + AUBIO_ERR("dct_ooura: can only create with sizes power of two, requested %d\n", + size); + goto beach; + } + s->size = size; + s->input = new_fvec(s->size); + s->w = AUBIO_ARRAY(smpl_t, s->size * 5 / 4); + s->ip = AUBIO_ARRAY(int, 3 + (1 << (int)FLOOR(LOG(s->size/2) / LOG(2))) / 2); + s->ip[0] = 0; + s->scalers[0] = 2. * SQRT(1./(4.*s->size)); + s->scalers[1] = 2. * SQRT(1./(2.*s->size)); + s->scalers[2] = 1. / s->scalers[0]; + s->scalers[3] = 1. / s->scalers[1]; + s->scalers[4] = 2. 
/ s->size; + return s; +beach: + AUBIO_FREE(s); + return NULL; +} + +void del_aubio_dct_ooura(aubio_dct_ooura_t *s) { + del_fvec(s->input); + AUBIO_FREE(s->ip); + AUBIO_FREE(s->w); + AUBIO_FREE(s); +} + +void aubio_dct_ooura_do(aubio_dct_ooura_t *s, const fvec_t *input, fvec_t *output) { + uint_t i = 0; + fvec_copy(input, s->input); + aubio_ooura_ddct(s->size, -1, s->input->data, s->ip, s->w); + // apply orthonormal scaling + s->input->data[0] *= s->scalers[0]; + for (i = 1; i < s->input->length; i++) { + s->input->data[i] *= s->scalers[1]; + } + fvec_copy(s->input, output); +} + +void aubio_dct_ooura_rdo(aubio_dct_ooura_t *s, const fvec_t *input, fvec_t *output) { + uint_t i = 0; + fvec_copy(input, s->input); + s->input->data[0] *= s->scalers[2]; + for (i = 1; i < s->input->length; i++) { + s->input->data[i] *= s->scalers[3]; + } + s->input->data[0] *= .5; + aubio_ooura_ddct(s->size, 1, s->input->data, s->ip, s->w); + for (i = 0; i < s->input->length; i++) { + s->input->data[i] *= s->scalers[4]; + } + fvec_copy(s->input, output); +} + +#endif //!defined(HAVE_ACCELERATE) && !defined(HAVE_FFTW3) diff --git a/src/spectral/dct_plain.c b/src/spectral/dct_plain.c new file mode 100644 index 0000000..696fa9d --- /dev/null +++ b/src/spectral/dct_plain.c @@ -0,0 +1,105 @@ +/* + Copyright (C) 2018 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. 
+ +*/ + +#include "aubio_priv.h" +#include "fvec.h" +#include "fmat.h" +#include "spectral/dct.h" + +typedef struct _aubio_dct_plain_t aubio_dct_plain_t; + +struct _aubio_dct_plain_t { + uint_t size; + fmat_t *dct_coeffs; /** DCT type II orthonormal transform, size * size */ + fmat_t *idct_coeffs; /** DCT type III orthonormal transform, size * size */ +}; + +void del_aubio_dct_plain (aubio_dct_plain_t *s); + +aubio_dct_plain_t * new_aubio_dct_plain (uint_t size) { + aubio_dct_plain_t * s = AUBIO_NEW(aubio_dct_plain_t); + uint_t i, j; + smpl_t scaling; + if (aubio_is_power_of_two (size) == 1 && size > 16) { + AUBIO_WRN("dct_plain: using plain dct but size %d is a power of two\n", size); + } + if ((sint_t)size <= 0) { + AUBIO_ERR("dct_plain: can only create with size > 0, requested %d\n", + size); + goto failure; + } + + s->size = size; + + s->dct_coeffs = new_fmat (size, size); + s->idct_coeffs = new_fmat (size, size); + + /* compute DCT type-II transformation matrix + dct_coeffs[j][i] = cos ( j * (i+.5) * PI / n_filters ) + */ + scaling = SQRT (2. / size); + for (i = 0; i < size; i++) { + for (j = 1; j < size; j++) { + s->dct_coeffs->data[j][i] = + scaling * COS (j * (i + 0.5) * PI / size ); + } + s->dct_coeffs->data[0][i] = 1. / SQRT (size); + } + + /* compute DCT type-III transformation matrix + idct_coeffs[j][i] = cos ( i * (j+.5) * PI / n_filters ) + */ + scaling = SQRT (2. / size); + for (j = 0; j < size; j++) { + for (i = 1; i < size; i++) { + s->idct_coeffs->data[j][i] = + scaling * COS (i * (j + 0.5) * PI / size ); + } + s->idct_coeffs->data[j][0] = 1. 
/ SQRT (size); + } + return s; +failure: + del_aubio_dct_plain(s); + return NULL; +} + +void del_aubio_dct_plain (aubio_dct_plain_t *s) { + if (s->dct_coeffs) + del_fmat(s->dct_coeffs); + if (s->idct_coeffs) + del_fmat(s->idct_coeffs); + AUBIO_FREE(s); +} + +void aubio_dct_plain_do(aubio_dct_plain_t *s, const fvec_t *input, fvec_t *output) { + if (input->length != output->length || input->length != s->size) { + AUBIO_WRN("dct_plain: using input length %d, but output length = %d and size = %d\n", + input->length, output->length, s->size); + } + fmat_vecmul(s->dct_coeffs, input, output); +} + +void aubio_dct_plain_rdo(aubio_dct_plain_t *s, const fvec_t *input, fvec_t *output) { + if (input->length != output->length || input->length != s->size) { + AUBIO_WRN("dct_plain: using input length %d, but output length = %d and size = %d\n", + input->length, output->length, s->size); + } + fmat_vecmul(s->idct_coeffs, input, output); +} diff --git a/src/spectral/fft.c b/src/spectral/fft.c index e8dfc1a..598fe22 100644 --- a/src/spectral/fft.c +++ b/src/spectral/fft.c @@ -77,8 +77,7 @@ typedef FFTW_TYPE fft_data_t; // a global mutex for FFTW thread safety pthread_mutex_t aubio_fftw_mutex = PTHREAD_MUTEX_INITIALIZER; -#else -#ifdef HAVE_ACCELERATE // using ACCELERATE +#elif defined HAVE_ACCELERATE // using ACCELERATE // https://developer.apple.com/library/mac/#documentation/Accelerate/Reference/vDSPRef/Reference/reference.html #include <Accelerate/Accelerate.h> @@ -90,11 +89,12 @@ pthread_mutex_t aubio_fftw_mutex = PTHREAD_MUTEX_INITIALIZER; #define aubio_vDSP_zvphas vDSP_zvphas #define aubio_vDSP_vsadd vDSP_vsadd #define aubio_vDSP_vsmul vDSP_vsmul -#define aubio_vDSP_create_fftsetup vDSP_create_fftsetup -#define aubio_vDSP_destroy_fftsetup vDSP_destroy_fftsetup #define aubio_DSPComplex DSPComplex #define aubio_DSPSplitComplex DSPSplitComplex -#define aubio_FFTSetup FFTSetup +#define aubio_vDSP_DFT_Setup vDSP_DFT_Setup +#define aubio_vDSP_DFT_zrop_CreateSetup 
vDSP_DFT_zrop_CreateSetup +#define aubio_vDSP_DFT_Execute vDSP_DFT_Execute +#define aubio_vDSP_DFT_DestroySetup vDSP_DFT_DestroySetup #define aubio_vvsqrt vvsqrtf #else #define aubio_vDSP_ctoz vDSP_ctozD @@ -104,40 +104,74 @@ pthread_mutex_t aubio_fftw_mutex = PTHREAD_MUTEX_INITIALIZER; #define aubio_vDSP_zvphas vDSP_zvphasD #define aubio_vDSP_vsadd vDSP_vsaddD #define aubio_vDSP_vsmul vDSP_vsmulD -#define aubio_vDSP_create_fftsetup vDSP_create_fftsetupD -#define aubio_vDSP_destroy_fftsetup vDSP_destroy_fftsetupD #define aubio_DSPComplex DSPDoubleComplex #define aubio_DSPSplitComplex DSPDoubleSplitComplex -#define aubio_FFTSetup FFTSetupD +#define aubio_vDSP_DFT_Setup vDSP_DFT_SetupD +#define aubio_vDSP_DFT_zrop_CreateSetup vDSP_DFT_zrop_CreateSetupD +#define aubio_vDSP_DFT_Execute vDSP_DFT_ExecuteD +#define aubio_vDSP_DFT_DestroySetup vDSP_DFT_DestroySetupD #define aubio_vvsqrt vvsqrt #endif /* HAVE_AUBIO_DOUBLE */ -#else // using OOURA +#elif defined HAVE_INTEL_IPP // using INTEL IPP + +#if !HAVE_AUBIO_DOUBLE +#define aubio_IppFloat Ipp32f +#define aubio_IppComplex Ipp32fc +#define aubio_FFTSpec FFTSpec_R_32f +#define aubio_ippsMalloc_complex ippsMalloc_32fc +#define aubio_ippsFFTInit_R ippsFFTInit_R_32f +#define aubio_ippsFFTGetSize_R ippsFFTGetSize_R_32f +#define aubio_ippsFFTInv_CCSToR ippsFFTInv_CCSToR_32f +#define aubio_ippsFFTFwd_RToCCS ippsFFTFwd_RToCCS_32f +#define aubio_ippsAtan2 ippsAtan2_32f_A21 +#else /* HAVE_AUBIO_DOUBLE */ +#define aubio_IppFloat Ipp64f +#define aubio_IppComplex Ipp64fc +#define aubio_FFTSpec FFTSpec_R_64f +#define aubio_ippsMalloc_complex ippsMalloc_64fc +#define aubio_ippsFFTInit_R ippsFFTInit_R_64f +#define aubio_ippsFFTGetSize_R ippsFFTGetSize_R_64f +#define aubio_ippsFFTInv_CCSToR ippsFFTInv_CCSToR_64f +#define aubio_ippsFFTFwd_RToCCS ippsFFTFwd_RToCCS_64f +#define aubio_ippsAtan2 ippsAtan2_64f_A50 +#endif + + +#else // using OOURA // let's use ooura instead -extern void rdft(int, int, smpl_t *, int *, smpl_t *); +extern void 
aubio_ooura_rdft(int, int, smpl_t *, int *, smpl_t *); + -#endif /* HAVE_ACCELERATE */ -#endif /* HAVE_FFTW3 */ +#endif struct _aubio_fft_t { uint_t winsize; uint_t fft_size; + #ifdef HAVE_FFTW3 // using FFTW3 real_t *in, *out; fftw_plan pfw, pbw; - fft_data_t * specdata; /* complex spectral data */ -#else -#ifdef HAVE_ACCELERATE // using ACCELERATE - int log2fftsize; - aubio_FFTSetup fftSetup; + fft_data_t * specdata; /* complex spectral data */ + +#elif defined HAVE_ACCELERATE // using ACCELERATE + aubio_vDSP_DFT_Setup fftSetupFwd; + aubio_vDSP_DFT_Setup fftSetupBwd; aubio_DSPSplitComplex spec; smpl_t *in, *out; + +#elif defined HAVE_INTEL_IPP // using Intel IPP + smpl_t *in, *out; + Ipp8u* memSpec; + Ipp8u* memInit; + Ipp8u* memBuffer; + struct aubio_FFTSpec* fftSpec; + aubio_IppComplex* complexOut; #else // using OOURA smpl_t *in, *out; smpl_t *w; int *ip; -#endif /* HAVE_ACCELERATE */ -#endif /* HAVE_FFTW3 */ +#endif /* using OOURA */ + fvec_t * compspec; }; @@ -147,6 +181,7 @@ aubio_fft_t * new_aubio_fft (uint_t winsize) { AUBIO_ERR("fft: got winsize %d, but can not be < 2\n", winsize); goto beach; } + #ifdef HAVE_FFTW3 uint_t i; s->winsize = winsize; @@ -175,21 +210,76 @@ aubio_fft_t * new_aubio_fft (uint_t winsize) { for (i = 0; i < s->fft_size; i++) { s->specdata[i] = 0.; } -#else -#ifdef HAVE_ACCELERATE // using ACCELERATE + +#elif defined HAVE_ACCELERATE // using ACCELERATE + { + uint_t radix = winsize; + uint_t order = 0; + while ((radix / 2) * 2 == radix) { + radix /= 2; + order++; + } + if (order < 4 || (radix != 1 && radix != 3 && radix != 5 && radix != 15)) { + AUBIO_ERR("fft: vDSP/Accelerate supports FFT with sizes = " + "f * 2 ** n, where n >= 4 and f in [1, 3, 5, 15], but requested %d. 
" + "Use the closest power of two, or try recompiling aubio with " + "--enable-fftw3.\n", winsize); + goto beach; + } + } s->winsize = winsize; s->fft_size = winsize; s->compspec = new_fvec(winsize); - s->log2fftsize = (uint_t)log2f(s->fft_size); s->in = AUBIO_ARRAY(smpl_t, s->fft_size); s->out = AUBIO_ARRAY(smpl_t, s->fft_size); s->spec.realp = AUBIO_ARRAY(smpl_t, s->fft_size/2); s->spec.imagp = AUBIO_ARRAY(smpl_t, s->fft_size/2); - s->fftSetup = aubio_vDSP_create_fftsetup(s->log2fftsize, FFT_RADIX2); + s->fftSetupFwd = aubio_vDSP_DFT_zrop_CreateSetup(NULL, + s->fft_size, vDSP_DFT_FORWARD); + s->fftSetupBwd = aubio_vDSP_DFT_zrop_CreateSetup(s->fftSetupFwd, + s->fft_size, vDSP_DFT_INVERSE); + +#elif defined HAVE_INTEL_IPP // using Intel IPP + const IppHintAlgorithm qualityHint = ippAlgHintAccurate; // OR ippAlgHintFast; + const int flags = IPP_FFT_NODIV_BY_ANY; // we're scaling manually afterwards + int order = aubio_power_of_two_order(winsize); + int sizeSpec, sizeInit, sizeBuffer; + IppStatus status; + + if (winsize <= 4 || aubio_is_power_of_two(winsize) != 1) + { + AUBIO_ERR("intel IPP fft: can only create with sizes > 4 and power of two, requested %d," + " try recompiling aubio with --enable-fftw3\n", winsize); + goto beach; + } + + status = aubio_ippsFFTGetSize_R(order, flags, qualityHint, + &sizeSpec, &sizeInit, &sizeBuffer); + if (status != ippStsNoErr) { + AUBIO_ERR("fft: failed to initialize fft. 
IPP error: %d\n", status); + goto beach; + } + s->fft_size = s->winsize = winsize; + s->compspec = new_fvec(winsize); + s->in = AUBIO_ARRAY(smpl_t, s->winsize); + s->out = AUBIO_ARRAY(smpl_t, s->winsize); + s->memSpec = ippsMalloc_8u(sizeSpec); + s->memBuffer = ippsMalloc_8u(sizeBuffer); + if (sizeInit > 0 ) { + s->memInit = ippsMalloc_8u(sizeInit); + } + s->complexOut = aubio_ippsMalloc_complex(s->fft_size / 2 + 1); + status = aubio_ippsFFTInit_R( + &s->fftSpec, order, flags, qualityHint, s->memSpec, s->memInit); + if (status != ippStsNoErr) { + AUBIO_ERR("fft: failed to initialize. IPP error: %d\n", status); + goto beach; + } + #else // using OOURA if (aubio_is_power_of_two(winsize) != 1) { - AUBIO_ERR("fft: can only create with sizes power of two," - " requested %d\n", winsize); + AUBIO_ERR("fft: can only create with sizes power of two, requested %d," + " try recompiling aubio with --enable-fftw3\n", winsize); goto beach; } s->winsize = winsize; @@ -200,9 +290,10 @@ aubio_fft_t * new_aubio_fft (uint_t winsize) { s->ip = AUBIO_ARRAY(int , s->fft_size); s->w = AUBIO_ARRAY(smpl_t, s->fft_size); s->ip[0] = 0; -#endif /* HAVE_ACCELERATE */ -#endif /* HAVE_FFTW3 */ +#endif /* using OOURA */ + return s; + beach: AUBIO_FREE(s); return NULL; @@ -210,23 +301,33 @@ beach: void del_aubio_fft(aubio_fft_t * s) { /* destroy data */ - del_fvec(s->compspec); #ifdef HAVE_FFTW3 // using FFTW3 + pthread_mutex_lock(&aubio_fftw_mutex); fftw_destroy_plan(s->pfw); fftw_destroy_plan(s->pbw); fftw_free(s->specdata); -#else /* HAVE_FFTW3 */ -#ifdef HAVE_ACCELERATE // using ACCELERATE + pthread_mutex_unlock(&aubio_fftw_mutex); + +#elif defined HAVE_ACCELERATE // using ACCELERATE AUBIO_FREE(s->spec.realp); AUBIO_FREE(s->spec.imagp); - aubio_vDSP_destroy_fftsetup(s->fftSetup); + aubio_vDSP_DFT_DestroySetup(s->fftSetupBwd); + aubio_vDSP_DFT_DestroySetup(s->fftSetupFwd); + +#elif defined HAVE_INTEL_IPP // using Intel IPP + ippFree(s->memSpec); + ippFree(s->memInit); + ippFree(s->memBuffer); + 
ippFree(s->complexOut); + #else // using OOURA AUBIO_FREE(s->w); AUBIO_FREE(s->ip); -#endif /* HAVE_ACCELERATE */ -#endif /* HAVE_FFTW3 */ - AUBIO_FREE(s->out); +#endif + + del_fvec(s->compspec); AUBIO_FREE(s->in); + AUBIO_FREE(s->out); AUBIO_FREE(s); } @@ -249,6 +350,7 @@ void aubio_fft_do_complex(aubio_fft_t * s, const fvec_t * input, fvec_t * compsp #else memcpy(s->in, input->data, s->winsize * sizeof(smpl_t)); #endif /* HAVE_MEMCPY_HACKS */ + #ifdef HAVE_FFTW3 // using FFTW3 fftw_execute(s->pfw); #ifdef HAVE_COMPLEX_H @@ -263,12 +365,13 @@ void aubio_fft_do_complex(aubio_fft_t * s, const fvec_t * input, fvec_t * compsp compspec->data[i] = s->specdata[i]; } #endif /* HAVE_COMPLEX_H */ -#else /* HAVE_FFTW3 */ -#ifdef HAVE_ACCELERATE // using ACCELERATE + +#elif defined HAVE_ACCELERATE // using ACCELERATE // convert real data to even/odd format used in vDSP aubio_vDSP_ctoz((aubio_DSPComplex*)s->in, 2, &s->spec, 1, s->fft_size/2); // compute the FFT - aubio_vDSP_fft_zrip(s->fftSetup, &s->spec, 1, s->log2fftsize, FFT_FORWARD); + aubio_vDSP_DFT_Execute(s->fftSetupFwd, s->spec.realp, s->spec.imagp, + s->spec.realp, s->spec.imagp); // convert from vDSP complex split to [ r0, r1, ..., rN, iN-1, .., i2, i1] compspec->data[0] = s->spec.realp[0]; compspec->data[s->fft_size / 2] = s->spec.imagp[0]; @@ -279,16 +382,28 @@ void aubio_fft_do_complex(aubio_fft_t * s, const fvec_t * input, fvec_t * compsp // apply scaling smpl_t scale = 1./2.; aubio_vDSP_vsmul(compspec->data, 1, &scale, compspec->data, 1, s->fft_size); + +#elif defined HAVE_INTEL_IPP // using Intel IPP + + // apply fft + aubio_ippsFFTFwd_RToCCS(s->in, (aubio_IppFloat*)s->complexOut, s->fftSpec, s->memBuffer); + // convert complex buffer to [ r0, r1, ..., rN, iN-1, .., i2, i1] + compspec->data[0] = s->complexOut[0].re; + compspec->data[s->fft_size / 2] = s->complexOut[s->fft_size / 2].re; + for (i = 1; i < s->fft_size / 2; i++) { + compspec->data[i] = s->complexOut[i].re; + compspec->data[s->fft_size - i] = 
s->complexOut[i].im; + } + #else // using OOURA - rdft(s->winsize, 1, s->in, s->ip, s->w); + aubio_ooura_rdft(s->winsize, 1, s->in, s->ip, s->w); compspec->data[0] = s->in[0]; compspec->data[s->winsize / 2] = s->in[1]; for (i = 1; i < s->fft_size - 1; i++) { compspec->data[i] = s->in[2 * i]; compspec->data[s->winsize - i] = - s->in[2 * i + 1]; } -#endif /* HAVE_ACCELERATE */ -#endif /* HAVE_FFTW3 */ +#endif /* using OOURA */ } void aubio_fft_rdo_complex(aubio_fft_t * s, const fvec_t * compspec, fvec_t * output) { @@ -311,8 +426,8 @@ void aubio_fft_rdo_complex(aubio_fft_t * s, const fvec_t * compspec, fvec_t * ou for (i = 0; i < output->length; i++) { output->data[i] = s->out[i]*renorm; } -#else /* HAVE_FFTW3 */ -#ifdef HAVE_ACCELERATE // using ACCELERATE + +#elif defined HAVE_ACCELERATE // using ACCELERATE // convert from real imag [ r0, r1, ..., rN, iN-1, .., i2, i1] // to vDSP packed format [ r0, rN, r1, i1, ..., rN-1, iN-1 ] s->out[0] = compspec->data[0]; @@ -324,12 +439,30 @@ void aubio_fft_rdo_complex(aubio_fft_t * s, const fvec_t * compspec, fvec_t * ou // convert to split complex format used in vDSP aubio_vDSP_ctoz((aubio_DSPComplex*)s->out, 2, &s->spec, 1, s->fft_size/2); // compute the FFT - aubio_vDSP_fft_zrip(s->fftSetup, &s->spec, 1, s->log2fftsize, FFT_INVERSE); + aubio_vDSP_DFT_Execute(s->fftSetupBwd, s->spec.realp, s->spec.imagp, + s->spec.realp, s->spec.imagp); // convert result to real output aubio_vDSP_ztoc(&s->spec, 1, (aubio_DSPComplex*)output->data, 2, s->fft_size/2); // apply scaling smpl_t scale = 1.0 / s->winsize; aubio_vDSP_vsmul(output->data, 1, &scale, output->data, 1, s->fft_size); + +#elif defined HAVE_INTEL_IPP // using Intel IPP + + // convert from real imag [ r0, 0, ..., rN, iN-1, .., i2, i1, iN-1] to complex format + s->complexOut[0].re = compspec->data[0]; + s->complexOut[0].im = 0; + s->complexOut[s->fft_size / 2].re = compspec->data[s->fft_size / 2]; + s->complexOut[s->fft_size / 2].im = 0.0; + for (i = 1; i < s->fft_size / 2; 
i++) { + s->complexOut[i].re = compspec->data[i]; + s->complexOut[i].im = compspec->data[s->fft_size - i]; + } + // apply fft + aubio_ippsFFTInv_CCSToR((const aubio_IppFloat *)s->complexOut, output->data, s->fftSpec, s->memBuffer); + // apply scaling + aubio_ippsMulC(output->data, 1.0 / s->winsize, output->data, s->fft_size); + #else // using OOURA smpl_t scale = 2.0 / s->winsize; s->out[0] = compspec->data[0]; @@ -338,12 +471,11 @@ void aubio_fft_rdo_complex(aubio_fft_t * s, const fvec_t * compspec, fvec_t * ou s->out[2 * i] = compspec->data[i]; s->out[2 * i + 1] = - compspec->data[s->winsize - i]; } - rdft(s->winsize, -1, s->out, s->ip, s->w); + aubio_ooura_rdft(s->winsize, -1, s->out, s->ip, s->w); for (i=0; i < s->winsize; i++) { output->data[i] = s->out[i] * scale; } -#endif /* HAVE_ACCELERATE */ -#endif /* HAVE_FFTW3 */ +#endif } void aubio_fft_get_spectrum(const fvec_t * compspec, cvec_t * spectrum) { @@ -363,15 +495,42 @@ void aubio_fft_get_phas(const fvec_t * compspec, cvec_t * spectrum) { } else { spectrum->phas[0] = 0.; } +#if defined(HAVE_INTEL_IPP) + // convert from real imag [ r0, r1, ..., rN, iN-1, ..., i2, i1, i0] + // to [ r0, r1, ..., rN, i0, i1, i2, ..., iN-1] + for (i = 1; i < spectrum->length / 2; i++) { + ELEM_SWAP(compspec->data[compspec->length - i], + compspec->data[spectrum->length + i - 1]); + } + aubio_ippsAtan2(compspec->data + spectrum->length, + compspec->data + 1, spectrum->phas + 1, spectrum->length - 1); + // revert the imaginary part back again + for (i = 1; i < spectrum->length / 2; i++) { + ELEM_SWAP(compspec->data[spectrum->length + i - 1], + compspec->data[compspec->length - i]); + } +#else for (i=1; i < spectrum->length - 1; i++) { spectrum->phas[i] = ATAN2(compspec->data[compspec->length-i], compspec->data[i]); } - if (compspec->data[compspec->length/2] < 0) { - spectrum->phas[spectrum->length - 1] = PI; +#endif +#ifdef HAVE_FFTW3 + // for even length only, make sure last element is 0 or PI + if (2 * (compspec->length / 2) 
== compspec->length) { +#endif + if (compspec->data[compspec->length/2] < 0) { + spectrum->phas[spectrum->length - 1] = PI; + } else { + spectrum->phas[spectrum->length - 1] = 0.; + } +#ifdef HAVE_FFTW3 } else { - spectrum->phas[spectrum->length - 1] = 0.; + i = spectrum->length - 1; + spectrum->phas[i] = ATAN2(compspec->data[compspec->length-i], + compspec->data[i]); } +#endif } void aubio_fft_get_norm(const fvec_t * compspec, cvec_t * spectrum) { @@ -381,8 +540,19 @@ void aubio_fft_get_norm(const fvec_t * compspec, cvec_t * spectrum) { spectrum->norm[i] = SQRT(SQR(compspec->data[i]) + SQR(compspec->data[compspec->length - i]) ); } - spectrum->norm[spectrum->length-1] = - ABS(compspec->data[compspec->length/2]); +#ifdef HAVE_FFTW3 + // for even length, make sure last element is > 0 + if (2 * (compspec->length / 2) == compspec->length) { +#endif + spectrum->norm[spectrum->length-1] = + ABS(compspec->data[compspec->length/2]); +#ifdef HAVE_FFTW3 + } else { + i = spectrum->length - 1; + spectrum->norm[i] = SQRT(SQR(compspec->data[i]) + + SQR(compspec->data[compspec->length - i]) ); + } +#endif } void aubio_fft_get_imag(const cvec_t * spectrum, fvec_t * compspec) { diff --git a/src/spectral/fft.h b/src/spectral/fft.h index 9c8a99c..21072c8 100644 --- a/src/spectral/fft.h +++ b/src/spectral/fft.h @@ -27,7 +27,7 @@ - [FFTW3](http://www.fftw.org) - [vDSP](https://developer.apple.com/library/mac/#documentation/Accelerate/Reference/vDSPRef/Reference/reference.html) - \example src/spectral/test-fft.c + \example spectral/test-fft.c */ diff --git a/src/spectral/filterbank.c b/src/spectral/filterbank.c index 0323700..e82d93d 100644 --- a/src/spectral/filterbank.c +++ b/src/spectral/filterbank.c @@ -23,6 +23,7 @@ #include "fvec.h" #include "fmat.h" #include "cvec.h" +#include "vecutils.h" #include "spectral/filterbank.h" #include "mathutils.h" @@ -32,6 +33,8 @@ struct _aubio_filterbank_t uint_t win_s; uint_t n_filters; fmat_t *filters; + smpl_t norm; + smpl_t power; }; 
aubio_filterbank_t * @@ -39,13 +42,29 @@ new_aubio_filterbank (uint_t n_filters, uint_t win_s) { /* allocate space for filterbank object */ aubio_filterbank_t *fb = AUBIO_NEW (aubio_filterbank_t); + + if ((sint_t)n_filters <= 0) { + AUBIO_ERR("filterbank: n_filters should be > 0, got %d\n", n_filters); + goto fail; + } + if ((sint_t)win_s <= 0) { + AUBIO_ERR("filterbank: win_s should be > 0, got %d\n", win_s); + goto fail; + } fb->win_s = win_s; fb->n_filters = n_filters; /* allocate filter tables, a matrix of length win_s and of height n_filters */ fb->filters = new_fmat (n_filters, win_s / 2 + 1); + fb->norm = 1; + + fb->power = 1; + return fb; +fail: + AUBIO_FREE (fb); + return NULL; } void @@ -67,6 +86,8 @@ aubio_filterbank_do (aubio_filterbank_t * f, const cvec_t * in, fvec_t * out) tmp.length = in->length; tmp.data = in->norm; + if (f->power != 1.) fvec_pow(&tmp, f->power); + fmat_vecmul(f->filters, &tmp, out); return; @@ -84,3 +105,30 @@ aubio_filterbank_set_coeffs (aubio_filterbank_t * f, const fmat_t * filter_coeff fmat_copy(filter_coeffs, f->filters); return 0; } + +uint_t +aubio_filterbank_set_norm (aubio_filterbank_t *f, smpl_t norm) +{ + if (norm != 0 && norm != 1) return AUBIO_FAIL; + f->norm = norm; + return AUBIO_OK; +} + +smpl_t +aubio_filterbank_get_norm (aubio_filterbank_t *f) +{ + return f->norm; +} + +uint_t +aubio_filterbank_set_power (aubio_filterbank_t *f, smpl_t power) +{ + f->power = power; + return AUBIO_OK; +} + +smpl_t +aubio_filterbank_get_power (aubio_filterbank_t *f) +{ + return f->power; +} diff --git a/src/spectral/filterbank.h b/src/spectral/filterbank.h index 769b5e7..876b14d 100644 --- a/src/spectral/filterbank.h +++ b/src/spectral/filterbank.h @@ -83,6 +83,47 @@ fmat_t *aubio_filterbank_get_coeffs (const aubio_filterbank_t * f); */ uint_t aubio_filterbank_set_coeffs (aubio_filterbank_t * f, const fmat_t * filters); +/** set norm parameter + + \param f filterbank object, as returned by new_aubio_filterbank() + \param norm `1` to 
normalize the filters, `0` otherwise. + + If set to `0`, the filters will not be normalized. If set to `1`, + each filter will be normalized to one. Defaults to `1`. + + This function should be called *before* setting the filters with one of + aubio_filterbank_set_triangle_bands(), aubio_filterbank_set_mel_coeffs(), + aubio_filterbank_set_mel_coeffs_htk(), or + aubio_filterbank_set_mel_coeffs_slaney(). + + */ +uint_t aubio_filterbank_set_norm (aubio_filterbank_t *f, smpl_t norm); + +/** get norm parameter + + \param f filterbank object, as returned by new_aubio_filterbank() + \returns `1` if norm is set, `0` otherwise. Defaults to `1`. + + */ +smpl_t aubio_filterbank_get_norm (aubio_filterbank_t *f); + +/** set power parameter + + \param f filterbank object, as returned by new_aubio_filterbank() + \param power Raise the norm of the input spectrum to this power before + computing the filterbank. Defaults to `1`. + + */ +uint_t aubio_filterbank_set_power (aubio_filterbank_t *f, smpl_t power); + +/** get power parameter + + \param f filterbank object, as returned by new_aubio_filterbank() + \return current power parameter. Defaults to `1`. 
+ + */ +smpl_t aubio_filterbank_get_power (aubio_filterbank_t *f); + #ifdef __cplusplus } #endif diff --git a/src/spectral/filterbank_mel.c b/src/spectral/filterbank_mel.c index f059540..37713a3 100644 --- a/src/spectral/filterbank_mel.c +++ b/src/spectral/filterbank_mel.c @@ -54,9 +54,21 @@ aubio_filterbank_set_triangle_bands (aubio_filterbank_t * fb, n_filters, freqs->length - 2); } - if (freqs->data[freqs->length - 1] > samplerate / 2) { - AUBIO_WRN ("Nyquist frequency is %fHz, but highest frequency band ends at \ -%fHz\n", samplerate / 2, freqs->data[freqs->length - 1]); + for (fn = 0; fn < freqs->length; fn++) { + if (freqs->data[fn] < 0) { + AUBIO_ERR("filterbank_mel: freqs must contain only positive values.\n"); + return AUBIO_FAIL; + } else if (freqs->data[fn] > samplerate / 2) { + AUBIO_WRN("filterbank_mel: freqs should contain only " + "values < samplerate / 2.\n"); + } else if (fn > 0 && freqs->data[fn] < freqs->data[fn-1]) { + AUBIO_ERR("filterbank_mel: freqs should be a list of frequencies " + "sorted from low to high, but freq[%d] < freq[%d-1]\n", fn, fn); + return AUBIO_FAIL; + } else if (fn > 0 && freqs->data[fn] == freqs->data[fn-1]) { + AUBIO_WRN("filterbank_mel: set_triangle_bands received a list " + "with twice the frequency %f\n", freqs->data[fn]); + } } /* convenience reference to lower/center/upper frequency for each triangle */ @@ -78,9 +90,13 @@ aubio_filterbank_set_triangle_bands (aubio_filterbank_t * fb, } /* compute triangle heights so that each triangle has unit area */ - for (fn = 0; fn < n_filters; fn++) { - triangle_heights->data[fn] = - 2. / (upper_freqs->data[fn] - lower_freqs->data[fn]); + if (aubio_filterbank_get_norm(fb)) { + for (fn = 0; fn < n_filters; fn++) { + triangle_heights->data[fn] = + 2. 
/ (upper_freqs->data[fn] - lower_freqs->data[fn]); + } + } else { + fvec_ones (triangle_heights); } /* fill fft_freqs lookup table, which assigns the frequency in hz to each bin */ @@ -92,17 +108,6 @@ aubio_filterbank_set_triangle_bands (aubio_filterbank_t * fb, /* zeroing of all filters */ fmat_zeros (filters); - if (fft_freqs->data[1] >= lower_freqs->data[0]) { - /* - 1 to make sure we don't miss the smallest power of two */ - uint_t min_win_s = - (uint_t) FLOOR (samplerate / lower_freqs->data[0]) - 1; - AUBIO_WRN ("Lowest frequency bin (%.2fHz) is higher than lowest frequency \ -band (%.2f-%.2fHz). Consider increasing the window size from %d to %d.\n", - fft_freqs->data[1], lower_freqs->data[0], - upper_freqs->data[0], (win_s - 1) * 2, - aubio_next_power_of_two (min_win_s)); - } - /* building each filter table */ for (fn = 0; fn < n_filters; fn++) { @@ -116,9 +121,8 @@ band (%.2f-%.2fHz). Consider increasing the window size from %d to %d.\n", } /* compute positive slope step size */ - riseInc = - triangle_heights->data[fn] / - (center_freqs->data[fn] - lower_freqs->data[fn]); + riseInc = triangle_heights->data[fn] + / (center_freqs->data[fn] - lower_freqs->data[fn]); /* compute coefficients in positive slope */ for (; bin < win_s - 1; bin++) { @@ -132,9 +136,8 @@ band (%.2f-%.2fHz). Consider increasing the window size from %d to %d.\n", } /* compute negative slope step size */ - downInc = - triangle_heights->data[fn] / - (upper_freqs->data[fn] - center_freqs->data[fn]); + downInc = triangle_heights->data[fn] + / (upper_freqs->data[fn] - center_freqs->data[fn]); /* compute coefficents in negative slope */ for (; bin < win_s - 1; bin++) { @@ -160,30 +163,34 @@ band (%.2f-%.2fHz). 
Consider increasing the window size from %d to %d.\n", del_fvec (triangle_heights); del_fvec (fft_freqs); - return 0; + return AUBIO_OK; } uint_t aubio_filterbank_set_mel_coeffs_slaney (aubio_filterbank_t * fb, smpl_t samplerate) { - uint_t retval; - /* Malcolm Slaney parameters */ - smpl_t lowestFrequency = 133.3333; - smpl_t linearSpacing = 66.66666666; - smpl_t logSpacing = 1.0711703; + const smpl_t lowestFrequency = 133.3333; + const smpl_t linearSpacing = 66.66666666; + const smpl_t logSpacing = 1.0711703; - uint_t linearFilters = 13; - uint_t logFilters = 27; - uint_t n_filters = linearFilters + logFilters; - - uint_t fn; /* filter counter */ + const uint_t linearFilters = 13; + const uint_t logFilters = 27; + const uint_t n_filters = linearFilters + logFilters; + uint_t fn, retval; smpl_t lastlinearCF; /* buffers to compute filter frequencies */ - fvec_t *freqs = new_fvec (n_filters + 2); + fvec_t *freqs; + + if (samplerate <= 0) { + AUBIO_ERR("filterbank: set_mel_coeffs_slaney samplerate should be > 0\n"); + return AUBIO_FAIL; + } + + freqs = new_fvec (n_filters + 2); /* first step: fill all the linear filter frequencies */ for (fn = 0; fn < linearFilters; fn++) { @@ -205,3 +212,87 @@ aubio_filterbank_set_mel_coeffs_slaney (aubio_filterbank_t * fb, return retval; } + +static uint_t aubio_filterbank_check_freqs (aubio_filterbank_t *fb UNUSED, + smpl_t samplerate, smpl_t *freq_min, smpl_t *freq_max) +{ + if (samplerate <= 0) { + AUBIO_ERR("filterbank: set_mel_coeffs samplerate should be > 0\n"); + return AUBIO_FAIL; + } + if (*freq_max < 0) { + AUBIO_ERR("filterbank: set_mel_coeffs freq_max should be > 0\n"); + return AUBIO_FAIL; + } else if (*freq_max == 0) { + *freq_max = samplerate / 2.; + } + if (*freq_min < 0) { + AUBIO_ERR("filterbank: set_mel_coeffs freq_min should be > 0\n"); + return AUBIO_FAIL; + } + return AUBIO_OK; +} + +uint_t +aubio_filterbank_set_mel_coeffs (aubio_filterbank_t * fb, smpl_t samplerate, + smpl_t freq_min, smpl_t freq_max) +{ + 
uint_t m, retval; + smpl_t start = freq_min, end = freq_max, step; + fvec_t *freqs; + fmat_t *coeffs = aubio_filterbank_get_coeffs(fb); + uint_t n_bands = coeffs->height; + + if (aubio_filterbank_check_freqs(fb, samplerate, &start, &end)) { + return AUBIO_FAIL; + } + + start = aubio_hztomel(start); + end = aubio_hztomel(end); + + freqs = new_fvec(n_bands + 2); + step = (end - start) / (n_bands + 1); + + for (m = 0; m < n_bands + 2; m++) + { + freqs->data[m] = MIN(aubio_meltohz(start + step * m), samplerate/2.); + } + + retval = aubio_filterbank_set_triangle_bands (fb, freqs, samplerate); + + /* destroy vector used to store frequency limits */ + del_fvec (freqs); + return retval; +} + +uint_t +aubio_filterbank_set_mel_coeffs_htk (aubio_filterbank_t * fb, smpl_t samplerate, + smpl_t freq_min, smpl_t freq_max) +{ + uint_t m, retval; + smpl_t start = freq_min, end = freq_max, step; + fvec_t *freqs; + fmat_t *coeffs = aubio_filterbank_get_coeffs(fb); + uint_t n_bands = coeffs->height; + + if (aubio_filterbank_check_freqs(fb, samplerate, &start, &end)) { + return AUBIO_FAIL; + } + + start = aubio_hztomel_htk(start); + end = aubio_hztomel_htk(end); + + freqs = new_fvec (n_bands + 2); + step = (end - start) / (n_bands + 1); + + for (m = 0; m < n_bands + 2; m++) + { + freqs->data[m] = MIN(aubio_meltohz_htk(start + step * m), samplerate/2.); + } + + retval = aubio_filterbank_set_triangle_bands (fb, freqs, samplerate); + + /* destroy vector used to store frequency limits */ + del_fvec (freqs); + return retval; +} diff --git a/src/spectral/filterbank_mel.h b/src/spectral/filterbank_mel.h index 77f0be0..06bbf6c 100644 --- a/src/spectral/filterbank_mel.h +++ b/src/spectral/filterbank_mel.h @@ -55,16 +55,63 @@ uint_t aubio_filterbank_set_triangle_bands (aubio_filterbank_t * fb, /** filterbank initialization for Mel filters using Slaney's coefficients \param fb filterbank object - \param samplerate audio sampling rate + \param samplerate audio sampling rate, in Hz + + The filter 
coefficients are built to match exactly Malcolm Slaney's Auditory + Toolbox implementation (see file mfcc.m). The number of filters should be 40. + + References + ---------- - The filter coefficients are built according to Malcolm Slaney's Auditory - Toolbox, available at http://engineering.purdue.edu/~malcolm/interval/1998-010/ - (see file mfcc.m). + Malcolm Slaney, *Auditory Toolbox Version 2, Technical Report #1998-010* + https://engineering.purdue.edu/~malcolm/interval/1998-010/ */ uint_t aubio_filterbank_set_mel_coeffs_slaney (aubio_filterbank_t * fb, smpl_t samplerate); +/** Mel filterbank initialization + + \param fb filterbank object + \param samplerate audio sampling rate + \param fmin start frequency, in Hz + \param fmax end frequency, in Hz + + The filterbank will be initialized with bands linearly spaced in the mel + scale, from `fmin` to `fmax`. + + References + ---------- + + Malcolm Slaney, *Auditory Toolbox Version 2, Technical Report #1998-010* + https://engineering.purdue.edu/~malcolm/interval/1998-010/ + +*/ +uint_t aubio_filterbank_set_mel_coeffs(aubio_filterbank_t * fb, + smpl_t samplerate, smpl_t fmin, smpl_t fmax); + +/** Mel filterbank initialization + + \param fb filterbank object + \param samplerate audio sampling rate + \param fmin start frequency, in Hz + \param fmax end frequency, in Hz + + The bank of filters will be initialized to cover linearly spaced bands in + the Htk mel scale, from `fmin` to `fmax`. + + References + ---------- + + Douglas O'Shaughnessy (1987). *Speech communication: human and machine*. + Addison-Wesley. p. 150. ISBN 978-0-201-16520-3. 
+ + HTK Speech Recognition Toolkit: http://htk.eng.cam.ac.uk/ + +*/ +uint_t aubio_filterbank_set_mel_coeffs_htk(aubio_filterbank_t * fb, + smpl_t samplerate, smpl_t fmin, smpl_t fmax); + #ifdef __cplusplus } #endif diff --git a/src/spectral/mfcc.c b/src/spectral/mfcc.c index 101deb8..badba52 100644 --- a/src/spectral/mfcc.c +++ b/src/spectral/mfcc.c @@ -28,6 +28,7 @@ #include "spectral/fft.h" #include "spectral/filterbank.h" #include "spectral/filterbank_mel.h" +#include "spectral/dct.h" #include "spectral/mfcc.h" /** Internal structure for mfcc object */ @@ -36,11 +37,13 @@ struct _aubio_mfcc_t { uint_t win_s; /** grain length */ uint_t samplerate; /** sample rate (needed?) */ - uint_t n_filters; /** number of *filters */ + uint_t n_filters; /** number of filters */ uint_t n_coefs; /** number of coefficients (<= n_filters/2 +1) */ aubio_filterbank_t *fb; /** filter bank */ fvec_t *in_dct; /** input buffer for dct * [fb->n_filters] */ - fmat_t *dct_coeffs; /** DCT transform n_filters * n_coeffs */ + aubio_dct_t *dct; /** dct object */ + fvec_t *output; /** dct output */ + smpl_t scale; }; @@ -51,9 +54,15 @@ new_aubio_mfcc (uint_t win_s, uint_t n_filters, uint_t n_coefs, /* allocate space for mfcc object */ aubio_mfcc_t *mfcc = AUBIO_NEW (aubio_mfcc_t); - smpl_t scaling; - uint_t i, j; + if ((sint_t)n_coefs <= 0) { + AUBIO_ERR("mfcc: n_coefs should be > 0, got %d\n", n_coefs); + goto failure; + } + if ((sint_t)samplerate <= 0) { + AUBIO_ERR("mfcc: samplerate should be > 0, got %d\n", samplerate); + goto failure; + } mfcc->win_s = win_s; mfcc->samplerate = samplerate; @@ -62,39 +71,45 @@ new_aubio_mfcc (uint_t win_s, uint_t n_filters, uint_t n_coefs, /* filterbank allocation */ mfcc->fb = new_aubio_filterbank (n_filters, mfcc->win_s); - aubio_filterbank_set_mel_coeffs_slaney (mfcc->fb, samplerate); + + if (!mfcc->fb) + goto failure; + + if (n_filters == 40) + aubio_filterbank_set_mel_coeffs_slaney (mfcc->fb, samplerate); + else + 
aubio_filterbank_set_mel_coeffs(mfcc->fb, samplerate, + 0, samplerate/2.); /* allocating buffers */ mfcc->in_dct = new_fvec (n_filters); - mfcc->dct_coeffs = new_fmat (n_coefs, n_filters); - - /* compute DCT transform dct_coeffs[j][i] as - cos ( j * (i+.5) * PI / n_filters ) */ - scaling = 1. / SQRT (n_filters / 2.); - for (i = 0; i < n_filters; i++) { - for (j = 0; j < n_coefs; j++) { - mfcc->dct_coeffs->data[j][i] = - scaling * COS (j * (i + 0.5) * PI / n_filters); - } - mfcc->dct_coeffs->data[0][i] *= SQRT (2.) / 2.; - } + mfcc->dct = new_aubio_dct (n_filters); + mfcc->output = new_fvec (n_filters); + + if (!mfcc->in_dct || !mfcc->dct || !mfcc->output) + goto failure; + + mfcc->scale = 1.; return mfcc; + +failure: + del_aubio_mfcc(mfcc); + return NULL; } void del_aubio_mfcc (aubio_mfcc_t * mf) { - - /* delete filterbank */ - del_aubio_filterbank (mf->fb); - - /* delete buffers */ - del_fvec (mf->in_dct); - del_fmat (mf->dct_coeffs); - - /* delete mfcc object */ + if (mf->fb) + del_aubio_filterbank (mf->fb); + if (mf->in_dct) + del_fvec (mf->in_dct); + if (mf->dct) + del_aubio_dct (mf->dct); + if (mf->output) + del_fvec (mf->output); AUBIO_FREE (mf); } @@ -102,17 +117,63 @@ del_aubio_mfcc (aubio_mfcc_t * mf) void aubio_mfcc_do (aubio_mfcc_t * mf, const cvec_t * in, fvec_t * out) { + fvec_t tmp; + /* compute filterbank */ aubio_filterbank_do (mf->fb, in, mf->in_dct); /* compute log10 */ fvec_log10 (mf->in_dct); - /* raise power */ - //fvec_pow (mf->in_dct, 3.); + if (mf->scale != 1) fvec_mul (mf->in_dct, mf->scale); /* compute mfccs */ - fmat_vecmul(mf->dct_coeffs, mf->in_dct, out); + aubio_dct_do(mf->dct, mf->in_dct, mf->output); + // copy only first n_coeffs elements + // TODO assert mf->output->length == n_coeffs + tmp.data = mf->output->data; + tmp.length = out->length; + fvec_copy(&tmp, out); return; } + +uint_t aubio_mfcc_set_power (aubio_mfcc_t *mf, smpl_t power) +{ + return aubio_filterbank_set_power(mf->fb, power); +} + +smpl_t aubio_mfcc_get_power 
(aubio_mfcc_t *mf) +{ + return aubio_filterbank_get_power(mf->fb); +} + +uint_t aubio_mfcc_set_scale (aubio_mfcc_t *mf, smpl_t scale) +{ + mf->scale = scale; + return AUBIO_OK; +} + +smpl_t aubio_mfcc_get_scale (aubio_mfcc_t *mf) +{ + return mf->scale; +} + +uint_t aubio_mfcc_set_mel_coeffs (aubio_mfcc_t *mf, smpl_t freq_min, + smpl_t freq_max) +{ + return aubio_filterbank_set_mel_coeffs(mf->fb, mf->samplerate, + freq_min, freq_max); +} + +uint_t aubio_mfcc_set_mel_coeffs_htk (aubio_mfcc_t *mf, smpl_t freq_min, + smpl_t freq_max) +{ + return aubio_filterbank_set_mel_coeffs_htk(mf->fb, mf->samplerate, + freq_min, freq_max); +} + +uint_t aubio_mfcc_set_mel_coeffs_slaney (aubio_mfcc_t *mf) +{ + return aubio_filterbank_set_mel_coeffs_slaney (mf->fb, mf->samplerate); +} diff --git a/src/spectral/mfcc.h index a170e12..46fd979 100644 --- a/src/spectral/mfcc.h +++ b/src/spectral/mfcc.h @@ -26,9 +26,10 @@ This object computes MFCC coefficients on an input cvec_t. The implementation follows the specifications established by Malcolm Slaney - in its Auditory Toolbox, available online (see file mfcc.m). + in his Auditory Toolbox, available online at the following address (see + file mfcc.m): - http://engineering.ecn.purdue.edu/~malcolm/interval/1998-010/ + https://engineering.purdue.edu/~malcolm/interval/1998-010/ \example spectral/test-mfcc.c @@ -72,6 +73,99 @@ void del_aubio_mfcc (aubio_mfcc_t * mf); */ void aubio_mfcc_do (aubio_mfcc_t * mf, const cvec_t * in, fvec_t * out); +/** set power parameter + + \param mf mfcc object, as returned by new_aubio_mfcc() + \param power Raise the norm of the input spectrum to this power before + computing the filterbank. Defaults to `1`. + + See aubio_filterbank_set_power(). + + */ +uint_t aubio_mfcc_set_power (aubio_mfcc_t *mf, smpl_t power); + +/** get power parameter + + \param mf mfcc object, as returned by new_aubio_mfcc() + \return current power parameter. Defaults to `1`. + + See aubio_filterbank_get_power(). 
+ + */ +smpl_t aubio_mfcc_get_power (aubio_mfcc_t *mf); + +/** set scaling parameter + + \param mf mfcc object, as returned by new_aubio_mfcc() + \param scale Scaling value to apply. + + Scales the output of the filterbank after taking its logarithm and before + computing the DCT. Defaults to `1`. + +*/ +uint_t aubio_mfcc_set_scale (aubio_mfcc_t *mf, smpl_t scale); + +/** get scaling parameter + + \param mf mfcc object, as returned by new_aubio_mfcc() + \return current scaling parameter. Defaults to `1`. + + */ +smpl_t aubio_mfcc_get_scale (aubio_mfcc_t *mf); + +/** Mel filterbank initialization + + \param mf mfcc object + \param fmin start frequency, in Hz + \param fmax end frequency, in Hz + + The filterbank will be initialized with bands linearly spaced in the mel + scale, from `fmin` to `fmax`. + + See also + -------- + + aubio_filterbank_set_mel_coeffs() + +*/ +uint_t aubio_mfcc_set_mel_coeffs (aubio_mfcc_t *mf, + smpl_t fmin, smpl_t fmax); + +/** Mel filterbank initialization + + \param mf mfcc object + \param fmin start frequency, in Hz + \param fmax end frequency, in Hz + + The bank of filters will be initialized to cover linearly spaced bands in + the Htk mel scale, from `fmin` to `fmax`. + + See also + -------- + + aubio_filterbank_set_mel_coeffs_htk() + +*/ +uint_t aubio_mfcc_set_mel_coeffs_htk (aubio_mfcc_t *mf, + smpl_t fmin, smpl_t fmax); + +/** Mel filterbank initialization (Auditory Toolbox's parameters) + + \param mf mfcc object + + The filter coefficients are built to match exactly Malcolm Slaney's Auditory + Toolbox implementation. The number of filters should be 40. + + This is the default filterbank when `mf` was created with `n_filters = 40`. 
+ + See also + -------- + + aubio_filterbank_set_mel_coeffs_slaney() + +*/ +uint_t aubio_mfcc_set_mel_coeffs_slaney (aubio_mfcc_t *mf); + #ifdef __cplusplus } #endif diff --git a/src/spectral/ooura_fft8g.c b/src/spectral/ooura_fft8g.c index 004d8de..394bea0 100644 --- a/src/spectral/ooura_fft8g.c +++ b/src/spectral/ooura_fft8g.c @@ -2,28 +2,31 @@ // - replace all 'double' with 'smpl_t' // - include "aubio_priv.h" (for config.h and types.h) // - add missing prototypes -// - use COS and SIN macros +// - use COS, SIN, and ATAN macros +// - add cast to (smpl_t) to avoid float conversion warnings +// - declare initialization as static +// - prefix public function with aubio_ooura_ #include "aubio_priv.h" -void cdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); -void rdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); -void ddct(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); -void ddst(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); -void dfct(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w); -void dfst(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w); -void makewt(int nw, int *ip, smpl_t *w); -void makect(int nc, int *ip, smpl_t *c); -void bitrv2(int n, int *ip, smpl_t *a); -void bitrv2conj(int n, int *ip, smpl_t *a); -void cftfsub(int n, smpl_t *a, smpl_t *w); -void cftbsub(int n, smpl_t *a, smpl_t *w); -void cft1st(int n, smpl_t *a, smpl_t *w); -void cftmdl(int n, int l, smpl_t *a, smpl_t *w); -void rftfsub(int n, smpl_t *a, int nc, smpl_t *c); -void rftbsub(int n, smpl_t *a, int nc, smpl_t *c); -void dctsub(int n, smpl_t *a, int nc, smpl_t *c); -void dstsub(int n, smpl_t *a, int nc, smpl_t *c); +void aubio_ooura_cdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); +void aubio_ooura_rdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); +void aubio_ooura_ddct(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); +void aubio_ooura_ddst(int n, int isgn, smpl_t *a, int *ip, smpl_t *w); +void aubio_ooura_dfct(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w); +void 
aubio_ooura_dfst(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w); +static void makewt(int nw, int *ip, smpl_t *w); +static void makect(int nc, int *ip, smpl_t *c); +static void bitrv2(int n, int *ip, smpl_t *a); +static void bitrv2conj(int n, int *ip, smpl_t *a); +static void cftfsub(int n, smpl_t *a, smpl_t *w); +static void cftbsub(int n, smpl_t *a, smpl_t *w); +static void cft1st(int n, smpl_t *a, smpl_t *w); +static void cftmdl(int n, int l, smpl_t *a, smpl_t *w); +static void rftfsub(int n, smpl_t *a, int nc, smpl_t *c); +static void rftbsub(int n, smpl_t *a, int nc, smpl_t *c); +static void dctsub(int n, smpl_t *a, int nc, smpl_t *c); +static void dstsub(int n, smpl_t *a, int nc, smpl_t *c); /* Fast Fourier/Cosine/Sine Transform @@ -302,7 +305,7 @@ Appendix : */ -void cdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) +void aubio_ooura_cdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) { void makewt(int nw, int *ip, smpl_t *w); void bitrv2(int n, int *ip, smpl_t *a); @@ -327,7 +330,7 @@ void cdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) } -void rdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) +void aubio_ooura_rdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) { void makewt(int nw, int *ip, smpl_t *w); void makect(int nc, int *ip, smpl_t *c); @@ -361,7 +364,7 @@ void rdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) a[0] += a[1]; a[1] = xi; } else { - a[1] = 0.5 * (a[0] - a[1]); + a[1] = (smpl_t)0.5 * (a[0] - a[1]); a[0] -= a[1]; if (n > 4) { rftbsub(n, a, nc, w + nw); @@ -374,7 +377,7 @@ void rdft(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) } -void ddct(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) +void aubio_ooura_ddct(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) { void makewt(int nw, int *ip, smpl_t *w); void makect(int nc, int *ip, smpl_t *c); @@ -433,7 +436,7 @@ void ddct(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) } -void ddst(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) +void aubio_ooura_ddst(int n, int isgn, 
smpl_t *a, int *ip, smpl_t *w) { void makewt(int nw, int *ip, smpl_t *w); void makect(int nc, int *ip, smpl_t *c); @@ -492,7 +495,7 @@ void ddst(int n, int isgn, smpl_t *a, int *ip, smpl_t *w) } -void dfct(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w) +void aubio_ooura_dfct(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w) { void makewt(int nw, int *ip, smpl_t *w); void makect(int nc, int *ip, smpl_t *c); @@ -588,7 +591,7 @@ void dfct(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w) } -void dfst(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w) +void aubio_ooura_dfst(int n, smpl_t *a, smpl_t *t, int *ip, smpl_t *w) { void makewt(int nw, int *ip, smpl_t *w); void makect(int nc, int *ip, smpl_t *c); @@ -690,7 +693,7 @@ void makewt(int nw, int *ip, smpl_t *w) ip[1] = 1; if (nw > 2) { nwh = nw >> 1; - delta = atan(1.0) / nwh; + delta = ATAN(1.0) / nwh; w[0] = 1; w[1] = 0; w[nwh] = COS(delta * nwh); @@ -724,12 +727,12 @@ void makect(int nc, int *ip, smpl_t *c) ip[1] = nc; if (nc > 1) { nch = nc >> 1; - delta = atan(1.0) / nch; - c[0] = cos(delta * nch); - c[nch] = 0.5 * c[0]; + delta = ATAN(1.0) / nch; + c[0] = COS(delta * nch); + c[nch] = (smpl_t)0.5 * c[0]; for (j = 1; j < nch; j++) { - c[j] = 0.5 * cos(delta * j); - c[nc - j] = 0.5 * sin(delta * j); + c[j] = (smpl_t)0.5 * COS(delta * j); + c[nc - j] = (smpl_t)0.5 * SIN(delta * j); } } } @@ -1585,7 +1588,7 @@ void rftfsub(int n, smpl_t *a, int nc, smpl_t *c) for (j = 2; j < m; j += 2) { k = n - j; kk += ks; - wkr = 0.5 - c[nc - kk]; + wkr = (smpl_t)0.5 - c[nc - kk]; wki = c[kk]; xr = a[j] - a[k]; xi = a[j + 1] + a[k + 1]; @@ -1611,7 +1614,7 @@ void rftbsub(int n, smpl_t *a, int nc, smpl_t *c) for (j = 2; j < m; j += 2) { k = n - j; kk += ks; - wkr = 0.5 - c[nc - kk]; + wkr = (smpl_t)0.5 - c[nc - kk]; wki = c[kk]; xr = a[j] - a[k]; xi = a[j + 1] + a[k + 1]; diff --git a/src/spectral/phasevoc.c b/src/spectral/phasevoc.c index e48e91a..05ebdb0 100644 --- a/src/spectral/phasevoc.c +++ b/src/spectral/phasevoc.c @@ -88,7 
+88,7 @@ aubio_pvoc_t * new_aubio_pvoc (uint_t win_s, uint_t hop_s) { AUBIO_ERR("pvoc: got buffer_size %d, but can not be < 2\n", win_s); goto beach; } else if (win_s < hop_s) { - AUBIO_ERR("pvoc: hop size (%d) is larger than win size (%d)\n", win_s, hop_s); + AUBIO_ERR("pvoc: hop size (%d) is larger than win size (%d)\n", hop_s, win_s); goto beach; } @@ -143,6 +143,10 @@ beach: return NULL; } +uint_t aubio_pvoc_set_window(aubio_pvoc_t *pv, const char_t *window) { + return fvec_set_window(pv->w, (char_t*)window); +} + void del_aubio_pvoc(aubio_pvoc_t *pv) { del_fvec(pv->data); del_fvec(pv->synth); @@ -208,3 +212,13 @@ static void aubio_pvoc_addsynth(aubio_pvoc_t *pv, fvec_t *synth_new) for (i = 0; i < pv->end; i++) synthold[i] += synth[i + pv->hop_s] * pv->scale; } + +uint_t aubio_pvoc_get_win(aubio_pvoc_t* pv) +{ + return pv->win_s; +} + +uint_t aubio_pvoc_get_hop(aubio_pvoc_t* pv) +{ + return pv->hop_s; +} diff --git a/src/spectral/phasevoc.h b/src/spectral/phasevoc.h index d1e440d..e3caf2d 100644 --- a/src/spectral/phasevoc.h +++ b/src/spectral/phasevoc.h @@ -88,6 +88,7 @@ void aubio_pvoc_rdo(aubio_pvoc_t *pv, cvec_t * fftgrain, fvec_t *out); */ uint_t aubio_pvoc_get_win(aubio_pvoc_t* pv); + /** get hop size \param pv phase vocoder to get the hop size from @@ -95,6 +96,16 @@ uint_t aubio_pvoc_get_win(aubio_pvoc_t* pv); */ uint_t aubio_pvoc_get_hop(aubio_pvoc_t* pv); +/** set window type + + \param pv phase vocoder to set the window type + \param window_type a string representing a window + + \return 0 if successful, non-zero otherwise + + */ +uint_t aubio_pvoc_set_window(aubio_pvoc_t *pv, const char_t *window_type); + #ifdef __cplusplus } #endif diff --git a/src/spectral/specdesc.c b/src/spectral/specdesc.c index fb9b2f7..cc1665a 100644 --- a/src/spectral/specdesc.c +++ b/src/spectral/specdesc.c @@ -30,6 +30,7 @@ void aubio_specdesc_energy(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t void aubio_specdesc_hfc(aubio_specdesc_t *o, const cvec_t * fftgrain, 
fvec_t * onset); void aubio_specdesc_complex(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); void aubio_specdesc_phase(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); +void aubio_specdesc_wphase(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); void aubio_specdesc_specdiff(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); void aubio_specdesc_kl(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); void aubio_specdesc_mkl(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); @@ -57,6 +58,7 @@ typedef enum { aubio_onset_hfc, /**< high frequency content */ aubio_onset_complex, /**< complex domain */ aubio_onset_phase, /**< phase fast */ + aubio_onset_wphase, /**< weighted phase */ aubio_onset_kl, /**< Kullback Liebler */ aubio_onset_mkl, /**< modified Kullback Liebler */ aubio_onset_specflux, /**< spectral flux */ @@ -159,6 +161,23 @@ void aubio_specdesc_phase(aubio_specdesc_t *o, //onset->data[0] = fvec_mean(o->dev1); } +/* weighted phase */ +void +aubio_specdesc_wphase(aubio_specdesc_t *o, + const cvec_t *fftgrain, fvec_t *onset) { + uint_t i; + aubio_specdesc_phase(o, fftgrain, onset); + for (i = 0; i < fftgrain->length; i++) { + o->dev1->data[i] *= fftgrain->norm[i]; + } + /* apply o->histogram */ + aubio_hist_dyn_notnull(o->histog,o->dev1); + /* weight it */ + aubio_hist_weight(o->histog); + /* its mean is the result */ + onset->data[0] = aubio_hist_mean(o->histog); +} + /* Spectral difference method onset detection function */ void aubio_specdesc_specdiff(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset){ @@ -250,6 +269,8 @@ new_aubio_specdesc (const char_t * onset_mode, uint_t size){ onset_type = aubio_onset_complex; else if (strcmp (onset_mode, "phase") == 0) onset_type = aubio_onset_phase; + else if (strcmp (onset_mode, "wphase") == 0) + onset_type = aubio_onset_wphase; else if (strcmp (onset_mode, "mkl") == 0) onset_type = aubio_onset_mkl; else if (strcmp (onset_mode, "kl") 
== 0) @@ -270,15 +291,19 @@ new_aubio_specdesc (const char_t * onset_mode, uint_t size){ onset_type = aubio_specmethod_decrease; else if (strcmp (onset_mode, "rolloff") == 0) onset_type = aubio_specmethod_rolloff; + else if (strcmp (onset_mode, "old_default") == 0) + onset_type = aubio_onset_default; else if (strcmp (onset_mode, "default") == 0) onset_type = aubio_onset_default; else { - AUBIO_ERR("unknown spectral descriptor type %s, using default.\n", onset_mode); - onset_type = aubio_onset_default; + AUBIO_ERR("specdesc: unknown spectral descriptor type '%s'\n", + onset_mode); + AUBIO_FREE(o); + return NULL; } switch(onset_type) { /* for both energy and hfc, only fftgrain->norm is required */ - case aubio_onset_energy: + case aubio_onset_energy: break; case aubio_onset_hfc: break; @@ -290,6 +315,7 @@ new_aubio_specdesc (const char_t * onset_mode, uint_t size){ o->theta2 = new_fvec(rsize); break; case aubio_onset_phase: + case aubio_onset_wphase: o->dev1 = new_fvec(rsize); o->theta1 = new_fvec(rsize); o->theta2 = new_fvec(rsize); @@ -324,6 +350,9 @@ new_aubio_specdesc (const char_t * onset_mode, uint_t size){ case aubio_onset_phase: o->funcpointer = aubio_specdesc_phase; break; + case aubio_onset_wphase: + o->funcpointer = aubio_specdesc_wphase; + break; case aubio_onset_specdiff: o->funcpointer = aubio_specdesc_specdiff; break; @@ -366,7 +395,7 @@ new_aubio_specdesc (const char_t * onset_mode, uint_t size){ void del_aubio_specdesc (aubio_specdesc_t *o){ switch(o->onset_type) { - case aubio_onset_energy: + case aubio_onset_energy: break; case aubio_onset_hfc: break; @@ -377,6 +406,7 @@ void del_aubio_specdesc (aubio_specdesc_t *o){ del_fvec(o->theta2); break; case aubio_onset_phase: + case aubio_onset_wphase: del_fvec(o->dev1); del_fvec(o->theta1); del_fvec(o->theta2); diff --git a/src/spectral/specdesc.h b/src/spectral/specdesc.h index 2cdb87a..0f688c1 100644 --- a/src/spectral/specdesc.h +++ b/src/spectral/specdesc.h @@ -59,6 +59,13 @@ Conference on Acoustics 
Speech and Signal Processing, pages 441–444, Hong-Kong, 2003. + \b \p wphase : Weighted Phase Deviation onset detection function + + S. Dixon. Onset detection revisited. In Proceedings of the 9th International + Conference on Digital Audio Effects (DAFx), pages 133–137, 2006. + + http://www.eecs.qmul.ac.uk/~simond/pub/2006/dafx.pdf + + \b \p specdiff : Spectral difference method onset detection function + + Jonathan Foote and Shingo Uchihashi. The beat spectrum: a new approach to @@ -174,8 +181,11 @@ void aubio_specdesc_do (aubio_specdesc_t * o, const cvec_t * fftgrain, The parameter \p method is a string that can be any of: - - `energy`, `hfc`, `complex`, `phase`, `specdiff`, `kl`, `mkl`, `specflux` - - `centroid`, `spread`, `skewness`, `kurtosis`, `slope`, `decrease`, `rolloff` + - onset novelty functions: `complex`, `energy`, `hfc`, `kl`, `mkl`, + `phase`, `specdiff`, `specflux`, `wphase`, + + - spectral descriptors: `centroid`, `decrease`, `kurtosis`, `rolloff`, + `skewness`, `slope`, `spread`. 
*/ aubio_specdesc_t *new_aubio_specdesc (const char_t * method, uint_t buf_size); diff --git a/src/synth/sampler.c b/src/synth/sampler.c index 7df1b5f..050f008 100644 --- a/src/synth/sampler.c +++ b/src/synth/sampler.c @@ -19,7 +19,6 @@ */ -#include "config.h" #include "aubio_priv.h" #include "fvec.h" #include "fmat.h" diff --git a/src/synth/wavetable.c b/src/synth/wavetable.c index 830ffd7..43ece36 100644 --- a/src/synth/wavetable.c +++ b/src/synth/wavetable.c @@ -19,7 +19,6 @@ */ -#include "config.h" #include "aubio_priv.h" #include "fvec.h" #include "fmat.h" @@ -104,6 +103,7 @@ void aubio_wavetable_do ( aubio_wavetable_t * s, const fvec_t * input, fvec_t * for (i = 0; i < output->length; i++) { output->data[i] += input->data[i]; } + fvec_clamp(output, 1.); } } @@ -164,7 +164,14 @@ uint_t aubio_wavetable_stop ( aubio_wavetable_t * s ) //aubio_wavetable_set_freq (s, 0.); aubio_wavetable_set_amp (s, 0.); //s->last_pos = 0; - return aubio_wavetable_set_playing (s, 1); + return aubio_wavetable_set_playing (s, 0); +} + +uint_t +aubio_wavetable_load ( aubio_wavetable_t *s UNUSED, const char_t *uri UNUSED) +{ + AUBIO_ERR("wavetable: load method not implemented yet, see sampler\n"); + return AUBIO_FAIL; } uint_t aubio_wavetable_set_freq ( aubio_wavetable_t * s, smpl_t freq ) diff --git a/src/synth/wavetable.h b/src/synth/wavetable.h index b333575..7607fbe 100644 --- a/src/synth/wavetable.h +++ b/src/synth/wavetable.h @@ -53,6 +53,8 @@ aubio_wavetable_t * new_aubio_wavetable(uint_t samplerate, uint_t hop_size); /** load source in wavetable + TODO: This function is not implemented yet. See new_aubio_sampler() instead. + \param o wavetable, created by new_aubio_wavetable() \param uri the uri of the source to load diff --git a/src/tempo/beattracking.h b/src/tempo/beattracking.h index bc57de8..bc95a73 100644 --- a/src/tempo/beattracking.h +++ b/src/tempo/beattracking.h @@ -31,7 +31,7 @@ Matthew E. P. Davies, Paul Brossier, and Mark D. Plumbley. 
Beat tracking towards automatic musical accompaniment. In Proceedings of the Audio - Engeeniring Society 118th Convention, Barcelona, Spain, May 2005. + Engineering Society 118th Convention, Barcelona, Spain, May 2005. \example tempo/test-beattracking.c diff --git a/src/tempo/tempo.c b/src/tempo/tempo.c index 80c89e9..5698b2b 100644 --- a/src/tempo/tempo.c +++ b/src/tempo/tempo.c @@ -128,8 +128,7 @@ uint_t aubio_tempo_set_delay_s(aubio_tempo_t * o, smpl_t delay) { } uint_t aubio_tempo_set_delay_ms(aubio_tempo_t * o, smpl_t delay) { - o->delay = 1000. * delay * o->samplerate; - return AUBIO_OK; + return aubio_tempo_set_delay_s(o, delay / 1000.); } uint_t aubio_tempo_get_delay(aubio_tempo_t * o) { @@ -141,7 +140,7 @@ smpl_t aubio_tempo_get_delay_s(aubio_tempo_t * o) { } smpl_t aubio_tempo_get_delay_ms(aubio_tempo_t * o) { - return o->delay / (smpl_t)(o->samplerate) / 1000.; + return aubio_tempo_get_delay_s(o) * 1000.; } uint_t aubio_tempo_set_silence(aubio_tempo_t * o, smpl_t silence) { @@ -168,7 +167,7 @@ aubio_tempo_t * new_aubio_tempo (const char_t * tempo_mode, uint_t buf_size, uint_t hop_size, uint_t samplerate) { aubio_tempo_t * o = AUBIO_NEW(aubio_tempo_t); - char_t specdesc_func[20]; + char_t specdesc_func[PATH_MAX]; o->samplerate = samplerate; // check parameters are valid if ((sint_t)hop_size < 1) { @@ -203,9 +202,10 @@ aubio_tempo_t * new_aubio_tempo (const char_t * tempo_mode, o->pp = new_aubio_peakpicker(); aubio_peakpicker_set_threshold (o->pp, o->threshold); if ( strcmp(tempo_mode, "default") == 0 ) { - strcpy(specdesc_func, "specflux"); + strncpy(specdesc_func, "specflux", PATH_MAX - 1); } else { - strcpy(specdesc_func, tempo_mode); + strncpy(specdesc_func, tempo_mode, PATH_MAX - 1); + specdesc_func[PATH_MAX - 1] = '\0'; } o->od = new_aubio_specdesc(specdesc_func,buf_size); o->of = new_fvec(1); @@ -215,12 +215,17 @@ aubio_tempo_t * new_aubio_tempo (const char_t * tempo_mode, o2 = new_aubio_specdesc(type_onset2,buffer_size); onset2 = new_fvec(1); }*/ + 
if (!o->dfframe || !o->fftgrain || !o->out || !o->pv || + !o->pp || !o->od || !o->of || !o->bt || !o->onset) { + AUBIO_ERR("tempo: failed creating tempo object\n"); + goto beach; + } o->last_tatum = 0; o->tatum_signature = 4; return o; beach: - AUBIO_FREE(o); + del_aubio_tempo(o); return NULL; } @@ -277,15 +282,23 @@ uint_t aubio_tempo_set_tatum_signature (aubio_tempo_t *o, uint_t signature) { void del_aubio_tempo (aubio_tempo_t *o) { - del_aubio_specdesc(o->od); - del_aubio_beattracking(o->bt); - del_aubio_peakpicker(o->pp); - del_aubio_pvoc(o->pv); - del_fvec(o->out); - del_fvec(o->of); - del_cvec(o->fftgrain); - del_fvec(o->dfframe); - del_fvec(o->onset); + if (o->od) + del_aubio_specdesc(o->od); + if (o->bt) + del_aubio_beattracking(o->bt); + if (o->pp) + del_aubio_peakpicker(o->pp); + if (o->pv) + del_aubio_pvoc(o->pv); + if (o->out) + del_fvec(o->out); + if (o->of) + del_fvec(o->of); + if (o->fftgrain) + del_cvec(o->fftgrain); + if (o->dfframe) + del_fvec(o->dfframe); + if (o->onset) + del_fvec(o->onset); AUBIO_FREE(o); - return; } diff --git a/src/tempo/tempo.h b/src/tempo/tempo.h index 13637f9..e2afe99 100644 --- a/src/tempo/tempo.h +++ b/src/tempo/tempo.h @@ -154,8 +154,8 @@ smpl_t aubio_tempo_get_bpm(aubio_tempo_t * o); \param o beat tracking object - \return confidence with which the tempo has been observed, `0` if no - consistent value is found. + \return confidence with which the tempo has been observed, the higher the + more confidence, `0` if no consistent value is found. 
*/ smpl_t aubio_tempo_get_confidence(aubio_tempo_t * o); diff --git a/src/temporal/biquad.c b/src/temporal/biquad.c index 6a03aa6..426b64f 100644 --- a/src/temporal/biquad.c +++ b/src/temporal/biquad.c @@ -41,7 +41,7 @@ aubio_filter_set_biquad (aubio_filter_t * f, lsmp_t b0, lsmp_t b1, lsmp_t b2, bs->data[2] = b2; as->data[0] = 1.; as->data[1] = a1; - as->data[1] = a2; + as->data[2] = a2; return AUBIO_OK; } diff --git a/src/temporal/resampler.c b/src/temporal/resampler.c index 2c9d2fd..adfb08f 100644 --- a/src/temporal/resampler.c +++ b/src/temporal/resampler.c @@ -18,14 +18,16 @@ */ -#include "config.h" - #include "aubio_priv.h" #include "fvec.h" #include "temporal/resampler.h" #ifdef HAVE_SAMPLERATE +#if HAVE_AUBIO_DOUBLE +#error "Should not use libsamplerate with aubio in double precision" +#endif + #include <samplerate.h> /* from libsamplerate */ struct _aubio_resampler_t diff --git a/src/utils/hist.c b/src/utils/hist.c index 9b5ab10..2dcc443 100644 --- a/src/utils/hist.c +++ b/src/utils/hist.c @@ -43,6 +43,10 @@ aubio_hist_t * new_aubio_hist (smpl_t flow, smpl_t fhig, uint_t nelems){ smpl_t step = (fhig-flow)/(smpl_t)(nelems); smpl_t accum = step; uint_t i; + if ((sint_t)nelems <= 0) { + AUBIO_FREE(s); + return NULL; + } s->nelems = nelems; s->hist = new_fvec(nelems); s->cent = new_fvec(nelems); diff --git a/src/utils/log.c b/src/utils/log.c new file mode 100644 index 0000000..967c2d6 --- /dev/null +++ b/src/utils/log.c @@ -0,0 +1,92 @@ +/* + Copyright (C) 2016 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +#include "aubio_priv.h" +#include "log.h" + +/** array of pointers to logging functions, one per level */ +static aubio_log_function_t aubio_log_function[AUBIO_LOG_LAST_LEVEL]; +/** array of pointers to closure passed to logging functions, one per level */ +static void* aubio_log_user_data[AUBIO_LOG_LAST_LEVEL]; +/** buffer for logging messages */ +static char aubio_log_buffer[512]; + +/** private function used by default by logging functions */ +void +aubio_default_log(sint_t level, const char_t *message, void * data UNUSED) +{ + FILE *out; + out = stdout; + if (level == AUBIO_LOG_ERR || level == AUBIO_LOG_DBG || level == AUBIO_LOG_WRN) { + out = stderr; + } + fprintf(out, "%s", message); + //fflush(out); +} + +uint_t +aubio_log(sint_t level, const char_t *fmt, ...) +{ + aubio_log_function_t fun = NULL; + + va_list args; + va_start(args, fmt); + vsnprintf(aubio_log_buffer, sizeof(aubio_log_buffer), fmt, args); + va_end(args); + + if ((level >= 0) && (level < AUBIO_LOG_LAST_LEVEL)) { + fun = aubio_log_function[level]; + if (fun != NULL) { + (*fun)(level, aubio_log_buffer, aubio_log_user_data[level]); + } else { + aubio_default_log(level, aubio_log_buffer, NULL); + } + } + return AUBIO_FAIL; +} + +void +aubio_log_reset(void) +{ + uint_t i = 0; + for (i = 0; i < AUBIO_LOG_LAST_LEVEL; i++) { + aubio_log_set_level_function(i, aubio_default_log, NULL); + } +} + +aubio_log_function_t +aubio_log_set_level_function(sint_t level, aubio_log_function_t fun, void * data) +{ + aubio_log_function_t old = NULL; + if ((level >= 0) && (level < AUBIO_LOG_LAST_LEVEL)) { + old = aubio_log_function[level]; + aubio_log_function[level] = fun; + aubio_log_user_data[level] = data; + } + return old; +} + +void +aubio_log_set_function(aubio_log_function_t fun, void * data) { + uint_t i = 0; + for 
(i = 0; i < AUBIO_LOG_LAST_LEVEL; i++) { + aubio_log_set_level_function(i, fun, data); + } +} diff --git a/src/utils/log.h b/src/utils/log.h new file mode 100644 index 0000000..091e91d --- /dev/null +++ b/src/utils/log.h @@ -0,0 +1,99 @@ +/* + Copyright (C) 2016 Paul Brossier <piem@aubio.org> + + This file is part of aubio. + + aubio is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + aubio is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with aubio. If not, see <http://www.gnu.org/licenses/>. + +*/ + +#ifndef AUBIO_LOG_H +#define AUBIO_LOG_H + +#ifdef __cplusplus +extern "C" { +#endif + +/** \file + + Logging features + + This file specifies ::aubio_log_set_function and + ::aubio_log_set_level_function, which let you define one or several custom + logging functions to redirect warnings and errors from aubio to your + application. The custom function should have the prototype defined in + ::aubio_log_function_t. + + After a call to ::aubio_log_set_level_function, ::aubio_log_reset can be used + to reset each logging function to the default one. 
+ + \example utils/test-log.c + +*/ + +/** list of logging levels */ +enum aubio_log_level { + AUBIO_LOG_ERR, /**< critical errors */ + AUBIO_LOG_INF, /**< infos */ + AUBIO_LOG_MSG, /**< general messages */ + AUBIO_LOG_DBG, /**< debug messages */ + AUBIO_LOG_WRN, /**< warnings */ + AUBIO_LOG_LAST_LEVEL, /**< number of valid levels */ +}; + +/** Logging function prototype, to be passed to ::aubio_log_set_function + + \param level log level + \param message text to log + \param data optional closure used by the callback + + See @ref utils/test-log.c for an example of logging function. + + */ +typedef void (*aubio_log_function_t)(sint_t level, const char_t *message, void + *data); + +/** Set logging function for all levels + + \param fun the function to be used to log, of type ::aubio_log_function_t + \param data optional closure to be passed to the function (can be NULL if + nothing to pass) + + */ +void aubio_log_set_function(aubio_log_function_t fun, void* data); + +/** Set logging function for a given level + + \param level the level for which to set the logging function + \param fun the function to be used to log, of type ::aubio_log_function_t + \param data optional closure to be passed to the function (can be NULL if + nothing to pass) + +*/ +aubio_log_function_t aubio_log_set_level_function(sint_t level, + aubio_log_function_t fun, void* data); + +/** Reset all logging functions to the default one + + After calling this function, the default logging function will be used to + print error, warning, normal, and debug messages to `stdout` or `stderr`. 
+ + */ +void aubio_log_reset(void); + +#ifdef __cplusplus +} +#endif + +#endif /* AUBIO_LOG_H */ diff --git a/src/utils/parameter.c b/src/utils/parameter.c index f8a8b45..00b3979 100644 --- a/src/utils/parameter.c +++ b/src/utils/parameter.c @@ -18,7 +18,6 @@ */ -#include "config.h" #include "aubio_priv.h" #include "parameter.h" diff --git a/src/utils/windll.c b/src/utils/windll.c index 7c11af6..cad0443 100644 --- a/src/utils/windll.c +++ b/src/utils/windll.c @@ -24,7 +24,7 @@ */ -#include "config.h" +#include "aubio_priv.h" #ifdef HAVE_WIN_HACKS @@ -41,9 +41,9 @@ #include "aubio.h" -BOOL APIENTRY DllMain( HMODULE hModule, +BOOL APIENTRY DllMain( HMODULE hModule UNUSED, DWORD ul_reason_for_call, - LPVOID lpReserved ) + LPVOID lpReserved UNUSED) { switch (ul_reason_for_call) { diff --git a/src/vecutils.c b/src/vecutils.c index 8e49f96..8907b5e 100644 --- a/src/vecutils.c +++ b/src/vecutils.c @@ -1,4 +1,3 @@ -#include "config.h" #include "aubio_priv.h" #include "types.h" #include "fvec.h" diff --git a/src/wscript_build b/src/wscript_build index c55d5f2..895c191 100644 --- a/src/wscript_build +++ b/src/wscript_build @@ -3,10 +3,12 @@ uselib = [] uselib += ['M'] uselib += ['FFTW3', 'FFTW3F'] +uselib += ['INTEL_IPP'] uselib += ['SAMPLERATE'] uselib += ['SNDFILE'] uselib += ['AVCODEC'] uselib += ['AVFORMAT'] +uselib += ['SWRESAMPLE'] uselib += ['AVRESAMPLE'] uselib += ['AVUTIL'] uselib += ['BLAS'] @@ -23,24 +25,28 @@ ctx(features = 'c', if ctx.env['DEST_OS'] in ['ios', 'iosimulator']: build_features = ['cstlib', 'cshlib'] elif ctx.env['DEST_OS'] in ['win32', 'win64']: - build_features = ['cstlib', 'cshlib'] + build_features = ['cstlib', 'cshlib gensyms'] elif ctx.env['DEST_OS'] in ['emscripten']: + build_features = ['cstlib','cshlib'] +elif '--static' in ctx.env['LDFLAGS'] or '--static' in ctx.env['LINKFLAGS']: + # static in cflags, ... build_features = ['cstlib'] -else: #linux, darwin, android, mingw, ... +else: + # linux, darwin, android, mingw, ... 
build_features = ['cstlib', 'cshlib'] # also install static lib from waflib.Tools.c import cstlib -from waflib.Tools.fc import fcstlib -fcstlib.inst_to = cstlib.inst_to = '${LIBDIR}' +cstlib.inst_to = '${LIBDIR}' for target in build_features: ctx(features = 'c ' + target, use = uselib + ['lib_objects'], target = 'aubio', + export_symbols_regex=r'(?:.*aubio|fvec|lvec|cvec|fmat|new|del)_.*', vnum = ctx.env['LIB_VERSION']) # install headers, except _priv.h ones -ctx.install_files('${PREFIX}/include/aubio/', +ctx.install_files('${INCLUDEDIR}/aubio/', ctx.path.ant_glob('**/*.h', excl = ['**_priv.h', 'config.h']), relative_trick=True) diff --git a/tests/create_tests_source.py b/tests/create_tests_source.py new file mode 100755 index 0000000..1feb144 --- /dev/null +++ b/tests/create_tests_source.py @@ -0,0 +1,42 @@ +#! /usr/bin/env python + +""" Create a simple stereo file containing a sine tone at 441 Hz, using only +python's built-in modules. """ + +import wave +import math +import struct + + +def create_sine_wave(freq, samplerate, nframes, nchannels): + """ create a pure tone (without numpy) """ + _x = [0.7 * math.sin(2. 
* math.pi * freq * t / float(samplerate)) + for t in range(nframes)] + _x = [int(a * 32767) for a in _x] + _x = b''.join([b''.join([struct.pack('h', v) + for _ in range(nchannels)]) + for v in _x]) + return _x + + +def create_test_sound(pathname, freq=441, duration=None, + framerate=44100, nchannels=2): + """ create a sound file at pathname, overwriting existing file """ + sampwidth = 2 + nframes = duration or framerate # defaults to 1 second duration + fid = wave.open(pathname, 'w') + fid.setnchannels(nchannels) + fid.setsampwidth(sampwidth) + fid.setframerate(framerate) + fid.setnframes(nframes) + frames = create_sine_wave(freq, framerate, nframes, nchannels) + fid.writeframes(frames) + fid.close() + return 0 + + +if __name__ == '__main__': + import sys + if len(sys.argv) < 2: + sys.exit(2) + sys.exit(create_test_sound(sys.argv[1])) diff --git a/tests/src/io/base-sink_custom.h b/tests/src/io/base-sink_custom.h new file mode 100644 index 0000000..3104366 --- /dev/null +++ b/tests/src/io/base-sink_custom.h @@ -0,0 +1,170 @@ +// this should be included *after* custom functions have been defined + +#ifndef aubio_sink_custom +#define aubio_sink_custom "undefined" +#endif /* aubio_sink_custom */ + +#ifdef HAVE_AUBIO_SINK_CUSTOM +int test_wrong_params(void); + +int base_main(int argc, char **argv) +{ + uint_t err = 0; + if (argc < 3 || argc >= 6) { + PRINT_ERR("wrong number of arguments, running tests\n"); + err = test_wrong_params(); + PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", + argv[0]); + return err; + } + + uint_t samplerate = 0; + uint_t hop_size = 512; + uint_t n_frames = 0, read = 0; + + char_t *source_path = argv[1]; + char_t *sink_path = argv[2]; + + aubio_source_t *src = NULL; + aubio_sink_custom_t *snk = NULL; + + if ( argc >= 4 ) samplerate = atoi(argv[3]); + if ( argc >= 5 ) hop_size = atoi(argv[4]); + + fvec_t *vec = new_fvec(hop_size); + if (!vec) { err = 1; goto failure; } + + src = new_aubio_source(source_path, 
samplerate, hop_size);
+  if (!src) { err = 1; goto failure; }
+  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(src);
+
+  snk = new_aubio_sink_custom(sink_path, samplerate);
+  if (!snk) { err = 1; goto failure; }
+
+  do {
+    aubio_source_do(src, vec, &read);
+    aubio_sink_custom_do(snk, vec, read);
+    n_frames += read;
+  } while ( read == hop_size );
+
+  PRINT_MSG("%d frames at %dHz (%d blocks) read from %s, wrote to %s\n",
+      n_frames, samplerate, n_frames / hop_size,
+      source_path, sink_path);
+
+  // close sink now (optional)
+  aubio_sink_custom_close(snk);
+
+failure:
+  if (snk)
+    del_aubio_sink_custom(snk);
+  if (src)
+    del_aubio_source(src);
+  if (vec)
+    del_fvec(vec);
+
+  return err;
+}
+
+int test_wrong_params(void)
+{
+  fvec_t *vec;
+  fmat_t *mat;
+  aubio_sink_custom_t *s;
+  char_t sink_path[PATH_MAX] = "tmp_aubio_XXXXXX";
+  uint_t samplerate = 44100;
+  uint_t hop_size = 256;
+  uint_t oversized_hop_size = 4097;
+  uint_t oversized_samplerate = 192000 * 8 + 1;
+  uint_t channels = 3;
+  uint_t oversized_channels = 1025;
+  // create temp file
+  int fd = create_temp_sink(sink_path);
+
+  if (!fd) return 1;
+
+  if (new_aubio_sink_custom( 0, samplerate)) return 1;
+  if (new_aubio_sink_custom("\0", samplerate)) return 1;
+  if (new_aubio_sink_custom(sink_path, -1)) return 1;
+
+  s = new_aubio_sink_custom(sink_path, 0);
+
+  // check setting wrong parameters fails
+  if (!aubio_sink_custom_preset_samplerate(s, oversized_samplerate)) return 1;
+  if (!aubio_sink_custom_preset_channels(s, oversized_channels)) return 1;
+  if (!aubio_sink_custom_preset_channels(s, -1)) return 1;
+
+  // check setting valid parameters passes
+  if (aubio_sink_custom_preset_samplerate(s, samplerate)) return 1;
+  if (aubio_sink_custom_preset_channels(s, 1)) return 1;
+
+  // check writing a vector with valid length
+  vec = new_fvec(hop_size);
+  aubio_sink_custom_do(s, vec, hop_size);
+  // check writing more than in the input
+  aubio_sink_custom_do(s, vec, hop_size+1);
+  // check writing 0 frames
+  aubio_sink_custom_do(s, vec, 0);
+  del_fvec(vec);
+
+  // check writing an oversized vector
+  vec = new_fvec(oversized_hop_size);
+  aubio_sink_custom_do(s, vec, oversized_hop_size);
+  del_fvec(vec);
+
+  // test delete without closing
+  del_aubio_sink_custom(s);
+
+  s = new_aubio_sink_custom(sink_path, 0);
+
+  // preset channels first
+  if (aubio_sink_custom_preset_channels(s, channels)) return 1;
+  if (aubio_sink_custom_preset_samplerate(s, samplerate)) return 1;
+
+  if (aubio_sink_custom_get_samplerate(s) != samplerate) return 1;
+  if (aubio_sink_custom_get_channels(s) != channels) return 1;
+
+  mat = new_fmat(channels, hop_size);
+  // check writing a matrix with valid length
+  aubio_sink_custom_do_multi(s, mat, hop_size);
+  // check writing 0 frames
+  aubio_sink_custom_do_multi(s, mat, 0);
+  // check writing more than in the input
+  aubio_sink_custom_do_multi(s, mat, hop_size+1);
+  del_fmat(mat);
+
+  // check writing oversized input
+  mat = new_fmat(channels, oversized_hop_size);
+  aubio_sink_custom_do_multi(s, mat, oversized_hop_size);
+  del_fmat(mat);
+
+  // check writing undersized input
+  mat = new_fmat(channels - 1, hop_size);
+  aubio_sink_custom_do_multi(s, mat, hop_size);
+  del_fmat(mat);
+
+  aubio_sink_custom_close(s);
+  // test closing twice
+  aubio_sink_custom_close(s);
+
+  del_aubio_sink_custom(s);
+
+  // delete temp file
+  close_temp_sink(sink_path, fd);
+
+  // shouldn't crash on null (bypassed, only check del_aubio_sink)
+  // del_aubio_sink_custom(NULL);
+
+  return run_on_default_source_and_sink(base_main);
+}
+
+#else /* HAVE_AUBIO_SINK_CUSTOM */
+
+int base_main(int argc, char** argv)
+{
+  PRINT_ERR("aubio was not compiled with aubio_sink_"
+    aubio_sink_custom ", failed running %s with %d args\n",
+    argv[0], argc);
+  return 0;
+}
+
+#endif /* HAVE_AUBIO_SINK_CUSTOM */
diff --git a/tests/src/io/base-source_custom.h b/tests/src/io/base-source_custom.h
new file mode 100644
index 0000000..95540cf
--- /dev/null
+++ b/tests/src/io/base-source_custom.h
@@ -0,0 +1,172 @@
+// this should be included *after* custom functions have been defined
+
+#ifndef aubio_source_custom
+#define aubio_source_custom "undefined"
+#endif /* aubio_source_custom */
+
+#ifdef HAVE_AUBIO_SOURCE_CUSTOM
+int test_wrong_params(void);
+
+int base_main(int argc, char **argv)
+{
+  uint_t err = 0;
+  if (argc < 2) {
+    PRINT_ERR("not enough arguments, running tests\n");
+    err = test_wrong_params();
+    PRINT_MSG("read a wave file as a mono vector\n");
+    PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]);
+    PRINT_MSG("examples:\n");
+    PRINT_MSG(" - read file.wav at original samplerate\n");
+    PRINT_MSG("   %s file.wav\n", argv[0]);
+    PRINT_MSG(" - read file.aif at 32000Hz\n");
+    PRINT_MSG("   %s file.aif 32000\n", argv[0]);
+    PRINT_MSG(" - read file.wav at original samplerate with 4096 blocks\n");
+    PRINT_MSG("   %s file.wav 0 4096 \n", argv[0]);
+    return err;
+  }
+
+  uint_t samplerate = 0;
+  uint_t hop_size = 256;
+  uint_t n_frames = 0, read = 0;
+  if ( argc >= 3 ) samplerate = atoi(argv[2]);
+  if ( argc >= 4 ) hop_size = atoi(argv[3]);
+
+  char_t *source_path = argv[1];
+
+  aubio_source_custom_t * s =
+    new_aubio_source_custom(source_path, samplerate, hop_size);
+  fvec_t *vec = new_fvec(hop_size);
+  if (!s || !vec) { err = 1; goto beach; }
+
+  uint_t n_frames_expected = aubio_source_custom_get_duration(s);
+
+  samplerate = aubio_source_custom_get_samplerate(s);
+
+  do {
+    aubio_source_custom_do(s, vec, &read);
+    fvec_print (vec);
+    n_frames += read;
+  } while ( read == hop_size );
+
+  PRINT_MSG("read %d frames (expected %d) at %dHz (%d blocks) from %s\n",
+      n_frames, n_frames_expected, samplerate, n_frames / hop_size,
+      source_path);
+
+  // close the file (optional)
+  aubio_source_custom_close(s);
+
+beach:
+  if (vec)
+    del_fvec(vec);
+  if (s)
+    del_aubio_source_custom(s);
+  return err;
+}
+
+int test_wrong_params(void)
+{
+  char_t *uri = DEFINEDSTRING(AUBIO_TESTS_SOURCE);
+  uint_t samplerate = 44100;
+  uint_t hop_size = 512;
+  uint_t channels, read = 0;
+  fvec_t *vec;
+  fmat_t *mat;
+  aubio_source_custom_t *s;
+
+  if (new_aubio_source_custom(0, samplerate, hop_size)) return 1;
+  if (new_aubio_source_custom("\0", samplerate, hop_size)) return 1;
+  if (new_aubio_source_custom(uri, -1, hop_size)) return 1;
+  if (new_aubio_source_custom(uri, 0, 0)) return 1;
+
+  s = new_aubio_source_custom(uri, samplerate, hop_size);
+  if (!s) return 1;
+  channels = aubio_source_custom_get_channels(s);
+
+  // vector to read downmixed samples
+  vec = new_fvec(hop_size);
+  // matrix to read individual channels
+  mat = new_fmat(channels, hop_size);
+
+  if (aubio_source_custom_get_samplerate(s) != samplerate) return 1;
+
+  // read first hop_size frames
+  aubio_source_custom_do(s, vec, &read);
+  if (read != hop_size) return 1;
+
+  // read again in undersized vector
+  del_fvec(vec);
+  vec = new_fvec(hop_size - 1);
+  aubio_source_custom_do(s, vec, &read);
+  if (read != hop_size - 1) return 1;
+
+  // read again in oversized vector
+  del_fvec(vec);
+  vec = new_fvec(hop_size + 1);
+  aubio_source_custom_do(s, vec, &read);
+  if (read != hop_size) return 1;
+
+  // seek to 0
+  if(aubio_source_custom_seek(s, 0)) return 1;
+
+  // read again as multiple channels
+  aubio_source_custom_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // read again as multiple channels in a matrix with too few channels
+  del_fmat(mat);
+  mat = new_fmat(channels - 1, hop_size);
+  aubio_source_custom_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // read again as multiple channels in a matrix with undersized length
+  del_fmat(mat);
+  mat = new_fmat(channels, hop_size - 1);
+  aubio_source_custom_do_multi(s, mat, &read);
+  if (read != hop_size - 1) return 1;
+
+  // read again as multiple channels in a matrix with too many channels
+  del_fmat(mat);
+  mat = new_fmat(channels + 1, hop_size);
+  aubio_source_custom_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // read again as multiple channels in a matrix with oversized length
+  del_fmat(mat);
+  mat = new_fmat(channels, hop_size + 1);
+  aubio_source_custom_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // close the file (optional)
+  aubio_source_custom_close(s);
+  // test closing the file a second time
+  aubio_source_custom_close(s);
+
+  // reading after close fails
+  del_fvec(vec);
+  vec = new_fvec(hop_size);
+  aubio_source_custom_do(s, vec, &read);
+  del_fmat(mat);
+  mat = new_fmat(channels, hop_size);
+  aubio_source_custom_do_multi(s, mat, &read);
+
+  del_aubio_source_custom(s);
+  del_fmat(mat);
+  del_fvec(vec);
+
+  // shouldn't crash on null (bypassed, only check del_aubio_source)
+  // del_aubio_source_custom(NULL);
+
+  return run_on_default_source(base_main);
+}
+
+#else /* HAVE_AUBIO_SOURCE_CUSTOM */
+
+int base_main(int argc, char** argv)
+{
+  PRINT_ERR("aubio was not compiled with aubio_source_"
+    aubio_source_custom ", failed running %s with %d args\n",
+    argv[0], argc);
+  return 0;
+}
+
+#endif /* HAVE_AUBIO_SOURCE_CUSTOM */
diff --git a/tests/src/io/test-sink-multi.c b/tests/src/io/test-sink-multi.c
deleted file mode 100644
index 3f44787..0000000
--- a/tests/src/io/test-sink-multi.c
+++ /dev/null
@@ -1,73 +0,0 @@
-#define AUBIO_UNSTABLE 1
-#include <aubio.h>
-#include "utils_tests.h"
-
-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [channels] [hop_size]\n", argv[0]);
-    return err;
-  }
-
-  uint_t samplerate = 0;
-  uint_t channels = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) channels = atoi(argv[4]);
-  if ( argc >= 6 ) hop_size = atoi(argv[5]);
-  if ( argc >= 7 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-  if (channels == 0 ) channels = aubio_source_get_channels(i);
-
-  fmat_t *mat = new_fmat(channels, hop_size);
-  if (!mat) { err = 1; goto beach_fmat; }
-
-  aubio_sink_t *o = new_aubio_sink(sink_path, 0);
-  if (!o) { err = 1; goto beach_sink; }
-  err = aubio_sink_preset_samplerate(o, samplerate);
-  if (err) { goto beach; }
-  err = aubio_sink_preset_channels(o, channels);
-  if (err) { goto beach; }
-
-  do {
-    aubio_source_do_multi(i, mat, &read);
-    aubio_sink_do_multi(o, mat, read);
-    n_frames += read;
-  } while ( read == hop_size );
-
-  PRINT_MSG("read %d frames at %dHz in %d channels (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, channels, n_frames / hop_size,
-      source_path, sink_path);
-  PRINT_MSG("wrote %s with %dHz in %d channels\n", sink_path,
-      aubio_sink_get_samplerate(o),
-      aubio_sink_get_channels(o) );
-
-beach:
-  del_aubio_sink(o);
-beach_sink:
-  del_fmat(mat);
-beach_fmat:
-  del_aubio_source(i);
-beach_source:
-  return err;
-}
diff --git a/tests/src/io/test-sink.c b/tests/src/io/test-sink.c
index 2f04177..35a2060 100644
--- a/tests/src/io/test-sink.c
+++ b/tests/src/io/test-sink.c
@@ -1,14 +1,16 @@
 #include <aubio.h>
 #include "utils_tests.h"

-int main (int argc, char **argv)
-{
-  sint_t err = 0;
+int test_wrong_params(void);

-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", argv[0]);
+int main(int argc, char **argv)
+{
+  uint_t err = 0;
+  if (argc < 3 || argc >= 6) {
+    PRINT_ERR("wrong number of arguments, running tests\n");
+    err = test_wrong_params();
+    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n",
+        argv[0]);
     return err;
   }

@@ -19,40 +21,134 @@ int main (int argc, char **argv)
   char_t *source_path = argv[1];
   char_t *sink_path = argv[2];

+  aubio_source_t *src = NULL;
+  aubio_sink_t *snk = NULL;
+
   if ( argc >= 4 ) samplerate = atoi(argv[3]);
   if ( argc >= 5 ) hop_size = atoi(argv[4]);
-  if ( argc >= 6 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }

   fvec_t *vec = new_fvec(hop_size);
-  if (!vec) { err = 1; goto beach_fvec; }
+  if (!vec) { err = 1; goto failure; }

-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
+  src = new_aubio_source(source_path, samplerate, hop_size);
+  if (!src) { err = 1; goto failure; }
+  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(src);

-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-
-  aubio_sink_t *o = new_aubio_sink(sink_path, samplerate);
-  if (!o) { err = 1; goto beach_sink; }
+  snk = new_aubio_sink(sink_path, samplerate);
+  if (!snk) { err = 1; goto failure; }

   do {
-    aubio_source_do(i, vec, &read);
-    aubio_sink_do(o, vec, read);
+    aubio_source_do(src, vec, &read);
+    aubio_sink_do(snk, vec, read);
     n_frames += read;
   } while ( read == hop_size );

-  PRINT_MSG("read %d frames at %dHz (%d blocks) from %s written to %s\n",
+  PRINT_MSG("%d frames read at %dHz (%d blocks) from %s and written to %s\n",
       n_frames, samplerate, n_frames / hop_size,
       source_path, sink_path);

-  del_aubio_sink(o);
-beach_sink:
-  del_aubio_source(i);
-beach_source:
-  del_fvec(vec);
-beach_fvec:
+  // close sink now (optional)
+  aubio_sink_close(snk);
+
+failure:
+  if (snk)
+    del_aubio_sink(snk);
+  if (src)
+    del_aubio_source(src);
+  if (vec)
+    del_fvec(vec);
+
   return err;
 }
+
+int test_wrong_params(void)
+{
+  fvec_t *vec;
+  fmat_t *mat;
+  aubio_sink_t *s;
+  char_t sink_path[PATH_MAX] = "tmp_aubio_XXXXXX";
+  uint_t samplerate = 44100;
+  uint_t hop_size = 256;
+  uint_t oversized_hop_size = 4097;
+  uint_t oversized_samplerate = 192000 * 8 + 1;
+  uint_t channels = 3;
+  uint_t oversized_channels = 1025;
+  // create temp file
+  int fd = create_temp_sink(sink_path);
+
+  if (!fd) return 1;
+
+  if (new_aubio_sink( 0, samplerate)) return 1;
+  if (new_aubio_sink("\0", samplerate)) return 1;
+  if (new_aubio_sink(sink_path, -1)) return 1;
+
+  s = new_aubio_sink(sink_path, 0);
+
+  // check setting wrong parameters fails
+  if (!aubio_sink_preset_samplerate(s, oversized_samplerate)) return 1;
+  if (!aubio_sink_preset_channels(s, oversized_channels)) return 1;
+  if (!aubio_sink_preset_channels(s, -1)) return 1;
+
+  // check setting valid parameters passes
+  if (aubio_sink_preset_samplerate(s, samplerate)) return 1;
+  if (aubio_sink_preset_channels(s, 1)) return 1;
+
+  // check writing a vector with valid length
+  vec = new_fvec(hop_size);
+  aubio_sink_do(s, vec, hop_size);
+  // check writing more than in the input
+  aubio_sink_do(s, vec, hop_size+1);
+  // check writing 0 frames
+  aubio_sink_do(s, vec, 0);
+  del_fvec(vec);
+
+  // check writing an oversized vector
+  vec = new_fvec(oversized_hop_size);
+  aubio_sink_do(s, vec, oversized_hop_size);
+  del_fvec(vec);
+
+  // test delete without closing
+  del_aubio_sink(s);
+
+  s = new_aubio_sink(sink_path, 0);
+
+  // preset channels first
+  if (aubio_sink_preset_channels(s, channels)) return 1;
+  if (aubio_sink_preset_samplerate(s, samplerate)) return 1;
+
+  if (aubio_sink_get_samplerate(s) != samplerate) return 1;
+  if (aubio_sink_get_channels(s) != channels) return 1;
+
+  mat = new_fmat(channels, hop_size);
+  // check writing a matrix with valid length
+  aubio_sink_do_multi(s, mat, hop_size);
+  // check writing 0 frames
+  aubio_sink_do_multi(s, mat, 0);
+  // check writing more than in the input
+  aubio_sink_do_multi(s, mat, hop_size+1);
+  del_fmat(mat);
+
+  // check writing oversized input
+  mat = new_fmat(channels, oversized_hop_size);
+  aubio_sink_do_multi(s, mat, oversized_hop_size);
+  del_fmat(mat);
+
+  // check writing undersized input
+  mat = new_fmat(channels - 1, hop_size);
+  aubio_sink_do_multi(s, mat, hop_size);
+  del_fmat(mat);
+
+  aubio_sink_close(s);
+  // test closing twice
+  aubio_sink_close(s);
+
+  del_aubio_sink(s);
+
+  // delete temp file
+  close_temp_sink(sink_path, fd);
+
+  // shouldn't crash on null
+  del_aubio_sink(NULL);
+
+  return run_on_default_source_and_sink(main);
+}
diff --git a/tests/src/io/test-sink_apple_audio-multi.c b/tests/src/io/test-sink_apple_audio-multi.c
deleted file mode 100644
index fc06d5e..0000000
--- a/tests/src/io/test-sink_apple_audio-multi.c
+++ /dev/null
@@ -1,78 +0,0 @@
-#define AUBIO_UNSTABLE 1
-#include <aubio.h>
-#include "utils_tests.h"
-
-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [channels] [hop_size]\n", argv[0]);
-    return err;
-  }
-
-#ifdef HAVE_SINK_APPLE_AUDIO
-  uint_t samplerate = 0;
-  uint_t channels = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) channels = atoi(argv[4]);
-  if ( argc >= 6 ) hop_size = atoi(argv[5]);
-  if ( argc >= 7 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-  if (channels == 0 ) channels = aubio_source_get_channels(i);
-
-  fmat_t *mat = new_fmat(channels, hop_size);
-  if (!mat) { err = 1; goto beach_fmat; }
-
-  aubio_sink_apple_audio_t *o = new_aubio_sink_apple_audio(sink_path, 0);
-  if (!o) { err = 1; goto beach_sink; }
-  err = aubio_sink_apple_audio_preset_samplerate(o, samplerate);
-  if (err) { goto beach; }
-  err = aubio_sink_apple_audio_preset_channels(o, channels);
-  if (err) { goto beach; }
-
-  do {
-    aubio_source_do_multi(i, mat, &read);
-    aubio_sink_apple_audio_do_multi(o, mat, read);
-    n_frames += read;
-  } while ( read == hop_size );
-
-  PRINT_MSG("read %d frames at %dHz in %d channels (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, channels, n_frames / hop_size,
-      source_path, sink_path);
-  PRINT_MSG("wrote %s with %dHz in %d channels\n", sink_path,
-      aubio_sink_apple_audio_get_samplerate(o),
-      aubio_sink_apple_audio_get_channels(o) );
-
-beach:
-  del_aubio_sink_apple_audio(o);
-beach_sink:
-  del_fmat(mat);
-beach_fmat:
-  del_aubio_source(i);
-beach_source:
-#else /* HAVE_SINK_APPLE_AUDIO */
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_sink_apple_audio\n");
-#endif /* HAVE_SINK_APPLE_AUDIO */
-  return err;
-}
diff --git a/tests/src/io/test-sink_apple_audio.c b/tests/src/io/test-sink_apple_audio.c
index 86d1769..8349cc8 100644
--- a/tests/src/io/test-sink_apple_audio.c
+++ b/tests/src/io/test-sink_apple_audio.c
@@ -2,66 +2,28 @@
 #include <aubio.h>
 #include "utils_tests.h"

-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", argv[0]);
-    return err;
-  }
+#define aubio_sink_custom "apple_audio"

 #ifdef HAVE_SINK_APPLE_AUDIO
-  uint_t samplerate = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) hop_size = atoi(argv[4]);
-  if ( argc >= 6 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  fvec_t *vec = new_fvec(hop_size);
-  if (!vec) { err = 1; goto beach_fvec; }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-
-  aubio_sink_apple_audio_t *o = new_aubio_sink_apple_audio(sink_path, samplerate);
-  if (!o) { err = 1; goto beach_sink; }
+#define HAVE_AUBIO_SINK_CUSTOM
+#define aubio_sink_custom_t aubio_sink_apple_audio_t
+#define new_aubio_sink_custom new_aubio_sink_apple_audio
+#define del_aubio_sink_custom del_aubio_sink_apple_audio
+#define aubio_sink_custom_do aubio_sink_apple_audio_do
+#define aubio_sink_custom_do_multi aubio_sink_apple_audio_do_multi
+#define aubio_sink_custom_close aubio_sink_apple_audio_close
+#define aubio_sink_custom_preset_samplerate aubio_sink_apple_audio_preset_samplerate
+#define aubio_sink_custom_preset_channels aubio_sink_apple_audio_preset_channels
+#define aubio_sink_custom_get_samplerate aubio_sink_apple_audio_get_samplerate
+#define aubio_sink_custom_get_channels aubio_sink_apple_audio_get_channels
+#endif /* HAVE_SINK_APPLE_AUDIO */

-  do {
-    aubio_source_do(i, vec, &read);
-    aubio_sink_apple_audio_do(o, vec, read);
-    n_frames += read;
-  } while ( read == hop_size );
+#include "base-sink_custom.h"

-  PRINT_MSG("read %d frames at %dHz (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, n_frames / hop_size,
-      source_path, sink_path);
+// this file uses the unstable aubio api, please use aubio_sink instead
+// see src/io/sink.h and tests/src/sink/test-sink.c

-  del_aubio_sink_apple_audio(o);
-beach_sink:
-  del_aubio_source(i);
-beach_source:
-  del_fvec(vec);
-beach_fvec:
-#else /* HAVE_SINK_APPLE_AUDIO */
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_source_apple_audio\n");
-#endif /* HAVE_SINK_APPLE_AUDIO */
-  return err;
+int main (int argc, char **argv)
+{
+  return base_main(argc, argv);
 }
diff --git a/tests/src/io/test-sink_sndfile-multi.c b/tests/src/io/test-sink_sndfile-multi.c
deleted file mode 100644
index 4dcc690..0000000
--- a/tests/src/io/test-sink_sndfile-multi.c
+++ /dev/null
@@ -1,79 +0,0 @@
-#define AUBIO_UNSTABLE 1
-#include <aubio.h>
-#include "config.h"
-#include "utils_tests.h"
-
-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [channels] [hop_size]\n", argv[0]);
-    return err;
-  }
-
-#ifdef HAVE_SNDFILE
-  uint_t samplerate = 0;
-  uint_t channels = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) channels = atoi(argv[4]);
-  if ( argc >= 6 ) hop_size = atoi(argv[5]);
-  if ( argc >= 7 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-  if (channels == 0 ) channels = aubio_source_get_channels(i);
-
-  fmat_t *mat = new_fmat(channels, hop_size);
-  if (!mat) { err = 1; goto beach_fmat; }
-
-  aubio_sink_sndfile_t *o = new_aubio_sink_sndfile(sink_path, 0);
-  if (!o) { err = 1; goto beach_sink; }
-  err = aubio_sink_sndfile_preset_samplerate(o, samplerate);
-  if (err) { goto beach; }
-  err = aubio_sink_sndfile_preset_channels(o, channels);
-  if (err) { goto beach; }
-
-  do {
-    aubio_source_do_multi(i, mat, &read);
-    aubio_sink_sndfile_do_multi(o, mat, read);
-    n_frames += read;
-  } while ( read == hop_size );
-
-  PRINT_MSG("read %d frames at %dHz in %d channels (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, channels, n_frames / hop_size,
-      source_path, sink_path);
-  PRINT_MSG("wrote %s with %dHz in %d channels\n", sink_path,
-      aubio_sink_sndfile_get_samplerate(o),
-      aubio_sink_sndfile_get_channels(o) );
-
-beach:
-  del_aubio_sink_sndfile(o);
-beach_sink:
-  del_fmat(mat);
-beach_fmat:
-  del_aubio_source(i);
-beach_source:
-#else
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_sink_sndfile\n");
-#endif /* HAVE_SNDFILE */
-  return err;
-}
diff --git a/tests/src/io/test-sink_sndfile.c b/tests/src/io/test-sink_sndfile.c
index 7812229..5195086 100644
--- a/tests/src/io/test-sink_sndfile.c
+++ b/tests/src/io/test-sink_sndfile.c
@@ -1,68 +1,29 @@
 #define AUBIO_UNSTABLE 1
 #include <aubio.h>
-#include "config.h"
 #include "utils_tests.h"

-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", argv[0]);
-    return err;
-  }
+#define aubio_sink_custom "sndfile"

 #ifdef HAVE_SNDFILE
-  uint_t samplerate = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) hop_size = atoi(argv[4]);
-  if ( argc >= 6 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  fvec_t *vec = new_fvec(hop_size);
-  if (!vec) { err = 1; goto beach_fvec; }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-
-  aubio_sink_sndfile_t *o = new_aubio_sink_sndfile(sink_path, samplerate);
-  if (!o) { err = 1; goto beach_sink; }
+#define HAVE_AUBIO_SINK_CUSTOM
+#define aubio_sink_custom_t aubio_sink_sndfile_t
+#define new_aubio_sink_custom new_aubio_sink_sndfile
+#define del_aubio_sink_custom del_aubio_sink_sndfile
+#define aubio_sink_custom_do aubio_sink_sndfile_do
+#define aubio_sink_custom_do_multi aubio_sink_sndfile_do_multi
+#define aubio_sink_custom_close aubio_sink_sndfile_close
+#define aubio_sink_custom_preset_samplerate aubio_sink_sndfile_preset_samplerate
+#define aubio_sink_custom_preset_channels aubio_sink_sndfile_preset_channels
+#define aubio_sink_custom_get_samplerate aubio_sink_sndfile_get_samplerate
+#define aubio_sink_custom_get_channels aubio_sink_sndfile_get_channels
+#endif /* HAVE_SNDFILE */

-  do {
-    aubio_source_do(i, vec, &read);
-    aubio_sink_sndfile_do(o, vec, read);
-    n_frames += read;
-  } while ( read == hop_size );
+#include "base-sink_custom.h"

-  PRINT_MSG("read %d frames at %dHz (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, n_frames / hop_size,
-      source_path, sink_path);
+// this file uses the unstable aubio api, please use aubio_sink instead
+// see src/io/sink.h and tests/src/sink/test-sink.c

-  del_aubio_sink_sndfile(o);
-beach_sink:
-  del_aubio_source(i);
-beach_source:
-  del_fvec(vec);
-beach_fvec:
-#else
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_source_sndfile\n");
-#endif /* HAVE_SNDFILE */
-  return err;
+int main (int argc, char **argv)
+{
+  return base_main(argc, argv);
 }
diff --git a/tests/src/io/test-sink_wavwrite-multi.c b/tests/src/io/test-sink_wavwrite-multi.c
deleted file mode 100644
index c80e2c0..0000000
--- a/tests/src/io/test-sink_wavwrite-multi.c
+++ /dev/null
@@ -1,78 +0,0 @@
-#define AUBIO_UNSTABLE 1
-#include <aubio.h>
-#include "utils_tests.h"
-
-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [channels] [hop_size]\n", argv[0]);
-    return err;
-  }
-
-#ifdef HAVE_WAVWRITE
-  uint_t samplerate = 0;
-  uint_t channels = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) channels = atoi(argv[4]);
-  if ( argc >= 6 ) hop_size = atoi(argv[5]);
-  if ( argc >= 7 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-  if (channels == 0 ) channels = aubio_source_get_channels(i);
-
-  fmat_t *mat = new_fmat(channels, hop_size);
-  if (!mat) { err = 1; goto beach_fmat; }
-
-  aubio_sink_wavwrite_t *o = new_aubio_sink_wavwrite(sink_path, 0);
-  if (!o) { err = 1; goto beach_sink; }
-  err = aubio_sink_wavwrite_preset_samplerate(o, samplerate);
-  if (err) { goto beach; }
-  err = aubio_sink_wavwrite_preset_channels(o, channels);
-  if (err) { goto beach; }
-
-  do {
-    aubio_source_do_multi(i, mat, &read);
-    aubio_sink_wavwrite_do_multi(o, mat, read);
-    n_frames += read;
-  } while ( read == hop_size );
-
-  PRINT_MSG("read %d frames at %dHz in %d channels (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, channels, n_frames / hop_size,
-      source_path, sink_path);
-  PRINT_MSG("wrote %s with %dHz in %d channels\n", sink_path,
-      aubio_sink_wavwrite_get_samplerate(o),
-      aubio_sink_wavwrite_get_channels(o) );
-
-beach:
-  del_aubio_sink_wavwrite(o);
-beach_sink:
-  del_fmat(mat);
-beach_fmat:
-  del_aubio_source(i);
-beach_source:
-#else
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_sink_wavwrite\n");
-#endif /* HAVE_WAVWRITE */
-  return err;
-}
diff --git a/tests/src/io/test-sink_wavwrite.c b/tests/src/io/test-sink_wavwrite.c
index 2e758e3..e219fe6 100644
--- a/tests/src/io/test-sink_wavwrite.c
+++ b/tests/src/io/test-sink_wavwrite.c
@@ -2,66 +2,28 @@
 #include <aubio.h>
 #include "utils_tests.h"

-// this file uses the unstable aubio api, please use aubio_sink instead
-// see src/io/sink.h and tests/src/sink/test-sink.c
-
-int main (int argc, char **argv)
-{
-  sint_t err = 0;
-
-  if (argc < 3) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", argv[0]);
-    return err;
-  }
+#define aubio_sink_custom "wavwrite"

 #ifdef HAVE_WAVWRITE
-  uint_t samplerate = 0;
-  uint_t hop_size = 512;
-  uint_t n_frames = 0, read = 0;
-
-  char_t *source_path = argv[1];
-  char_t *sink_path = argv[2];
-
-  if ( argc >= 4 ) samplerate = atoi(argv[3]);
-  if ( argc >= 5 ) hop_size = atoi(argv[4]);
-  if ( argc >= 6 ) {
-    err = 2;
-    PRINT_ERR("too many arguments\n");
-    return err;
-  }
-
-  fvec_t *vec = new_fvec(hop_size);
-  if (!vec) { err = 1; goto beach_fvec; }
-
-  aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size);
-  if (!i) { err = 1; goto beach_source; }
-
-  if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i);
-
-  aubio_sink_wavwrite_t *o = new_aubio_sink_wavwrite(sink_path, samplerate);
-  if (!o) { err = 1; goto beach_sink; }
+#define HAVE_AUBIO_SINK_CUSTOM
+#define aubio_sink_custom_t aubio_sink_wavwrite_t
+#define new_aubio_sink_custom new_aubio_sink_wavwrite
+#define del_aubio_sink_custom del_aubio_sink_wavwrite
+#define aubio_sink_custom_do aubio_sink_wavwrite_do
+#define aubio_sink_custom_do_multi aubio_sink_wavwrite_do_multi
+#define aubio_sink_custom_close aubio_sink_wavwrite_close
+#define aubio_sink_custom_preset_samplerate aubio_sink_wavwrite_preset_samplerate
+#define aubio_sink_custom_preset_channels aubio_sink_wavwrite_preset_channels
+#define aubio_sink_custom_get_samplerate aubio_sink_wavwrite_get_samplerate
+#define aubio_sink_custom_get_channels aubio_sink_wavwrite_get_channels
+#endif /* HAVE_WAVWRITE */

-  do {
-    aubio_source_do(i, vec, &read);
-    aubio_sink_wavwrite_do(o, vec, read);
-    n_frames += read;
-  } while ( read == hop_size );
+#include "base-sink_custom.h"

-  PRINT_MSG("read %d frames at %dHz (%d blocks) from %s written to %s\n",
-      n_frames, samplerate, n_frames / hop_size,
-      source_path, sink_path);
+// this file uses the unstable aubio api, please use aubio_sink instead
+// see src/io/sink.h and tests/src/sink/test-sink.c

-  del_aubio_sink_wavwrite(o);
-beach_sink:
-  del_aubio_source(i);
-beach_source:
-  del_fvec(vec);
-beach_fvec:
-#else
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_sink_wavwrite\n");
-#endif /* HAVE_WAVWRITE */
-  return err;
+int main (int argc, char **argv)
+{
+  return base_main(argc, argv);
 }
diff --git a/tests/src/io/test-source.c b/tests/src/io/test-source.c
index a1f17ae..d4f628e 100644
--- a/tests/src/io/test-source.c
+++ b/tests/src/io/test-source.c
@@ -1,12 +1,14 @@
 #include <aubio.h>
 #include "utils_tests.h"

-int main (int argc, char **argv)
+int test_wrong_params(void);
+
+int main(int argc, char **argv)
 {
   uint_t err = 0;
   if (argc < 2) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
+    PRINT_ERR("not enough arguments, running tests\n");
+    err = test_wrong_params();
     PRINT_MSG("read a wave file as a mono vector\n");
     PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]);
     PRINT_MSG("examples:\n");
@@ -22,16 +24,15 @@ int main (int argc, char **argv)
   uint_t samplerate = 0;
   uint_t hop_size = 256;
   uint_t n_frames = 0, read = 0;
-  if ( argc == 3 ) samplerate = atoi(argv[2]);
-  if ( argc == 4 ) hop_size = atoi(argv[3]);
+  if ( argc >= 3 ) samplerate = atoi(argv[2]);
+  if ( argc >= 4 ) hop_size = atoi(argv[3]);

   char_t *source_path = argv[1];

-
   aubio_source_t* s = new_aubio_source(source_path, samplerate, hop_size);
-  if (!s) { err = 1; goto beach; }
   fvec_t *vec = new_fvec(hop_size);
+  if (!s || !vec) { err = 1; goto beach; }

   uint_t n_frames_expected = aubio_source_get_duration(s);

@@ -49,11 +50,107 @@ int main (int argc, char **argv)
   // close the file (optional)
   aubio_source_close(s);

-  // test closing the file a second time
-  aubio_source_close(s);
-
-  del_fvec (vec);
-  del_aubio_source (s);
 beach:
+  if (vec)
+    del_fvec(vec);
+  if (s)
+    del_aubio_source(s);
   return err;
 }
+
+int test_wrong_params(void)
+{
+  char_t *uri = DEFINEDSTRING(AUBIO_TESTS_SOURCE);
+  uint_t samplerate = 44100;
+  uint_t hop_size = 512;
+  uint_t channels, read = 0;
+  fvec_t *vec;
+  fmat_t *mat;
+  aubio_source_t *s;
+
+  if (new_aubio_source(0, samplerate, hop_size)) return 1;
+  if (new_aubio_source("\0", samplerate, hop_size)) return 1;
+  if (new_aubio_source(uri, -1, hop_size)) return 1;
+  if (new_aubio_source(uri, 0, 0)) return 1;
+
+  s = new_aubio_source(uri, samplerate, hop_size);
+  if (!s) return 1;
+  channels = aubio_source_get_channels(s);
+
+  // vector to read downmixed samples
+  vec = new_fvec(hop_size);
+  // matrix to read individual channels
+  mat = new_fmat(channels, hop_size);
+
+  if (aubio_source_get_samplerate(s) != samplerate) return 1;
+
+  // read first hop_size frames
+  aubio_source_do(s, vec, &read);
+  if (read != hop_size) return 1;
+
+  // read again in undersized vector
+  del_fvec(vec);
+  vec = new_fvec(hop_size - 1);
+  aubio_source_do(s, vec, &read);
+  if (read != hop_size - 1) return 1;
+
+  // read again in oversized vector
+  del_fvec(vec);
+  vec = new_fvec(hop_size + 1);
+  aubio_source_do(s, vec, &read);
+  if (read != hop_size) return 1;
+
+  // seek to 0
+  if(aubio_source_seek(s, 0)) return 1;
+
+  // read again as multiple channels
+  aubio_source_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // read again as multiple channels in a matrix with too few channels
+  del_fmat(mat);
+  mat = new_fmat(channels - 1, hop_size);
+  aubio_source_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // read again as multiple channels in a matrix with undersized length
+  del_fmat(mat);
+  mat = new_fmat(channels, hop_size - 1);
+  aubio_source_do_multi(s, mat, &read);
+  if (read != hop_size - 1) return 1;
+
+  // read again as multiple channels in a matrix with too many channels
+  del_fmat(mat);
+  mat = new_fmat(channels + 1, hop_size);
+  aubio_source_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // read again as multiple channels in a matrix with oversized length
+  del_fmat(mat);
+  mat = new_fmat(channels, hop_size + 1);
+  aubio_source_do_multi(s, mat, &read);
+  if (read != hop_size) return 1;
+
+  // close the file (optional)
+  aubio_source_close(s);
+  // test closing the file a second time
+  aubio_source_close(s);
+
+  // reading after close fails
+  del_fvec(vec);
+  vec = new_fvec(hop_size);
+  aubio_source_do(s, vec, &read);
+  del_fmat(mat);
+  mat = new_fmat(channels, hop_size);
+  aubio_source_do_multi(s, mat, &read);
+
+  del_aubio_source(s);
+  del_fmat(mat);
+  del_fvec(vec);
+
+  // shouldn't crash on null
+  del_aubio_source(NULL);
+
+  return run_on_default_source(main);
+}
diff --git a/tests/src/io/test-source_apple_audio.c b/tests/src/io/test-source_apple_audio.c
index 92d0e54..25fec53 100644
--- a/tests/src/io/test-source_apple_audio.c
+++ b/tests/src/io/test-source_apple_audio.c
@@ -2,62 +2,29 @@
 #include <aubio.h>
 #include "utils_tests.h"

+#define aubio_source_custom "apple_audio"
+
+#ifdef HAVE_SOURCE_APPLE_AUDIO
+#define HAVE_AUBIO_SOURCE_CUSTOM
+#define aubio_source_custom_t aubio_source_apple_audio_t
+#define new_aubio_source_custom new_aubio_source_apple_audio
+#define del_aubio_source_custom del_aubio_source_apple_audio
+#define aubio_source_custom_get_samplerate aubio_source_apple_audio_get_samplerate
+#define aubio_source_custom_get_duration aubio_source_apple_audio_get_duration
+#define aubio_source_custom_do aubio_source_apple_audio_do
+#define aubio_source_custom_do_multi aubio_source_apple_audio_do_multi
+#define aubio_source_custom_seek aubio_source_apple_audio_seek
+#define aubio_source_custom_close aubio_source_apple_audio_close
+#define aubio_source_custom_get_channels aubio_source_apple_audio_get_channels
+#define aubio_source_custom_get_samplerate aubio_source_apple_audio_get_samplerate
+#endif /* HAVE_SOURCE_APPLE_AUDIO */
+
+#include "base-source_custom.h"
+
 // this file uses the unstable aubio api, please use aubio_source instead
 // see src/io/source.h and tests/src/source/test-source.c

 int main (int argc, char **argv)
 {
-  uint_t err = 0;
-  if (argc < 2) {
-    err = 2;
-    PRINT_ERR("not enough arguments\n");
-    PRINT_MSG("read a wave file as a mono vector\n");
-    PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]);
-    PRINT_MSG("examples:\n");
-    PRINT_MSG(" - read file.wav at original samplerate\n");
-    PRINT_MSG("   %s file.wav\n", argv[0]);
-    PRINT_MSG(" - read file.aif at 32000Hz\n");
-    PRINT_MSG("   %s file.aif 32000\n", argv[0]);
-    PRINT_MSG(" - read file.mp3 at original samplerate with 4096 blocks\n");
-    PRINT_MSG("   %s file.mp3 0 4096 \n", argv[0]);
-    return err;
-  }
-
-#if HAVE_SOURCE_APPLE_AUDIO
-  uint_t samplerate = 0;
-  uint_t hop_size = 256;
-  uint_t n_frames = 0, read = 0;
-  if ( argc == 3 ) samplerate = atoi(argv[2]);
-  if ( argc == 4 ) hop_size = atoi(argv[3]);
-
-  char_t *source_path = argv[1];
-
-
-  aubio_source_apple_audio_t * s =
-    new_aubio_source_apple_audio(source_path, samplerate, hop_size);
-  if (!s) { err = 1; goto beach; }
-  fvec_t *vec = new_fvec(hop_size);
-
-  uint_t n_frames_expected = aubio_source_apple_audio_get_duration(s);
-
-  samplerate = aubio_source_apple_audio_get_samplerate(s);
-
-  do {
-    aubio_source_apple_audio_do(s, vec, &read);
-    fvec_print (vec);
-    n_frames += read;
-  } while ( read == hop_size );
-
-  PRINT_MSG("read %d frames (expected %d) at %dHz (%d blocks) from %s\n",
-      n_frames, n_frames_expected, samplerate, n_frames / hop_size,
-      source_path);
-
-  del_fvec (vec);
-  del_aubio_source_apple_audio (s);
-beach:
-#else /* HAVE_SOURCE_APPLE_AUDIO */
-  err = 3;
-  PRINT_ERR("aubio was not compiled with aubio_source_apple_audio\n");
-#endif /* HAVE_SOURCE_APPLE_AUDIO */
-  return err;
+  return base_main(argc, argv);
 }
diff --git a/tests/src/io/test-source_avcodec.c b/tests/src/io/test-source_avcodec.c
index dc23d76..c2c2a13 100644
--- a/tests/src/io/test-source_avcodec.c
+++ b/tests/src/io/test-source_avcodec.c
@@ -2,62 +2,29 @@
 #include <aubio.h>
 #include "utils_tests.h"

-// this file uses the unstable aubio api, please use aubio_source instead
-// see src/io/source.h and tests/src/source/test-source.c
-
-int main (int
- PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); - PRINT_MSG("examples:\n"); - PRINT_MSG(" - read file.wav at original samplerate\n"); - PRINT_MSG(" %s file.wav\n", argv[0]); - PRINT_MSG(" - read file.aif at 32000Hz\n"); - PRINT_MSG(" %s file.aif 32000\n", argv[0]); - PRINT_MSG(" - read file.mp3 at original samplerate with 4096 blocks\n"); - PRINT_MSG(" %s file.mp3 0 4096 \n", argv[0]); - return err; - } - -#if HAVE_SOURCE_APPLE_AUDIO - uint_t samplerate = 0; - uint_t hop_size = 256; - uint_t n_frames = 0, read = 0; - if ( argc == 3 ) samplerate = atoi(argv[2]); - if ( argc == 4 ) hop_size = atoi(argv[3]); - - char_t *source_path = argv[1]; - - - aubio_source_apple_audio_t * s = - new_aubio_source_apple_audio(source_path, samplerate, hop_size); - if (!s) { err = 1; goto beach; } - fvec_t *vec = new_fvec(hop_size); - - uint_t n_frames_expected = aubio_source_apple_audio_get_duration(s); - - samplerate = aubio_source_apple_audio_get_samplerate(s); - - do { - aubio_source_apple_audio_do(s, vec, &read); - fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); - - PRINT_MSG("read %d frames (expected %d) at %dHz (%d blocks) from %s\n", - n_frames, n_frames_expected, samplerate, n_frames / hop_size, - source_path); - - del_fvec (vec); - del_aubio_source_apple_audio (s); -beach: -#else /* HAVE_SOURCE_APPLE_AUDIO */ - err = 3; - PRINT_ERR("aubio was not compiled with aubio_source_apple_audio\n"); -#endif /* HAVE_SOURCE_APPLE_AUDIO */ - return err; + return base_main(argc, argv); } diff --git a/tests/src/io/test-source_avcodec.c b/tests/src/io/test-source_avcodec.c index dc23d76..c2c2a13 100644 --- a/tests/src/io/test-source_avcodec.c +++ b/tests/src/io/test-source_avcodec.c @@ -2,62 +2,29 @@ #include <aubio.h> #include "utils_tests.h" -// this file uses the unstable aubio api, please use aubio_source instead -// see src/io/source.h and tests/src/source/test-source.c - -int main (int 
argc, char **argv) -{ - uint_t err = 0; - if (argc < 2) { - err = 2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); - PRINT_MSG("examples:\n"); - PRINT_MSG(" - read file.wav at original samplerate\n"); - PRINT_MSG(" %s file.wav\n", argv[0]); - PRINT_MSG(" - read file.wav at 32000Hz\n"); - PRINT_MSG(" %s file.aif 32000\n", argv[0]); - PRINT_MSG(" - read file.wav at original samplerate with 4096 blocks\n"); - PRINT_MSG(" %s file.wav 0 4096 \n", argv[0]); - return err; - } +#define aubio_source_custom "avcodec" #ifdef HAVE_LIBAV - uint_t samplerate = 0; - uint_t hop_size = 256; - uint_t n_frames = 0, read = 0; - if ( argc == 3 ) samplerate = atoi(argv[2]); - if ( argc == 4 ) hop_size = atoi(argv[3]); - - char_t *source_path = argv[1]; - - - aubio_source_avcodec_t * s = - new_aubio_source_avcodec(source_path, samplerate, hop_size); - if (!s) { err = 1; goto beach; } - fvec_t *vec = new_fvec(hop_size); - - uint_t n_frames_expected = aubio_source_avcodec_get_duration(s); - - samplerate = aubio_source_avcodec_get_samplerate(s); +#define HAVE_AUBIO_SOURCE_CUSTOM +#define aubio_source_custom_t aubio_source_avcodec_t +#define new_aubio_source_custom new_aubio_source_avcodec +#define del_aubio_source_custom del_aubio_source_avcodec +#define aubio_source_custom_get_samplerate aubio_source_avcodec_get_samplerate +#define aubio_source_custom_get_duration aubio_source_avcodec_get_duration +#define aubio_source_custom_do aubio_source_avcodec_do +#define aubio_source_custom_do_multi aubio_source_avcodec_do_multi +#define aubio_source_custom_seek aubio_source_avcodec_seek +#define aubio_source_custom_close aubio_source_avcodec_close +#define aubio_source_custom_get_channels aubio_source_avcodec_get_channels +#define aubio_source_custom_get_samplerate aubio_source_avcodec_get_samplerate +#endif /* HAVE_LIBAV */ - do { - aubio_source_avcodec_do(s, vec, &read); - 
fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); +#include "base-source_custom.h" - PRINT_MSG("read %d frames (expected %d) at %dHz (%d blocks) from %s\n", - n_frames, n_frames_expected, samplerate, n_frames / hop_size, - source_path); +// this file uses the unstable aubio api, please use aubio_source instead +// see src/io/source.h and tests/src/source/test-source.c - del_fvec (vec); - del_aubio_source_avcodec (s); -beach: -#else /* HAVE_LIBAV */ - err = 3; - PRINT_ERR("aubio was not compiled with aubio_source_avcodec\n"); -#endif /* HAVE_LIBAV */ - return err; +int main (int argc, char **argv) +{ + return base_main(argc, argv); } diff --git a/tests/src/io/test-source_multi.c b/tests/src/io/test-source_multi.c deleted file mode 100644 index b3b4683..0000000 --- a/tests/src/io/test-source_multi.c +++ /dev/null @@ -1,57 +0,0 @@ -#include <aubio.h> -#include "utils_tests.h" - -int main (int argc, char **argv) -{ - sint_t err = 0; - if (argc < 2) { - err = -2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); - PRINT_MSG("examples:\n"); - PRINT_MSG(" - read file.wav at original samplerate\n"); - PRINT_MSG(" %s file.wav\n", argv[0]); - PRINT_MSG(" - read file.wav at 32000Hz\n"); - PRINT_MSG(" %s file.aif 32000\n", argv[0]); - PRINT_MSG(" - read file.wav at original samplerate with 4096 blocks\n"); - PRINT_MSG(" %s file.wav 0 4096 \n", argv[0]); - PRINT_MSG(" - read file.wav at original samplerate with 256 frames blocks, mono\n"); - PRINT_MSG(" %s file.wav 0 4096 1\n", argv[0]); - return err; - } - - uint_t samplerate = 0; - uint_t hop_size = 256; - uint_t n_frames = 0, read = 0; - uint_t n_channels = 0; - if ( argc >= 3 ) samplerate = atoi(argv[2]); - if ( argc >= 4 ) hop_size = atoi(argv[3]); - if ( argc >= 5 ) n_channels = atoi(argv[4]); - - char_t *source_path = argv[1]; - - aubio_source_t* s = new_aubio_source(source_path, 
samplerate, hop_size); - if (!s) { err = -1; goto beach; } - - if ( samplerate == 0 ) samplerate = aubio_source_get_samplerate(s); - - if ( n_channels == 0 ) n_channels = aubio_source_get_channels(s); - - fmat_t *mat = new_fmat(n_channels, hop_size); - - do { - aubio_source_do_multi (s, mat, &read); - fmat_print (mat); - n_frames += read; - } while ( read == hop_size ); - - PRINT_MSG("read %d frames in %d channels at %dHz (%d blocks) from %s\n", - n_frames, n_channels, samplerate, n_frames / hop_size, source_path); - - del_fmat (mat); - del_aubio_source (s); -beach: - - return err; -} diff --git a/tests/src/io/test-source_seek.c b/tests/src/io/test-source_seek.c deleted file mode 100644 index 773aefe..0000000 --- a/tests/src/io/test-source_seek.c +++ /dev/null @@ -1,92 +0,0 @@ -#include <aubio.h> -#include "utils_tests.h" - -int main (int argc, char **argv) -{ - uint_t err = 0; - if (argc < 2) { - err = 2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); - PRINT_MSG("examples:\n"); - PRINT_MSG(" - read file.wav at original samplerate\n"); - PRINT_MSG(" %s file.wav\n", argv[0]); - PRINT_MSG(" - read file.wav at 32000Hz\n"); - PRINT_MSG(" %s file.aif 32000\n", argv[0]); - PRINT_MSG(" - read file.wav at original samplerate with 4096 blocks\n"); - PRINT_MSG(" %s file.wav 0 4096 \n", argv[0]); - return err; - } - - uint_t samplerate = 0; - uint_t hop_size = 256; - uint_t n_frames = 0, read = 0; - uint_t old_n_frames_1 = 0, old_n_frames_2 = 0, old_n_frames_3 = 0; - if ( argc == 3 ) samplerate = atoi(argv[2]); - if ( argc == 4 ) hop_size = atoi(argv[3]); - - char_t *source_path = argv[1]; - - fvec_t *vec = new_fvec(hop_size); - - aubio_source_t* s = new_aubio_source(source_path, samplerate, hop_size); - if (!s) { err = 1; goto beach; } - - if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(s); - - do { - aubio_source_do(s, vec, &read); - 
//fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); - - PRINT_MSG("read %.2fs, %d frames at %dHz (%d blocks) from %s\n", - n_frames * 1. / samplerate, - n_frames, samplerate, - n_frames / hop_size, source_path); - - old_n_frames_1 = n_frames; - - aubio_source_seek (s, 0); - - n_frames = 0; - do { - aubio_source_do(s, vec, &read); - //fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); - - PRINT_MSG("read %.2fs, %d frames at %dHz (%d blocks) from %s\n", - n_frames * 1. / samplerate, - n_frames, samplerate, - n_frames / hop_size, source_path); - - old_n_frames_2 = n_frames; - - aubio_source_seek (s, old_n_frames_1 / 2); - - n_frames = 0; - do { - aubio_source_do(s, vec, &read); - //fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); - - PRINT_MSG("read %.2fs, %d frames at %dHz (%d blocks) from %s\n", - n_frames * 1. / samplerate, - n_frames, samplerate, - n_frames / hop_size, source_path); - - old_n_frames_3 = n_frames; - - del_aubio_source (s); -beach: - del_fvec (vec); - - // check that we got exactly the same number of frames - assert ( old_n_frames_2 == old_n_frames_1 ); - // check that we got about half the frames, with 3 decimals - assert ( roundf(1.e3 * old_n_frames_1 / old_n_frames_3) / 1.e3 == 2.); - return err; -} diff --git a/tests/src/io/test-source_sndfile.c b/tests/src/io/test-source_sndfile.c index 6dfff59..5c2c06f 100644 --- a/tests/src/io/test-source_sndfile.c +++ b/tests/src/io/test-source_sndfile.c @@ -2,62 +2,29 @@ #include <aubio.h> #include "utils_tests.h" +#define aubio_source_custom "sndfile" + +#ifdef HAVE_SNDFILE +#define HAVE_AUBIO_SOURCE_CUSTOM +#define aubio_source_custom_t aubio_source_sndfile_t +#define new_aubio_source_custom new_aubio_source_sndfile +#define del_aubio_source_custom del_aubio_source_sndfile +#define aubio_source_custom_get_samplerate aubio_source_sndfile_get_samplerate +#define aubio_source_custom_get_duration aubio_source_sndfile_get_duration +#define 
aubio_source_custom_do aubio_source_sndfile_do +#define aubio_source_custom_do_multi aubio_source_sndfile_do_multi +#define aubio_source_custom_seek aubio_source_sndfile_seek +#define aubio_source_custom_close aubio_source_sndfile_close +#define aubio_source_custom_get_channels aubio_source_sndfile_get_channels +#define aubio_source_custom_get_samplerate aubio_source_sndfile_get_samplerate +#endif /* HAVE_SNDFILE */ + +#include "base-source_custom.h" + // this file uses the unstable aubio api, please use aubio_source instead // see src/io/source.h and tests/src/source/test-source.c int main (int argc, char **argv) { - uint_t err = 0; - if (argc < 2) { - err = 2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); - PRINT_MSG("examples:\n"); - PRINT_MSG(" - read file.wav at original samplerate\n"); - PRINT_MSG(" %s file.wav\n", argv[0]); - PRINT_MSG(" - read file.wav at 32000Hz\n"); - PRINT_MSG(" %s file.aif 32000\n", argv[0]); - PRINT_MSG(" - read file.wav at original samplerate with 4096 blocks\n"); - PRINT_MSG(" %s file.wav 0 4096 \n", argv[0]); - return err; - } - -#ifdef HAVE_SNDFILE - uint_t samplerate = 0; - uint_t hop_size = 256; - uint_t n_frames = 0, read = 0; - if ( argc == 3 ) samplerate = atoi(argv[2]); - if ( argc == 4 ) hop_size = atoi(argv[3]); - - char_t *source_path = argv[1]; - - - aubio_source_sndfile_t * s = - new_aubio_source_sndfile(source_path, samplerate, hop_size); - if (!s) { err = 1; goto beach; } - fvec_t *vec = new_fvec(hop_size); - - uint_t n_frames_expected = aubio_source_sndfile_get_duration(s); - - samplerate = aubio_source_sndfile_get_samplerate(s); - - do { - aubio_source_sndfile_do(s, vec, &read); - fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); - - PRINT_MSG("read %d frames (expected %d) at %dHz (%d blocks) from %s\n", - n_frames, n_frames_expected, samplerate, n_frames / hop_size, - 
source_path); - - del_fvec (vec); - del_aubio_source_sndfile (s); -beach: -#else - err = 3; - PRINT_ERR("aubio was not compiled with aubio_source_sndfile\n"); -#endif /* HAVE_SNDFILE */ - return err; + return base_main(argc, argv); } diff --git a/tests/src/io/test-source_wavread.c b/tests/src/io/test-source_wavread.c index 23511ca..81fa6a8 100644 --- a/tests/src/io/test-source_wavread.c +++ b/tests/src/io/test-source_wavread.c @@ -2,63 +2,29 @@ #include <aubio.h> #include "utils_tests.h" -// this file uses the unstable aubio api, please use aubio_source instead -// see src/io/source.h and tests/src/source/test-source.c - -int main (int argc, char **argv) -{ - uint_t err = 0; - if (argc < 2) { - err = 2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); - PRINT_MSG("examples:\n"); - PRINT_MSG(" - read file.wav at original samplerate\n"); - PRINT_MSG(" %s file.wav\n", argv[0]); - PRINT_MSG(" - read file.wav at 32000Hz\n"); - PRINT_MSG(" %s file.aif 32000\n", argv[0]); - PRINT_MSG(" - read file.wav at original samplerate with 4096 blocks\n"); - PRINT_MSG(" %s file.wav 0 4096 \n", argv[0]); - return err; - } +#define aubio_source_custom "wavread" #ifdef HAVE_WAVREAD - uint_t samplerate = 0; - uint_t hop_size = 256; - uint_t n_frames = 0, read = 0; - if ( argc == 3 ) samplerate = atoi(argv[2]); - if ( argc == 4 ) hop_size = atoi(argv[3]); - - char_t *source_path = argv[1]; - - - aubio_source_wavread_t * s = - new_aubio_source_wavread(source_path, samplerate, hop_size); - - if (!s) { err = 1; goto beach; } - fvec_t *vec = new_fvec(hop_size); - - uint_t n_frames_expected = aubio_source_wavread_get_duration(s); - - samplerate = aubio_source_wavread_get_samplerate(s); +#define HAVE_AUBIO_SOURCE_CUSTOM +#define aubio_source_custom_t aubio_source_wavread_t +#define new_aubio_source_custom new_aubio_source_wavread +#define del_aubio_source_custom 
del_aubio_source_wavread +#define aubio_source_custom_get_samplerate aubio_source_wavread_get_samplerate +#define aubio_source_custom_get_duration aubio_source_wavread_get_duration +#define aubio_source_custom_do aubio_source_wavread_do +#define aubio_source_custom_do_multi aubio_source_wavread_do_multi +#define aubio_source_custom_seek aubio_source_wavread_seek +#define aubio_source_custom_close aubio_source_wavread_close +#define aubio_source_custom_get_channels aubio_source_wavread_get_channels +#define aubio_source_custom_get_samplerate aubio_source_wavread_get_samplerate +#endif /* HAVE_WAVREAD */ - do { - aubio_source_wavread_do(s, vec, &read); - fvec_print (vec); - n_frames += read; - } while ( read == hop_size ); +#include "base-source_custom.h" - PRINT_MSG("read %d frames (expected %d) at %dHz (%d blocks) from %s\n", - n_frames, n_frames_expected, samplerate, n_frames / hop_size, - source_path); +// this file uses the unstable aubio api, please use aubio_source instead +// see src/io/source.h and tests/src/source/test-source.c - del_fvec (vec); - del_aubio_source_wavread (s); -beach: -#else - err = 3; - PRINT_ERR("aubio was not compiled with aubio_source_wavread\n"); -#endif /* HAVE_WAVREAD */ - return err; +int main (int argc, char **argv) +{ + return base_main(argc, argv); } diff --git a/tests/src/notes/test-notes.c b/tests/src/notes/test-notes.c new file mode 100644 index 0000000..e35d5d1 --- /dev/null +++ b/tests/src/notes/test-notes.c @@ -0,0 +1,24 @@ +#include <aubio.h> + +int main (void) +{ + uint_t buf_size = 2048; + uint_t hop_size = 512; + uint_t samplerate = 44100; + smpl_t silence, minioi_ms, release_drop; + aubio_notes_t *o = new_aubio_notes("default", + buf_size, hop_size, samplerate); + silence = aubio_notes_get_silence(o); + minioi_ms = aubio_notes_get_minioi_ms(o); + release_drop = aubio_notes_get_release_drop(o); + if (aubio_notes_set_silence(o, silence)) return 1; + if (aubio_notes_set_minioi_ms(o, minioi_ms)) return 1; + if 
(aubio_notes_set_release_drop(o, release_drop)) return 1; + del_aubio_notes(o); + // test wrong arguments + if (new_aubio_notes("unknown", buf_size, hop_size, samplerate)) return 1; + if (new_aubio_notes("default", 0, hop_size, samplerate)) return 1; + if (new_aubio_notes("default", buf_size, 0, samplerate)) return 1; + if (new_aubio_notes("default", buf_size, hop_size, 0)) return 1; + return 0; +} diff --git a/tests/src/onset/test-onset.c b/tests/src/onset/test-onset.c index d1476b0..cd93651 100644 --- a/tests/src/onset/test-onset.c +++ b/tests/src/onset/test-onset.c @@ -1,13 +1,15 @@ #include <aubio.h> #include "utils_tests.h" +int test_wrong_params(void); + int main (int argc, char **argv) { uint_t err = 0; if (argc < 2) { err = 2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); + PRINT_WRN("no arguments, running tests\n"); + err = test_wrong_params(); PRINT_MSG("usage: %s <source_path> [samplerate] [hop_size]\n", argv[0]); return err; } @@ -15,8 +17,8 @@ int main (int argc, char **argv) uint_t win_s = 1024; // window size uint_t hop_size = win_s / 4; uint_t n_frames = 0, read = 0; - if ( argc == 3 ) samplerate = atoi(argv[2]); - if ( argc == 4 ) hop_size = atoi(argv[3]); + if ( argc >= 3 ) samplerate = atoi(argv[2]); + if ( argc >= 4 ) hop_size = atoi(argv[3]); char_t *source_path = argv[1]; aubio_source_t * source = new_aubio_source(source_path, samplerate, hop_size); @@ -60,3 +62,36 @@ beach: return err; } + +int test_wrong_params(void) +{ + uint_t win_size = 1024; + uint_t hop_size = win_size / 2; + uint_t samplerate = 44100; + // hop_size < 1 + if (new_aubio_onset("default", 5, 0, samplerate)) return 1; + + // buf_size < 2 + if (new_aubio_onset("default", 1, 1, samplerate)) return 1; + + // buf_size < hop_size + if (new_aubio_onset("default", hop_size, win_size, samplerate)) return 1; + + // samplerate < 1 + if (new_aubio_onset("default", 1024, 512, 0)) return 1; + + // specdesc creation failed + if 
(new_aubio_onset("abcd", win_size, win_size/2, samplerate)) return 1; + + aubio_onset_t *o; + + // pv creation might fail + o = new_aubio_onset("default", 5, 2, samplerate); + if (o) del_aubio_onset(o); + + o = new_aubio_onset("default", win_size, hop_size, samplerate); + if (!aubio_onset_set_default_parameters(o, "wrong_type")) return 1; + del_aubio_onset(o); + + return run_on_default_source(main); +} diff --git a/tests/src/pitch/test-pitch.c b/tests/src/pitch/test-pitch.c index 1509870..45f75d8 100644 --- a/tests/src/pitch/test-pitch.c +++ b/tests/src/pitch/test-pitch.c @@ -30,5 +30,42 @@ int main (void) del_fvec (input); aubio_cleanup (); + if (new_aubio_pitch(0, win_s, hop_s, samplerate)) return 1; + if (new_aubio_pitch("unknown", win_s, hop_s, samplerate)) return 1; + if (new_aubio_pitch("default", win_s, 0, samplerate)) return 1; + if (new_aubio_pitch("default", 0, hop_s, samplerate)) return 1; + if (new_aubio_pitch("default", hop_s, win_s, samplerate)) return 1; + if (new_aubio_pitch("default", win_s, hop_s, 0)) return 1; + + o = new_aubio_pitch("default", win_s, hop_s, samplerate); + + if (aubio_pitch_set_unit(o, "freq")) return 1; + if (aubio_pitch_set_unit(o, "hertz")) return 1; + if (aubio_pitch_set_unit(o, "Hertz")) return 1; + if (aubio_pitch_set_unit(o, "Hz")) return 1; + if (aubio_pitch_set_unit(o, "f0")) return 1; + if (aubio_pitch_set_unit(o, "midi")) return 1; + if (aubio_pitch_set_unit(o, "cent")) return 1; + if (aubio_pitch_set_unit(o, "bin")) return 1; + if (!aubio_pitch_set_unit(o, "unknown")) return 1; + + if (aubio_pitch_set_tolerance(o, 0.3)) return 1; + if (aubio_pitch_set_silence(o, 0)) return 1; + if (aubio_pitch_set_silence(o, -200)) return 1; + if (!aubio_pitch_set_silence(o, -300)) return 1; + del_aubio_pitch(o); + + // fft based might fail with non power of 2 + o = new_aubio_pitch("yinfft", win_s + 1, hop_s, samplerate); + if (o) del_aubio_pitch(o); + o = new_aubio_pitch("yinfast", win_s + 1, hop_s, samplerate); + if (o) 
del_aubio_pitch(o); + o = new_aubio_pitch("fcomb", win_s + 1, hop_s, samplerate); + if (o) del_aubio_pitch(o); + o = new_aubio_pitch("mcomb", win_s + 1, hop_s, samplerate); + if (o) del_aubio_pitch(o); + o = new_aubio_pitch("specacf", win_s + 1, hop_s, samplerate); + if (o) del_aubio_pitch(o); + return 0; } diff --git a/tests/src/spectral/test-awhitening.c b/tests/src/spectral/test-awhitening.c new file mode 100644 index 0000000..ebfc733 --- /dev/null +++ b/tests/src/spectral/test-awhitening.c @@ -0,0 +1,111 @@ +#include <aubio.h> +#include "utils_tests.h" + +int test_wrong_params(void); + +int main (int argc, char **argv) +{ + sint_t err = 0; + + if (argc < 3) { + err = 2; + PRINT_WRN("no arguments, running tests\n"); + err = test_wrong_params(); + PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", argv[0]); + return err; + } + + uint_t samplerate = 0; + uint_t win_size = 1024; + uint_t hop_size = 512; + uint_t n_frames = 0, read = 0; + + char_t *source_path = argv[1]; + char_t *sink_path = argv[2]; + + if ( argc >= 4 ) samplerate = atoi(argv[3]); + if ( argc >= 5 ) hop_size = atoi(argv[4]); + + fvec_t *vec = new_fvec(hop_size); + fvec_t *out = new_fvec(hop_size); // output buffer + fvec_t *scale = new_fvec(hop_size); + cvec_t *fftgrain = new_cvec(win_size); // fft norm and phase + if (!vec) { err = 1; goto beach_fvec; } + + aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size); + if (!i) { err = 1; goto beach_source; } + + if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i); + + aubio_sink_t *o = new_aubio_sink(sink_path, samplerate); + if (!o) { err = 1; goto beach_sink; } + + aubio_pvoc_t *pv = new_aubio_pvoc(win_size, hop_size); + if (!pv) { err = 1; goto beach_pvoc; } + + aubio_spectral_whitening_t *awhitening = + new_aubio_spectral_whitening (win_size, hop_size, samplerate); + if (!awhitening) { err = 1; goto beach_awhitening; } + + aubio_spectral_whitening_set_relax_time(awhitening, 20.); + 
fvec_set_all(scale, 3.); + + PRINT_MSG("spectral whitening relaxation time is %f\n", + aubio_spectral_whitening_get_relax_time(awhitening)); + + do { + aubio_source_do(i, vec, &read); + aubio_pvoc_do(pv, vec, fftgrain); + // apply spectral whitening + aubio_spectral_whitening_do(awhitening, fftgrain); + // rebuild the signal + aubio_pvoc_rdo(pv, fftgrain, out); + // make louder + fvec_weight(out, scale); + // make sure we dont saturate + fvec_clamp(out, 1.); + // write output + aubio_sink_do(o, out, read); + n_frames += read; + } while ( read == hop_size ); + + PRINT_MSG("read %d frames at %dHz (%d blocks) from %s written to %s\n", + n_frames, samplerate, n_frames / hop_size, + source_path, sink_path); + + del_aubio_spectral_whitening(awhitening); +beach_awhitening: + del_aubio_pvoc(pv); +beach_pvoc: + del_aubio_sink(o); +beach_sink: + del_aubio_source(i); +beach_source: + del_fvec(vec); + del_fvec(out); + del_fvec(scale); + del_cvec(fftgrain); +beach_fvec: + return err; +} + +int test_wrong_params(void) +{ + uint_t buf_size = 512; + uint_t hop_size = 256; + uint_t samplerate = 44100; + aubio_spectral_whitening_t *o; + + if (new_aubio_spectral_whitening( 0, hop_size, samplerate)) return 1; + if (new_aubio_spectral_whitening(buf_size, 0, samplerate)) return 1; + if (new_aubio_spectral_whitening(buf_size, hop_size, 0)) return 1; + + o = new_aubio_spectral_whitening(buf_size, hop_size, samplerate); + + aubio_spectral_whitening_get_relax_time(o); + aubio_spectral_whitening_get_floor(o); + + del_aubio_spectral_whitening(o); + + return run_on_default_source_and_sink(main); +} diff --git a/tests/src/spectral/test-dct.c b/tests/src/spectral/test-dct.c new file mode 100644 index 0000000..3ea3109 --- /dev/null +++ b/tests/src/spectral/test-dct.c @@ -0,0 +1,49 @@ +#include <math.h> +#include "aubio.h" +#include "utils_tests.h" + +int main (void) +{ + int return_code = 0; + uint_t win_s = 32; // window size + uint_t i, j, n_iters = 10; // number of iterations + // create dct 
object + aubio_dct_t * dct = new_aubio_dct(win_s); + aubio_dct_t * tmp; + + if (new_aubio_dct(0)) return 1; + + fvec_t * in = new_fvec (win_s); // input buffer + fvec_t * dctout = new_fvec (win_s); // forward dct output + fvec_t * out = new_fvec (win_s); // reconstructed output buffer + + if ((tmp = new_aubio_dct(1)) == 0) return 1; + //aubio_dct_do(tmp, dctout, out); + //aubio_dct_rdo(tmp, dctout, out); + del_aubio_dct(tmp); + + if (!dct || !in || !dctout || !out) { + return_code = 1; + return return_code; + } + + in->data[0] = 1.; + for (i = 0; i < n_iters; i++) { + aubio_dct_do (dct, in, dctout); + aubio_dct_rdo (dct, dctout, out); + for (j = 0; j < in->length; j++) { + return_code += (fabsf(in->data[j] - out->data[j]) > 10.e-4); + } + } + + fvec_print(in); + fvec_print(dctout); + fvec_print(out); + + del_fvec(dctout); + del_fvec(in); + del_fvec(out); + del_aubio_dct(dct); + + return return_code; +} diff --git a/tests/src/spectral/test-fft.c b/tests/src/spectral/test-fft.c index 72db530..4d5a5fb 100644 --- a/tests/src/spectral/test-fft.c +++ b/tests/src/spectral/test-fft.c @@ -4,7 +4,7 @@ int main (void) { int return_code = 0; uint_t i, n_iters = 100; // number of iterations - uint_t win_s = 500; // window size + uint_t win_s = 512; // window size fvec_t * in = new_fvec (win_s); // input buffer cvec_t * fftgrain = new_cvec (win_s); // fft norm and phase fvec_t * out = new_fvec (win_s); // output buffer diff --git a/tests/src/spectral/test-filterbank.c b/tests/src/spectral/test-filterbank.c index 543226d..769d144 100644 --- a/tests/src/spectral/test-filterbank.c +++ b/tests/src/spectral/test-filterbank.c @@ -8,9 +8,21 @@ int main (void) cvec_t *in_spec = new_cvec (win_s); // input vector of samples fvec_t *out_filters = new_fvec (n_filters); // per-band outputs + if (new_aubio_filterbank(0, win_s)) return 1; + if (new_aubio_filterbank(n_filters, 0)) return 1; + // create filterbank object aubio_filterbank_t *o = new_aubio_filterbank (n_filters, win_s); + smpl_t power = 
aubio_filterbank_get_power(o); + smpl_t norm = aubio_filterbank_get_norm(o); + if (aubio_filterbank_set_power(o, power)) { + return 1; + } + if (aubio_filterbank_set_norm(o, norm)) { + return 1; + } + // apply filterbank ten times uint_t n = 10; while (n) { diff --git a/tests/src/spectral/test-mfcc.c b/tests/src/spectral/test-mfcc.c index 23f8c64..0b4cf5a 100644 --- a/tests/src/spectral/test-mfcc.c +++ b/tests/src/spectral/test-mfcc.c @@ -1,30 +1,94 @@ #include <aubio.h> +#include "utils_tests.h" -int main (void) +int test_wrong_params(void); + +int main (int argc, char** argv) { - uint_t win_s = 512; // fft size + sint_t err = 0; + + if (argc < 2) { + err = 2; + PRINT_WRN("no arguments, running tests\n"); + err = test_wrong_params(); + PRINT_MSG("usage: %s <input_path> [samplerate] [hop_size]\n", argv[0]); + return err; + } + + uint_t win_s; // fft size + uint_t hop_s = 256; // block size + uint_t samplerate = 0; // samplerate uint_t n_filters = 40; // number of filters - uint_t n_coefs = 13; // number of coefficients - smpl_t samplerate = 16000.; // samplerate - cvec_t *in = new_cvec (win_s); // input buffer - fvec_t *out = new_fvec (n_coefs); // output coefficients + uint_t n_coeffs = 13; // number of coefficients + uint_t read = 0; + + char_t *source_path = argv[1]; + + if ( argc >= 3 ) samplerate = atoi(argv[2]); + if ( argc >= 4 ) hop_s = atoi(argv[3]); + + win_s = 2 * hop_s; + + aubio_source_t *source = 0; + aubio_pvoc_t *pv = 0; + aubio_mfcc_t *mfcc = 0; + + fvec_t *in = new_fvec (hop_s); // phase vocoder input + cvec_t *fftgrain = new_cvec (win_s); // pvoc output / mfcc input + fvec_t *out = new_fvec (n_coeffs); // mfcc output - // create mfcc object - aubio_mfcc_t *o = new_aubio_mfcc (win_s, n_filters, n_coefs, samplerate); + if (!in || !fftgrain || !out) { err = 1; goto failure; } - cvec_norm_set_all (in, 1.); - aubio_mfcc_do (o, in, out); - fvec_print (out); + // source + source = new_aubio_source(source_path, samplerate, hop_s); + if (!source) { err = 
1; goto failure; } + if (samplerate == 0) samplerate = aubio_source_get_samplerate(source); - cvec_norm_set_all (in, .5); - aubio_mfcc_do (o, in, out); - fvec_print (out); + // phase vocoder + pv = new_aubio_pvoc(win_s, hop_s); + if (!pv) { err = 1; goto failure; } + + // mfcc object + mfcc = new_aubio_mfcc (win_s, n_filters, n_coeffs, samplerate); + if (!mfcc) { err = 1; goto failure; } + + // processing loop + do { + aubio_source_do(source, in, &read); + aubio_pvoc_do(pv, in, fftgrain); + aubio_mfcc_do(mfcc, fftgrain, out); + fvec_print(out); + } while (read == hop_s); + +failure: + + if (mfcc) + del_aubio_mfcc(mfcc); + if (pv) + del_aubio_pvoc(pv); + if (source) + del_aubio_source(source); + if (in) + del_fvec(in); + if (fftgrain) + del_cvec(fftgrain); + if (out) + del_fvec(out); + aubio_cleanup(); + return err; +} + +int test_wrong_params() +{ + uint_t win_s = 512; // fft size + uint_t n_filters = 40; // number of filters + uint_t n_coeffs = 13; // number of coefficients + smpl_t samplerate = 16000.; // samplerate - // clean up - del_aubio_mfcc (o); - del_cvec (in); - del_fvec (out); - aubio_cleanup (); + if (new_aubio_mfcc( 0, n_filters, n_coeffs, samplerate)) return 1; + if (new_aubio_mfcc(win_s, 0, n_coeffs, samplerate)) return 1; + if (new_aubio_mfcc(win_s, n_filters, 0, samplerate)) return 1; + if (new_aubio_mfcc(win_s, n_filters, n_coeffs, 0)) return 1; - return 0; + return run_on_default_source(main); } diff --git a/tests/src/spectral/test-phasevoc.c b/tests/src/spectral/test-phasevoc.c index e43b881..2dec675 100644 --- a/tests/src/spectral/test-phasevoc.c +++ b/tests/src/spectral/test-phasevoc.c @@ -13,6 +13,13 @@ int main (void) // allocate fft and other memory space aubio_pvoc_t * pv = new_aubio_pvoc(win_s,hop_s); + if (new_aubio_pvoc(win_s, 0)) return 1; + + if (aubio_pvoc_get_win(pv) != win_s) return 1; + if (aubio_pvoc_get_hop(pv) != hop_s) return 1; + + if (aubio_pvoc_set_window(pv, "hanningz") != 0) return 1; + // fill input with some data 
fvec_set_all (in, 1.); fvec_print (in); @@ -28,7 +35,7 @@ int main (void) // ... cvec_print (fftgrain); - // optionnaly rebuild the signa + // optionally rebuild the signal aubio_pvoc_rdo(pv,fftgrain,out); // and do something with the result diff --git a/tests/src/spectral/test-tss.c b/tests/src/spectral/test-tss.c index 0e18b20..db73735 100644 --- a/tests/src/spectral/test-tss.c +++ b/tests/src/spectral/test-tss.c @@ -34,6 +34,10 @@ int main (void) aubio_pvoc_rdo (pvs, ctrans, trans); } + aubio_tss_set_alpha(tss, 4.); + aubio_tss_set_beta(tss, 3.); + aubio_tss_set_threshold(tss, 3.); + del_aubio_pvoc(pv); del_aubio_pvoc(pvt); del_aubio_pvoc(pvs); diff --git a/tests/src/synth/test-sampler.c b/tests/src/synth/test-sampler.c index 0f3bfa7..bc3bcf0 100644 --- a/tests/src/synth/test-sampler.c +++ b/tests/src/synth/test-sampler.c @@ -1,3 +1,5 @@ +#include <string.h> // strncpy +#include <limits.h> // PATH_MAX #include <aubio.h> #include "utils_tests.h" @@ -5,9 +7,9 @@ int main (int argc, char **argv) { sint_t err = 0; - if (argc < 4) { - err = 2; - PRINT_ERR("not enough arguments\n"); + if (argc < 3) { + PRINT_ERR("not enough arguments, running tests\n"); + err = run_on_default_source_and_sink(main); PRINT_MSG("usage: %s <input_path> <output_path> <sample_path> [samplerate]\n", argv[0]); return err; } @@ -18,8 +20,15 @@ int main (int argc, char **argv) char_t *source_path = argv[1]; char_t *sink_path = argv[2]; - char_t *sample_path = argv[3]; - if ( argc == 5 ) samplerate = atoi(argv[4]); + char_t sample_path[PATH_MAX]; + if ( argc >= 4 ) { + strncpy(sample_path, argv[3], PATH_MAX - 1); + } else { + // use input_path as sample + strncpy(sample_path, source_path, PATH_MAX - 1); + } + sample_path[PATH_MAX - 1] = '\0'; + if ( argc >= 5 ) samplerate = atoi(argv[4]); fvec_t *vec = new_fvec(hop_size); aubio_source_t *source = new_aubio_source(source_path, samplerate, hop_size); diff --git a/tests/src/synth/test-wavetable.c b/tests/src/synth/test-wavetable.c index 
f569277..8d35b93 100644 --- a/tests/src/synth/test-wavetable.c +++ b/tests/src/synth/test-wavetable.c @@ -6,8 +6,8 @@ int main (int argc, char **argv) sint_t err = 0; if (argc < 2) { - err = 2; - PRINT_ERR("not enough arguments\n"); + PRINT_ERR("not enough arguments, running tests\n"); + err = run_on_default_sink(main); PRINT_MSG("usage: %s <output_path> [freq] [samplerate]\n", argv[0]); return err; } @@ -17,8 +17,8 @@ int main (int argc, char **argv) smpl_t freq = 440.; char_t *sink_path = argv[1]; - if ( argc == 4 ) samplerate = atoi(argv[3]); - if ( argc == 3 ) freq = atof(argv[2]); + if ( argc >= 4 ) samplerate = atoi(argv[3]); + if ( argc >= 3 ) freq = atof(argv[2]); fvec_t *vec = new_fvec(hop_size); aubio_sink_t *sink = new_aubio_sink(sink_path, samplerate); diff --git a/tests/src/tempo/test-tempo.c b/tests/src/tempo/test-tempo.c index 7a87832..d14e9ae 100644 --- a/tests/src/tempo/test-tempo.c +++ b/tests/src/tempo/test-tempo.c @@ -1,14 +1,16 @@ #include <aubio.h> #include "utils_tests.h" +int test_wrong_params(void); + int main (int argc, char **argv) { uint_t err = 0; if (argc < 2) { - err = 2; - PRINT_ERR("not enough arguments\n"); - PRINT_MSG("read a wave file as a mono vector\n"); - PRINT_MSG("usage: %s <source_path> [samplerate] [win_size] [hop_size]\n", argv[0]); + PRINT_WRN("no arguments, running tests\n"); + err = test_wrong_params(); + PRINT_MSG("usage: %s <source_path> [samplerate] [win_size] [hop_size]\n", + argv[0]); return err; } uint_t samplerate = 0; @@ -20,7 +22,8 @@ int main (int argc, char **argv) uint_t n_frames = 0, read = 0; char_t *source_path = argv[1]; - aubio_source_t * source = new_aubio_source(source_path, samplerate, hop_size); + aubio_source_t * source = new_aubio_source(source_path, samplerate, + hop_size); if (!source) { err = 1; goto beach; } if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(source); @@ -30,7 +33,10 @@ int main (int argc, char **argv) fvec_t * out = new_fvec (1); // output position // create tempo 
object - aubio_tempo_t * o = new_aubio_tempo("default", win_size, hop_size, samplerate); + aubio_tempo_t * o = new_aubio_tempo("default", win_size, hop_size, + samplerate); + + if (!o) { err = 1; goto beach_tempo; } do { // put some fresh data in input vector @@ -39,9 +45,11 @@ int main (int argc, char **argv) aubio_tempo_do(o,in,out); // do something with the beats if (out->data[0] != 0) { - PRINT_MSG("beat at %.3fms, %.3fs, frame %d, %.2fbpm with confidence %.2f\n", + PRINT_MSG("beat at %.3fms, %.3fs, frame %d, %.2f bpm " + "with confidence %.2f\n", aubio_tempo_get_last_ms(o), aubio_tempo_get_last_s(o), - aubio_tempo_get_last(o), aubio_tempo_get_bpm(o), aubio_tempo_get_confidence(o)); + aubio_tempo_get_last(o), aubio_tempo_get_bpm(o), + aubio_tempo_get_confidence(o)); } n_frames += read; } while ( read == hop_size ); @@ -53,6 +61,7 @@ int main (int argc, char **argv) // clean up memory del_aubio_tempo(o); +beach_tempo: del_fvec(in); del_fvec(out); del_aubio_source(source); @@ -61,3 +70,57 @@ beach: return err; } + +int test_wrong_params(void) +{ + uint_t win_size = 1024; + uint_t hop_size = 256; + uint_t samplerate = 44100; + aubio_tempo_t *t; + fvec_t* in, *out; + uint_t i; + + // test wrong method fails + if (new_aubio_tempo("undefined", win_size, hop_size, samplerate)) return 1; + + // test hop > win fails + if (new_aubio_tempo("default", hop_size, win_size, samplerate)) return 1; + + // test null hop_size fails + if (new_aubio_tempo("default", win_size, 0, samplerate)) return 1; + + // test 1 buf_size fails + if (new_aubio_tempo("default", 1, 1, samplerate)) return 1; + + // test null samplerate fails + if (new_aubio_tempo("default", win_size, hop_size, 0)) return 1; + + // test short sizes workaround + t = new_aubio_tempo("default", 2048, 2048, 500); + if (!t) return 1; + + del_aubio_tempo(t); + + t = new_aubio_tempo("default", win_size, hop_size, samplerate); + if (!t) return 1; + + in = new_fvec(hop_size); + out = new_fvec(1); + + // up to step = 
(next_power_of_two(5.8 * samplerate / hop_size ) / 4 ) + for (i = 0; i < 256 + 1; i++) + { + aubio_tempo_do(t,in,out); + PRINT_MSG("beat at %.3fms, %.3fs, frame %d, %.2f bpm " + "with confidence %.2f, was tatum %d\n", + aubio_tempo_get_last_ms(t), aubio_tempo_get_last_s(t), + aubio_tempo_get_last(t), aubio_tempo_get_bpm(t), + aubio_tempo_get_confidence(t), aubio_tempo_was_tatum(t)); + } + + del_aubio_tempo(t); + del_fvec(in); + del_fvec(out); + + return run_on_default_source(main); +} diff --git a/tests/src/temporal/test-filter.c b/tests/src/temporal/test-filter.c index 260d81b..b156316 100644 --- a/tests/src/temporal/test-filter.c +++ b/tests/src/temporal/test-filter.c @@ -8,6 +8,15 @@ int main (void) fvec_t *out = new_fvec (win_s); // input buffer aubio_filter_t *o = new_aubio_filter_c_weighting (44100); + + if (new_aubio_filter(0)) return 1; + + if (aubio_filter_get_samplerate(o) != 44100) return 1; + + if (aubio_filter_set_c_weighting (o, -1) == 0) return 1; + + if (aubio_filter_set_c_weighting (0, 32000) == 0) return 1; + in->data[impulse_at] = 0.5; fvec_print (in); aubio_filter_do (o, in); @@ -15,6 +24,11 @@ int main (void) del_aubio_filter (o); o = new_aubio_filter_a_weighting (32000); + + if (aubio_filter_set_a_weighting (o, -1) == 0) return 1; + + if (aubio_filter_set_a_weighting (0, 32000) == 0) return 1; + in->data[impulse_at] = 0.5; fvec_print (in); aubio_filter_do_outplace (o, in, out); diff --git a/tests/src/test-cvec.c b/tests/src/test-cvec.c index c1e6bca..6e16ea5 100644 --- a/tests/src/test-cvec.c +++ b/tests/src/test-cvec.c @@ -3,46 +3,75 @@ int main (void) { - uint_t i, window_size = 16; // window size - cvec_t * complex_vector = new_cvec (window_size); // input buffer - uint_t rand_times = 4; - - utils_init_random(); - - while (rand_times -- ) { - // fill with random phas and norm - for ( i = 0; i < complex_vector->length; i++ ) { - complex_vector->norm[i] = ( 2. / RAND_MAX * random() - 1. ); - complex_vector->phas[i] = ( 2. 
/ RAND_MAX * random() - 1. ) * M_PI; - } - // print the vector - cvec_print(complex_vector); - } + uint_t i, window_size = 16; + cvec_t * complex_vector = new_cvec(window_size); + cvec_t * other_cvector = new_cvec(window_size); - // set all vector elements to `0` - cvec_norm_zeros(complex_vector); + assert(cvec_norm_get_data(complex_vector) == complex_vector->norm); + assert(cvec_phas_get_data(complex_vector) == complex_vector->phas); + assert(complex_vector->length == window_size / 2 + 1); + + // all elements are initialized to 0 for ( i = 0; i < complex_vector->length; i++ ) { assert( complex_vector->norm[i] == 0. ); - // assert( complex_vector->phas[i] == 0 ); + assert( complex_vector->phas[i] == 0. ); } + + cvec_norm_set_sample(complex_vector, 2., 1); + assert(cvec_norm_get_sample(complex_vector, 1)); + + cvec_phas_set_sample(complex_vector, 2., 1); + assert(cvec_phas_get_sample(complex_vector, 1)); + cvec_print(complex_vector); - // set all vector elements to `1` + // set all norm and phas elements to 0 + cvec_zeros(complex_vector); + for ( i = 0; i < complex_vector->length; i++ ) { + assert( complex_vector->norm[i] == 0. ); + assert( complex_vector->phas[i] == 0. ); + } + + // set all norm elements to 1 cvec_norm_ones(complex_vector); for ( i = 0; i < complex_vector->length; i++ ) { assert( complex_vector->norm[i] == 1. ); - // assert( complex_vector->phas[i] == 0 ); } - cvec_print(complex_vector); - cvec_zeros(complex_vector); - cvec_phas_zeros(complex_vector); + // set all norm elements to 0 cvec_norm_zeros(complex_vector); - cvec_norm_ones(complex_vector); + for ( i = 0; i < complex_vector->length; i++ ) { + assert( complex_vector->norm[i] == 0. ); + } + + // set all phas elements to 1 cvec_phas_ones(complex_vector); + for ( i = 0; i < complex_vector->length; i++ ) { + assert( complex_vector->phas[i] == 1. 
); + } + + // set all phas elements to 0 + cvec_phas_zeros(complex_vector); + for ( i = 0; i < complex_vector->length; i++ ) { + assert( complex_vector->phas[i] == 0. ); + } + + cvec_copy(complex_vector, other_cvector); + // copy to self cvec_copy(complex_vector, complex_vector); + // copy to a different size fails + del_cvec(other_cvector); + other_cvector = new_cvec(window_size + 2); + cvec_copy(complex_vector, other_cvector); + + if (complex_vector) + del_cvec(complex_vector); + if (other_cvector) + del_cvec(other_cvector); + + // wrong parameters + assert(new_cvec(-1) == NULL); + assert(new_cvec(0) == NULL); - // destroy it - del_cvec(complex_vector); return 0; } diff --git a/tests/src/test-delnull.c b/tests/src/test-delnull.c deleted file mode 100644 index bb24509..0000000 --- a/tests/src/test-delnull.c +++ /dev/null @@ -1,24 +0,0 @@ -#include <stdlib.h> -#include "aubio.h" - -// When creating an aubio object, the user should check whether the object is -// set NULL, indicating the creation failed and the object was not allocated. - -int main (void) -{ - uint_t return_code = 0; - fvec_t *f = new_fvec(-12); - cvec_t *c = new_cvec(-12); - lvec_t *l = new_lvec(-12); - aubio_fft_t *fft = new_aubio_fft(-12); - if (f != NULL) { - return_code = 1; - } else if (c != NULL) { - return_code = 2; - } else if (l != NULL) { - return_code = 3; - } else if (fft != NULL) { - return_code = 3; - } - return return_code; -} diff --git a/tests/src/test-fmat.c b/tests/src/test-fmat.c index 218c027..fea911f 100644 --- a/tests/src/test-fmat.c +++ b/tests/src/test-fmat.c @@ -4,27 +4,93 @@ // create a new matrix and fill it with i * 1. + j * .1, where i is the row, // and j the column. 
-int main (void) +void assert_fmat_all_equal(fmat_t *mat, smpl_t scalar) { - uint_t height = 3, length = 9, i, j; - // create fmat_t object - fmat_t * mat = new_fmat (height, length); + uint_t i, j; for ( i = 0; i < mat->height; i++ ) { for ( j = 0; j < mat->length; j++ ) { + assert(mat->data[i][j] == scalar); + } + } +} + +int main (void) +{ + uint_t i, j; + uint_t height = 3, length = 9; + + // create fmat_t object + fmat_t * mat = new_fmat(height, length); + fmat_t * other_mat = new_fmat(height, length); + + assert(mat); + assert(other_mat); + + assert(mat->length == length); + assert(mat->height == height); + + for (i = 0; i < mat->height; i++) { + for (j = 0; j < mat->length; j++) { // all elements are already initialized to 0. assert(mat->data[i][j] == 0); // setting element of row i, column j - mat->data[i][j] = i * 1. + j *.1; + mat->data[i][j] = i * 10. + j; + } + } + + // print out matrix + fmat_print(mat); + + // helpers + fmat_rev(mat); + fmat_print(mat); + for (i = 0; i < mat->height; i++) { + for (j = 0; j < mat->length; j++) { + assert(mat->data[i][j] == i * 10. + mat->length - 1. 
- j); } } + + fmat_set_sample(mat, 3, 1, 1); + assert(fmat_get_sample(mat, 1, 1) == 3.); + + fmat_ones(mat); + assert_fmat_all_equal(mat, 1.); + + fmat_set(other_mat, .5); + assert_fmat_all_equal(other_mat, .5); + + fmat_weight(mat, other_mat); + assert_fmat_all_equal(mat, .5); + fvec_t channel_onstack; fvec_t *channel = &channel_onstack; fmat_get_channel(mat, 1, channel); - fvec_print (channel); - // print out matrix - fmat_print(mat); - // destroy it - del_fmat(mat); + assert(channel->data == mat->data[1]); + + // copy of the same size + fmat_copy(mat, other_mat); + del_fmat(other_mat); + + // copy to undersized + other_mat = new_fmat(height - 1, length); + fmat_copy(mat, other_mat); + del_fmat(other_mat); + + // copy from undersized + other_mat = new_fmat(height, length + 1); + fmat_copy(mat, other_mat); + + // wrong parameters + assert(new_fmat(-1, length) == NULL); + assert(new_fmat(height, -1) == NULL); + + // methods for wrappers with opaque structure + assert (fmat_get_channel_data(mat, 0) == mat->data[0]); + assert (fmat_get_data(mat) == mat->data); + + if (mat) + del_fmat(mat); + if (other_mat) + del_fmat(other_mat); return 0; } - diff --git a/tests/src/test-fvec.c b/tests/src/test-fvec.c index c53e396..5c2493c 100644 --- a/tests/src/test-fvec.c +++ b/tests/src/test-fvec.c @@ -1,32 +1,40 @@ #include "aubio.h" #include "utils_tests.h" +void assert_fvec_all_equal(fvec_t *vec, smpl_t scalar) +{ + uint_t i; + for (i = 0; i < vec->length; i++) { + assert(vec->data[i] == scalar); + } +} + int main (void) { - uint_t vec_size = 10, i; - fvec_t * vec = new_fvec (vec_size); + uint_t length = 10; + uint_t i; + + fvec_t * vec = new_fvec (length); + fvec_t * other_vec = new_fvec (length); + + assert (vec); + assert (other_vec); // vec->length matches requested size - assert(vec->length == vec_size); + assert(vec->length == length); // all elements are initialized to `0.` for ( i = 0; i < vec->length; i++ ) { assert(vec->data[i] == 0.); } - // all elements can be set 
to `0.` - fvec_zeros(vec); - for ( i = 0; i < vec->length; i++ ) { - assert(vec->data[i] == 0.); - } - fvec_print(vec); - // all elements can be set to `1.` fvec_ones(vec); - for ( i = 0; i < vec->length; i++ ) { - assert(vec->data[i] == 1.); - } - fvec_print(vec); + assert_fvec_all_equal(vec, 1.); + + // all elements can be set to `0.` + fvec_zeros(vec); + assert_fvec_all_equal(vec, 0.); // each element can be accessed directly for ( i = 0; i < vec->length; i++ ) { @@ -35,9 +43,31 @@ int main (void) } fvec_print(vec); - // now destroys the vector - del_fvec(vec); + fvec_set_sample(vec, 3, 2); + assert(fvec_get_sample(vec, 2) == 3); + + assert(fvec_get_data(vec) == vec->data); + // wrong parameters + assert(new_fvec(-1) == NULL); + + // copy to an identical size works + fvec_copy(vec, other_vec); + del_fvec(other_vec); + + // copy to a different size fail + other_vec = new_fvec(length + 1); + fvec_copy(vec, other_vec); + del_fvec(other_vec); + + // copy to a different size fail + other_vec = new_fvec(length - 1); + fvec_copy(vec, other_vec); + + // now destroys the vector + if (vec) + del_fvec(vec); + if (other_vec) + del_fvec(other_vec); return 0; } - diff --git a/tests/src/test-lvec.c b/tests/src/test-lvec.c index 17b5fc6..d1a8a0e 100644 --- a/tests/src/test-lvec.c +++ b/tests/src/test-lvec.c @@ -1,18 +1,47 @@ #include "aubio.h" #include "utils_tests.h" +void assert_lvec_all_equal(lvec_t *vec, lsmp_t scalar) +{ + uint_t i; + for (i = 0; i < vec->length; i++) { + assert(vec->data[i] == scalar); + } +} + int main (void) { - uint_t win_s = 32; // window size - lvec_t * sp = new_lvec (win_s); // input buffer - lvec_set_sample (sp, 2./3., 0); - PRINT_MSG(AUBIO_LSMP_FMT "\n", lvec_get_sample (sp, 0)); - lvec_print (sp); - lvec_ones (sp); - lvec_print (sp); - lvec_set_all (sp, 3./5.); - lvec_print (sp); - del_lvec(sp); + uint_t length = 32; // window size + + lvec_t * vec = new_lvec (length); // input buffer + + assert(vec); + + assert(vec->length == length); + + 
lvec_set_sample (vec, 3., 0); + assert(lvec_get_sample(vec, 0) == 3.); + + assert(lvec_get_data(vec) == vec->data); + + lvec_print (vec); + // note AUBIO_LSMP_FMT can be used to print lsmp_t + PRINT_MSG(AUBIO_LSMP_FMT "\n", lvec_get_sample (vec, 0)); + + lvec_set_all (vec, 2.); + assert_lvec_all_equal(vec, 2.); + + lvec_ones (vec); + assert_lvec_all_equal(vec, 1.); + + lvec_zeros (vec); + assert_lvec_all_equal(vec, 0.); + + del_lvec(vec); + + // wrong parameters + assert(new_lvec(0) == NULL); + assert(new_lvec(-1) == NULL); + return 0; } - diff --git a/tests/src/test-mathutils-window.c b/tests/src/test-mathutils-window.c index 4b45e7f..a75bb5a 100644 --- a/tests/src/test-mathutils-window.c +++ b/tests/src/test-mathutils-window.c @@ -7,8 +7,8 @@ int main (void) uint_t n_length = 4, n_types = 10, i, t; uint_t lengths[4] = { 8, 10, 15, 16 }; char *method = "default"; - char *window_types[10] = { "default", - "rectangle", "hamming", "hanning", "hanningz", + char *window_types[11] = { "default", + "ones", "rectangle", "hamming", "hanning", "hanningz", "blackman", "blackman_harris", "gaussian", "welch", "parzen"}; for ( t = 0; t < n_types; t ++ ) { @@ -26,6 +26,10 @@ int main (void) del_fvec(window); } } + + assert (new_aubio_window("parzen", -1) == NULL); + assert (new_aubio_window(NULL, length) == NULL); + assert (new_aubio_window("\0", length) == NULL); return 0; } diff --git a/tests/src/test-mathutils.c b/tests/src/test-mathutils.c index 0a6eedf..ca01500 100644 --- a/tests/src/test-mathutils.c +++ b/tests/src/test-mathutils.c @@ -100,9 +100,10 @@ int test_aubio_window (void) window = new_fvec(window_size); fvec_set_window(window, "rectangle"); fvec_print(window); + del_fvec(window); window_size /= 2.; - window = new_aubio_window("triangle", window_size); + window = new_aubio_window("parzen", window_size); fvec_print(window); del_fvec(window); @@ -116,5 +117,6 @@ int main (void) test_next_power_of_two(); test_miditofreq(); test_freqtomidi(); + test_aubio_window(); 
return 0; } diff --git a/tests/src/test-vecutils.c b/tests/src/test-vecutils.c new file mode 100644 index 0000000..5964a76 --- /dev/null +++ b/tests/src/test-vecutils.c @@ -0,0 +1,65 @@ +#include "aubio.h" +#include "utils_tests.h" + +void assert_fvec_all_almost_equal(fvec_t *vec, smpl_t scalar, smpl_t err) +{ + uint_t i; + for (i = 0; i < vec->length; i++) { + assert( fabs(vec->data[i] - scalar) < (smpl_t)err ); + } +} + +int main (void) +{ + uint_t length = 10; + + fvec_t * vec = new_fvec(length); + + fvec_set_all(vec, 2); + fvec_exp(vec); + assert_fvec_all_almost_equal(vec, exp(2), 1e-10); + + fvec_set_all(vec, 0); + fvec_cos(vec); + assert_fvec_all_almost_equal(vec, 1., 1e-10); + + fvec_set_all(vec, 0); + fvec_sin(vec); + assert_fvec_all_almost_equal(vec, 0., 1e-10); + + fvec_set_all(vec, -1); + fvec_abs(vec); + assert_fvec_all_almost_equal(vec, 1., 1e-10); + + fvec_set_all(vec, 4); + fvec_sqrt(vec); + assert_fvec_all_almost_equal(vec, 2., 1e-10); + + fvec_set_all(vec, 10.); + fvec_log10(vec); + assert_fvec_all_almost_equal(vec, 1., 1e-10); + + fvec_set_all(vec, 1.); + fvec_log(vec); + assert_fvec_all_almost_equal(vec, 0., 1e-10); + + fvec_set_all(vec, 1.6); + fvec_floor(vec); + assert_fvec_all_almost_equal(vec, 1., 1e-10); + + fvec_set_all(vec, 1.6); + fvec_ceil(vec); + assert_fvec_all_almost_equal(vec, 2., 1e-10); + + fvec_set_all(vec, 1.6); + fvec_round(vec); + assert_fvec_all_almost_equal(vec, 2., 1e-10); + + fvec_set_all(vec, 2); + fvec_pow(vec, 3); + assert_fvec_all_almost_equal(vec, 8., 1e-10); + + if (vec) + del_fvec(vec); + return 0; +} diff --git a/tests/src/utils/test-hist.c b/tests/src/utils/test-hist.c index ee3bc9b..63235a6 100644 --- a/tests/src/utils/test-hist.c +++ b/tests/src/utils/test-hist.c @@ -25,6 +25,7 @@ int main (void) del_aubio_hist(o); del_fvec(t); } + if (new_aubio_hist(0, 1, 0)) return 1; return 0; } diff --git a/tests/src/utils/test-log.c b/tests/src/utils/test-log.c new file mode 100644 index 0000000..1c6b15b --- /dev/null +++ 
b/tests/src/utils/test-log.c @@ -0,0 +1,60 @@ +#include <aubio.h> +#include <stdio.h> +#include "aubio_priv.h" + +const char_t *hdr = "CUSTOM HEADER: "; +const char_t *hdr2 = "OTHER HEADER: "; + +/* an example of a logging function that adds a custom header and prints + * aubio debug messages on stdout instead of stderr */ +void logging(int level, const char_t *message, void *data) { + FILE *out; + //fprintf(stdout, "using custom logging function\n"); + if (level == AUBIO_LOG_ERR) { + out = stderr; + } else { + out = stdout; + } + if ((level >= 0) && (data != NULL)) { + fprintf(out, "%s", (const char_t *)data); + } + fprintf(out, "%s", message); +} + +int main (void) +{ + fprintf(stdout, "### testing normal logging\n"); + AUBIO_ERR("testing normal AUBIO_LOG_ERR\n"); + AUBIO_INF("testing normal AUBIO_LOG_INF\n"); + AUBIO_WRN("testing normal AUBIO_LOG_WRN\n"); + AUBIO_MSG("testing normal AUBIO_LOG_MSG\n"); + AUBIO_DBG("testing normal AUBIO_LOG_DBG\n"); + + fprintf(stdout, "### testing with one custom function\n"); + aubio_log_set_function(logging, (void *)hdr); + AUBIO_ERR("testing custom set_function AUBIO_LOG_ERR\n"); + AUBIO_INF("testing custom set_function AUBIO_LOG_INF\n"); + AUBIO_WRN("testing custom set_function AUBIO_LOG_WRN\n"); + AUBIO_MSG("testing custom set_function AUBIO_LOG_MSG\n"); + AUBIO_DBG("testing custom set_function AUBIO_LOG_DBG\n"); + + fprintf(stdout, "### testing reset logging\n"); + aubio_log_reset(); + AUBIO_ERR("testing again normal AUBIO_LOG_ERR\n"); + AUBIO_INF("testing again normal AUBIO_LOG_INF\n"); + AUBIO_WRN("testing again normal AUBIO_LOG_WRN\n"); + AUBIO_MSG("testing again normal AUBIO_LOG_MSG\n"); + AUBIO_DBG("testing again normal AUBIO_LOG_DBG\n"); + + fprintf(stdout, "### testing per-level customization\n"); + aubio_log_set_level_function(AUBIO_LOG_ERR, logging, (void *)hdr2); + aubio_log_set_level_function(AUBIO_LOG_WRN, logging, NULL); + aubio_log_set_level_function(AUBIO_LOG_MSG, logging, (void *)hdr); + AUBIO_ERR("testing
custom set_level_function AUBIO_LOG_ERR\n"); + AUBIO_INF("testing again normal AUBIO_LOG_INF\n"); + AUBIO_WRN("testing custom set_level_function AUBIO_LOG_WRN with data=NULL\n"); + AUBIO_MSG("testing custom set_level_function AUBIO_LOG_MSG\n"); + AUBIO_DBG("testing again normal AUBIO_LOG_DBG\n"); + + return 0; +} diff --git a/tests/utils_tests.h b/tests/utils_tests.h index 97d5297..cf1a446 100644 --- a/tests/utils_tests.h +++ b/tests/utils_tests.h @@ -5,6 +5,36 @@ #include <assert.h> #include "config.h" +#ifdef HAVE_STRING_H +#include <string.h> +#endif + +#ifdef HAVE_UNISTD_H +#include <unistd.h> // unlink, close +#endif + +#ifdef HAVE_LIMITS_H +#include <limits.h> // PATH_MAX +#endif /* HAVE_LIMITS_H */ +#ifndef PATH_MAX +#define PATH_MAX 1024 +#endif + +#if defined(HAVE_WIN_HACKS) && !defined(__GNUC__) +#include <io.h> // _access +#endif + +// This macro is used to pass a string to the msvc compiler: since msvc's -D flag +// strips the quotes, we define the string without quotes and re-add them with +// this macro. + +#define REDEFINESTRING(x) #x +#define DEFINEDSTRING(x) REDEFINESTRING(x) + +#ifndef AUBIO_TESTS_SOURCE +#error "AUBIO_TESTS_SOURCE is not defined" +#endif + #ifdef HAVE_C99_VARARGS_MACROS #define PRINT_ERR(...) fprintf(stderr, "AUBIO-TESTS ERROR: " __VA_ARGS__) #define PRINT_MSG(...) fprintf(stdout, __VA_ARGS__) @@ -25,21 +55,20 @@ #define RAND_MAX 32767 #endif -// are we on windows ? or are we using -std=c99 ? -#if defined(HAVE_WIN_HACKS) || defined(__STRICT_ANSI__) -// http://en.wikipedia.org/wiki/Linear_congruential_generator -// no srandom/random on win32 +#if defined(HAVE_WIN_HACKS) -uint_t srandom_seed = 1029; +// use srand/rand on windows +#define srandom srand +#define random rand -void srandom(uint_t new_seed) { - srandom_seed = new_seed; -} +#elif defined(__STRICT_ANSI__) + +// workaround to build with -std=c99 (for instance with older cygwin), +// assuming libc is recent enough to support these functions.
+extern void srandom(unsigned); +extern int random(void); +extern int mkstemp(char *pat); -uint_t random(void) { - srandom_seed = 1664525 * srandom_seed + 1013904223; - return srandom_seed; -} #endif void utils_init_random (void); @@ -47,7 +76,113 @@ void utils_init_random (void) { time_t now = time(0); struct tm *tm_struct = localtime(&now); - int seed = tm_struct->tm_sec; + size_t **tm_address = (void*)&tm_struct; + int seed = tm_struct->tm_sec + (size_t)tm_address; //PRINT_WRN("current seed: %d\n", seed); - srandom (seed); + srandom ((unsigned int)seed); +} + +// create_temp_sink / close_temp_sink +#if defined(__GNUC__) // mkstemp + +int check_source(char *source_path) +{ + return access(source_path, R_OK); +} + +int create_temp_sink(char *sink_path) +{ + return mkstemp(sink_path); +} + +int close_temp_sink(char *sink_path, int sink_fildes) +{ + int err; + if ((err = close(sink_fildes)) != 0) return err; + if ((err = unlink(sink_path)) != 0) return err; + return err; +} + +#elif defined(HAVE_WIN_HACKS) //&& !defined(__GNUC__) +// windows workaround, where mkstemp does not exist... + +int check_source(char *source_path) +{ + return _access(source_path, 04); +} + +int create_temp_sink(char *templ) +{ + int i = 0; + static const char letters[] = "abcdefg0123456789"; + int letters_len = strlen(letters); + int templ_len = strlen(templ); + if (templ_len == 0) return 0; + utils_init_random(); + for (i = 0; i < 6; i++) + { + templ[templ_len - 1 - i] = letters[rand() % letters_len]; + } + return 1; +} + +int close_temp_sink(char* sink_path, int sink_fildes) { + // the file is never opened when not using mkstemp, so there is nothing to close + if (sink_fildes == 0) return 1; + return _unlink(sink_path); +} + +#else // windows workaround +// otherwise, we don't really know what to do yet +#error "mkstemp undefined, but not on windows. additional workaround required."
+#endif + +// pass progname / default +int run_on_default_source( int main(int, char**) ) +{ + const int argc = 2; + int err = 0; + char** argv = (char**)calloc(argc, sizeof(char*)); + argv[0] = __FILE__; + argv[1] = DEFINEDSTRING(AUBIO_TESTS_SOURCE); + // check if the file can be read + if ( check_source(argv[1]) ) return 1; + err = main(argc, argv); + if (argv) free(argv); + return err; +} + +int run_on_default_sink( int main(int, char**) ) +{ + const int argc = 2; + int err = 0; + char** argv = (char**)calloc(argc, sizeof(char*)); + char sink_path[PATH_MAX] = "tmp_aubio_XXXXXX"; + int fd = create_temp_sink(sink_path); + if (!fd) return 1; + argv[0] = __FILE__; + argv[1] = sink_path; + err = main(argc, argv); + close_temp_sink(sink_path, fd); + if (argv) free(argv); + return err; +} + +int run_on_default_source_and_sink( int main(int, char**) ) +{ + const int argc = 3; + int err = 0; + char** argv = (char**)calloc(argc, sizeof(char*)); + char sink_path[PATH_MAX] = "tmp_aubio_XXXXXX"; + int fd = create_temp_sink(sink_path); + if (!fd) return 1; + argv[0] = __FILE__; + argv[1] = DEFINEDSTRING(AUBIO_TESTS_SOURCE); + argv[2] = sink_path; + // check if the file can be read + if ( check_source(argv[1]) ) return 1; + err = main(argc, argv); + close_temp_sink(sink_path, fd); + if (argv) free(argv); + return err; } diff --git a/tests/wscript_build b/tests/wscript_build index 45e83b8..c0325ea 100644 --- a/tests/wscript_build +++ b/tests/wscript_build @@ -1,5 +1,6 @@ # vim:set syntax=python: +import sys import os.path uselib = ['aubio'] @@ -7,13 +8,30 @@ uselib = ['aubio'] includes = ['../src', '.'] programs_sources = ctx.path.ant_glob('src/**/*.c') +test_sound_target = '44100Hz_44100f_sine441_stereo.wav' +test_sound_abspath = bld.path.get_bld().make_node(test_sound_target) +# workaround to double escape backslash characters on windows +test_sound_abspath = str(test_sound_abspath).replace('\\', '\\\\') + +b = bld(name='create_tests_source', + rule=sys.executable + ' ${SRC} 
${TGT}', + source='create_tests_source.py', + target=test_sound_target) +# use post() to create the task, keep a reference to it +b.post() +create_tests_source = b.tasks[0] + for source_file in programs_sources: target = os.path.basename(os.path.splitext(str(source_file))[0]) - bld(features = 'c cprogram test', + a = bld(features = 'c cprogram test', source = source_file, target = target, includes = includes, use = uselib, install_path = None, - defines = 'AUBIO_UNSTABLE_API=1', + defines = ['AUBIO_UNSTABLE_API=1', + 'AUBIO_TESTS_SOURCE={}'.format(test_sound_abspath)] ) + a.post() + # make sure the unit_test task runs *after* the source is created + a.tasks[-1].set_run_after(create_tests_source) diff --git a/this_version.py b/this_version.py new file mode 100644 index 0000000..4e85914 --- /dev/null +++ b/this_version.py @@ -0,0 +1,107 @@ +#! python +import os +import sys + +__version_info = {} # keep a reference to parse VERSION once + +def get_version_info(): + # read from VERSION + # return dictionary filled with content of version + if not __version_info: + this_file_dir = os.path.dirname(os.path.abspath(__file__)) + version_file = os.path.join(this_file_dir, 'VERSION') + + if not os.path.isfile(version_file): + raise SystemError("VERSION file not found.") + + for l in open(version_file).readlines(): + if l.startswith('AUBIO_MAJOR_VERSION'): + __version_info['AUBIO_MAJOR_VERSION'] = int(l.split('=')[1]) + if l.startswith('AUBIO_MINOR_VERSION'): + __version_info['AUBIO_MINOR_VERSION'] = int(l.split('=')[1]) + if l.startswith('AUBIO_PATCH_VERSION'): + __version_info['AUBIO_PATCH_VERSION'] = int(l.split('=')[1]) + if l.startswith('AUBIO_VERSION_STATUS'): + __version_info['AUBIO_VERSION_STATUS'] = \ + l.split('=')[1].strip()[1:-1] + + if l.startswith('LIBAUBIO_LT_CUR'): + __version_info['LIBAUBIO_LT_CUR'] = int(l.split('=')[1]) + if l.startswith('LIBAUBIO_LT_REV'): + __version_info['LIBAUBIO_LT_REV'] = int(l.split('=')[1]) + if l.startswith('LIBAUBIO_LT_AGE'): + 
__version_info['LIBAUBIO_LT_AGE'] = int(l.split('=')[1]) + + if len(__version_info) < 6: + raise SystemError("Failed parsing VERSION file.") + + # switch version status with commit sha in alpha releases + if __version_info['AUBIO_VERSION_STATUS'] and \ + '~alpha' in __version_info['AUBIO_VERSION_STATUS']: + AUBIO_GIT_SHA = get_git_revision_hash() + if AUBIO_GIT_SHA: + __version_info['AUBIO_VERSION_STATUS'] = '~git+' + AUBIO_GIT_SHA + + return __version_info + +def get_libaubio_version(): + verfmt = '%(LIBAUBIO_LT_CUR)s.%(LIBAUBIO_LT_REV)s.%(LIBAUBIO_LT_AGE)s' + return str(verfmt % get_version_info()) + +def get_aubio_version(): + verfmt = '%(AUBIO_MAJOR_VERSION)s.%(AUBIO_MINOR_VERSION)s.%(AUBIO_PATCH_VERSION)s%(AUBIO_VERSION_STATUS)s' + return str(verfmt % get_version_info()) + +def get_aubio_pyversion(): + # convert to version for python according to pep 440 + # see https://www.python.org/dev/peps/pep-0440/ + # outputs MAJ.MIN.PATCH[a0[+git.<sha>[.mods]]] + aubio_version = get_aubio_version() + if '~git+' in aubio_version: + pep440str = aubio_version.replace('+', '.') + verstr = pep440str.replace('~git.', 'a0+') + elif '~alpha' in aubio_version: + verstr = aubio_version.replace('~alpha', 'a0') + else: + verstr = aubio_version + return verstr + +def get_git_revision_hash(short=True): + # get commit id, with +mods if local tree is not clean + if not os.path.isdir('.git'): + # print('Version : not in git repository : can\'t get sha') + return None + import subprocess + aubio_dir = os.path.dirname(os.path.abspath(__file__)) + if not os.path.exists(aubio_dir): + raise SystemError("git / root folder not found") + gitcmd = ['git', '-C', aubio_dir, 'rev-parse'] + if short: + gitcmd.append('--short') + gitcmd.append('HEAD') + try: + gitsha = subprocess.check_output(gitcmd).strip().decode('utf8') + except Exception as e: + sys.stderr.write('git command error :%s\n' % e) + return None + # check if we have a clean tree + gitcmd = ['git', '-C', aubio_dir, 'status', 
'--porcelain'] + try: + output = subprocess.check_output(gitcmd).decode('utf8') + if len(output): + sys.stderr.write('Info: current tree is not clean\n\n') + sys.stderr.write(output + '\n') + gitsha += '+mods' + except subprocess.CalledProcessError as e: + sys.stderr.write('git command error: %s\n' % e) + pass + return gitsha + +if __name__ == '__main__': + if len(sys.argv) > 1 and sys.argv[1] == '-v': + print (get_aubio_version()) + elif len(sys.argv) > 1 and sys.argv[1] == '-p': + print (get_aubio_pyversion()) + else: + print ('%30s'% 'aubio version:', get_aubio_version()) + print ('%30s'% 'python-aubio version:', get_aubio_pyversion()) @@ -1,7 +1,7 @@ #!/usr/bin/env python -# encoding: ISO8859-1 -# Thomas Nagy, 2005-2016 - +# encoding: latin-1 +# Thomas Nagy, 2005-2018 +# """ Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions @@ -32,13 +32,13 @@ POSSIBILITY OF SUCH DAMAGE. import os, sys, inspect -VERSION="1.8.22" -REVISION="596301b77b6d6efab064109ecd67cd79" -GIT="129ec0cfe7dc9d2880e085fab2a449c651ac8285" +VERSION="2.0.14" +REVISION="d2a83c080c7603d2106440d4d4e18e69" +GIT="x" INSTALL='' -C1='#.'
-C2='#,' -C3='#&' +C1='#8' +C2='#(' +C3='#%' cwd = os.getcwd() join = os.path.join diff --git a/waf_gensyms.py b/waf_gensyms.py new file mode 100644 index 0000000..f9ddd70 --- /dev/null +++ b/waf_gensyms.py @@ -0,0 +1,40 @@ +import re +import os.path +from waflib import TaskGen, Task +from waflib.Context import STDOUT +from waflib.Utils import O644 + +class gen_sym_file(Task.Task): + color = 'BLUE' + inst_to = '${LIBDIR}' + def run(self): + syms = {} + reg = getattr(self.generator, 'export_symbols_regex','.+?') + if 'msvc' in self.env.CC_NAME: + outputs = [x.abspath() for x in self.generator.link_task.outputs] + binary_path = list(filter(lambda x: x.endswith('lib'), outputs))[0] + reg_compiled = re.compile(r'External\s+\|\s+(?P<symbol>%s)\b' % reg) + cmd =(self.env.LINK_CC) + ['/dump', '/symbols', binary_path] + else: # using gcc? assume we have nm + outputs = [x.abspath() for x in self.generator.link_task.outputs] + binary_path = list(filter(lambda x: x.endswith('dll'), outputs))[0] + reg_compiled = re.compile(r'(T|D)\s+_(?P<symbol>%s)\b'%reg) + cmd = (self.env.NM or ['nm']) + ['-g', binary_path] + dump_output = self.generator.bld.cmd_and_log(cmd, quiet=STDOUT) + syms = set([]) + for m in reg_compiled.finditer(dump_output): + syms.add(m.group('symbol')) + syms = list(syms) + syms.sort() + self.outputs[0].write('EXPORTS\n'+'\n'.join(syms)) + +@TaskGen.feature('gensyms') +@TaskGen.after_method('process_source','process_use','apply_link','process_uselib_local','propagate_uselib_vars') +def gen_symbols(self): + #sym_file = self.path.find_or_declare(self.target + '.def') + sym_file_name = os.path.splitext(self.link_task.outputs[0].abspath())[0] + '.def' + sym_file = self.path.find_or_declare(sym_file_name) + symtask = self.create_task('gen_sym_file', self.link_task.outputs, sym_file) + self.add_install_files(install_to=self.link_task.inst_to, install_from=sym_file, + chmod=O644, task=self.link_task) + diff --git a/waflib/Build.py b/waflib/Build.py index 
f24ce07..e54c6bc 100644 --- a/waflib/Build.py +++ b/waflib/Build.py @@ -7,17 +7,15 @@ try: import cPickle except ImportError: import pickle as cPickle -from waflib import Runner,TaskGen,Utils,ConfigSet,Task,Logs,Options,Context,Errors -import waflib.Node +from waflib import Node,Runner,TaskGen,Utils,ConfigSet,Task,Logs,Options,Context,Errors CACHE_DIR='c4che' CACHE_SUFFIX='_cache.py' INSTALL=1337 UNINSTALL=-1337 -SAVED_ATTRS='root node_deps raw_deps task_sigs'.split() +SAVED_ATTRS='root node_sigs task_sigs imp_sigs raw_deps node_deps'.split() CFG_FILES='cfg_files' POST_AT_ONCE=0 POST_LAZY=1 -POST_BOTH=2 PROTOCOL=-1 if sys.platform=='cli': PROTOCOL=0 @@ -29,19 +27,20 @@ class BuildContext(Context.Context): super(BuildContext,self).__init__(**kw) self.is_install=0 self.top_dir=kw.get('top_dir',Context.top_dir) - self.run_dir=kw.get('run_dir',Context.run_dir) - self.post_mode=POST_AT_ONCE self.out_dir=kw.get('out_dir',Context.out_dir) - self.cache_dir=kw.get('cache_dir',None) + self.run_dir=kw.get('run_dir',Context.run_dir) + self.launch_dir=Context.launch_dir + self.post_mode=POST_LAZY + self.cache_dir=kw.get('cache_dir') if not self.cache_dir: self.cache_dir=os.path.join(self.out_dir,CACHE_DIR) self.all_envs={} + self.node_sigs={} self.task_sigs={} + self.imp_sigs={} self.node_deps={} self.raw_deps={} - self.cache_dir_contents={} self.task_gen_cache_names={} - self.launch_dir=Context.launch_dir self.jobs=Options.options.jobs self.targets=Options.options.targets self.keep=Options.options.keep @@ -50,31 +49,22 @@ class BuildContext(Context.Context): self.current_group=0 self.groups=[] self.group_names={} + for v in SAVED_ATTRS: + if not hasattr(self,v): + setattr(self,v,{}) def get_variant_dir(self): if not self.variant: return self.out_dir - return os.path.join(self.out_dir,self.variant) + return os.path.join(self.out_dir,os.path.normpath(self.variant)) variant_dir=property(get_variant_dir,None) def __call__(self,*k,**kw): kw['bld']=self ret=TaskGen.task_gen(*k,**kw) 
self.task_gen_cache_names={} - self.add_to_group(ret,group=kw.get('group',None)) + self.add_to_group(ret,group=kw.get('group')) return ret - def rule(self,*k,**kw): - def f(rule): - ret=self(*k,**kw) - ret.rule=rule - return ret - return f def __copy__(self): - raise Errors.WafError('build contexts are not supposed to be copied') - def install_files(self,*k,**kw): - pass - def install_as(self,*k,**kw): - pass - def symlink_as(self,*k,**kw): - pass + raise Errors.WafError('build contexts cannot be copied') def load_envs(self): node=self.root.find_node(self.cache_dir) if not node: @@ -88,12 +78,8 @@ class BuildContext(Context.Context): self.all_envs[name]=env for f in env[CFG_FILES]: newnode=self.root.find_resource(f) - try: - h=Utils.h_file(newnode.abspath()) - except(IOError,AttributeError): - Logs.error('cannot find %r'%f) - h=Utils.SIG_NIL - newnode.sig=h + if not newnode or not newnode.exists(): + raise Errors.WafError('Missing configuration file %r, reconfigure the project!'%f) def init_dirs(self): if not(os.path.isabs(self.top_dir)and os.path.isabs(self.out_dir)): raise Errors.WafError('The project was not configured: run "waf configure" first!') @@ -106,7 +92,7 @@ class BuildContext(Context.Context): self.load_envs() self.execute_build() def execute_build(self): - Logs.info("Waf: Entering directory `%s'"%self.variant_dir) + Logs.info("Waf: Entering directory `%s'",self.variant_dir) self.recurse([self.run_dir]) self.pre_build() self.timer=Utils.Timer() @@ -114,10 +100,15 @@ class BuildContext(Context.Context): self.compile() finally: if self.progress_bar==1 and sys.stderr.isatty(): - c=len(self.returned_tasks)or 1 + c=self.producer.processed or 1 m=self.progress_line(c,c,Logs.colors.BLUE,Logs.colors.NORMAL) Logs.info(m,extra={'stream':sys.stderr,'c1':Logs.colors.cursor_off,'c2':Logs.colors.cursor_on}) - Logs.info("Waf: Leaving directory `%s'"%self.variant_dir) + Logs.info("Waf: Leaving directory `%s'",self.variant_dir) + try: + self.producer.bld=None + del 
self.producer + except AttributeError: + pass self.post_build() def restore(self): try: @@ -125,28 +116,28 @@ class BuildContext(Context.Context): except EnvironmentError: pass else: - if env['version']<Context.HEXVERSION: - raise Errors.WafError('Version mismatch! reconfigure the project') - for t in env['tools']: + if env.version<Context.HEXVERSION: + raise Errors.WafError('Project was configured with a different version of Waf, please reconfigure it') + for t in env.tools: self.setup(**t) dbfn=os.path.join(self.variant_dir,Context.DBFILE) try: data=Utils.readf(dbfn,'rb') - except(IOError,EOFError): - Logs.debug('build: Could not load the build cache %s (missing)'%dbfn) + except(EnvironmentError,EOFError): + Logs.debug('build: Could not load the build cache %s (missing)',dbfn) else: try: - waflib.Node.pickle_lock.acquire() - waflib.Node.Nod3=self.node_class + Node.pickle_lock.acquire() + Node.Nod3=self.node_class try: data=cPickle.loads(data) - except Exception ,e: - Logs.debug('build: Could not pickle the build cache %s: %r'%(dbfn,e)) + except Exception as e: + Logs.debug('build: Could not pickle the build cache %s: %r',dbfn,e) else: for x in SAVED_ATTRS: - setattr(self,x,data[x]) + setattr(self,x,data.get(x,{})) finally: - waflib.Node.pickle_lock.release() + Node.pickle_lock.release() self.init_dirs() def store(self): data={} @@ -154,11 +145,11 @@ class BuildContext(Context.Context): data[x]=getattr(self,x) db=os.path.join(self.variant_dir,Context.DBFILE) try: - waflib.Node.pickle_lock.acquire() - waflib.Node.Nod3=self.node_class + Node.pickle_lock.acquire() + Node.Nod3=self.node_class x=cPickle.dumps(data,PROTOCOL) finally: - waflib.Node.pickle_lock.release() + Node.pickle_lock.release() Utils.writef(db+'.tmp',x,m='wb') try: st=os.stat(db) @@ -172,23 +163,27 @@ class BuildContext(Context.Context): Logs.debug('build: compile()') self.producer=Runner.Parallel(self,self.jobs) self.producer.biter=self.get_build_iterator() - self.returned_tasks=[] try: 
self.producer.start() except KeyboardInterrupt: - self.store() + if self.is_dirty(): + self.store() raise else: - if self.producer.dirty: + if self.is_dirty(): self.store() if self.producer.error: raise Errors.BuildError(self.producer.error) + def is_dirty(self): + return self.producer.dirty def setup(self,tool,tooldir=None,funs=None): if isinstance(tool,list): - for i in tool:self.setup(i,tooldir) + for i in tool: + self.setup(i,tooldir) return module=Context.load_tool(tool,tooldir) - if hasattr(module,"setup"):module.setup(self) + if hasattr(module,"setup"): + module.setup(self) def get_env(self): try: return self.all_envs[self.variant] @@ -198,18 +193,20 @@ class BuildContext(Context.Context): self.all_envs[self.variant]=val env=property(get_env,set_env) def add_manual_dependency(self,path,value): - if path is None: - raise ValueError('Invalid input') - if isinstance(path,waflib.Node.Node): + if not path: + raise ValueError('Invalid input path %r'%path) + if isinstance(path,Node.Node): node=path elif os.path.isabs(path): node=self.root.find_resource(path) else: node=self.path.find_resource(path) + if not node: + raise ValueError('Could not find the path %r'%path) if isinstance(value,list): - self.deps_man[id(node)].extend(value) + self.deps_man[node].extend(value) else: - self.deps_man[id(node)].append(value) + self.deps_man[node].append(value) def launch_node(self): try: return self.p_ln @@ -232,9 +229,8 @@ class BuildContext(Context.Context): except KeyError: pass lst=[env[a]for a in vars_lst] - ret=Utils.h_list(lst) + cache[idx]=ret=Utils.h_list(lst) Logs.debug('envhash: %s %r',Utils.to_hex(ret),lst) - cache[idx]=ret return ret def get_tgen_by_name(self,name): cache=self.task_gen_cache_names @@ -249,20 +245,20 @@ class BuildContext(Context.Context): return cache[name] except KeyError: raise Errors.WafError('Could not find a task generator for the name %r'%name) - def progress_line(self,state,total,col1,col2): + def progress_line(self,idx,total,col1,col2): if 
not sys.stderr.isatty(): return'' n=len(str(total)) Utils.rot_idx+=1 ind=Utils.rot_chr[Utils.rot_idx%4] - pc=(100.*state)/total - eta=str(self.timer) - fs="[%%%dd/%%%dd][%%s%%2d%%%%%%s][%s]["%(n,n,ind) - left=fs%(state,total,col1,pc,col2) - right='][%s%s%s]'%(col1,eta,col2) + pc=(100.*idx)/total + fs="[%%%dd/%%d][%%s%%2d%%%%%%s][%s]["%(n,ind) + left=fs%(idx,total,col1,pc,col2) + right='][%s%s%s]'%(col1,self.timer,col2) cols=Logs.get_term_cols()-len(left)-len(right)+2*len(col1)+2*len(col2) - if cols<7:cols=7 - ratio=((cols*state)//total)-1 + if cols<7: + cols=7 + ratio=((cols*idx)//total)-1 bar=('='*ratio+'>').ljust(cols) msg=Logs.indicator%(left,bar,right) return msg @@ -293,7 +289,7 @@ class BuildContext(Context.Context): return self.group_names[x] return self.groups[x] def add_to_group(self,tgen,group=None): - assert(isinstance(tgen,TaskGen.task_gen)or isinstance(tgen,Task.TaskBase)) + assert(isinstance(tgen,TaskGen.task_gen)or isinstance(tgen,Task.Task)) tgen.bld=self self.get_group(group).append(tgen) def get_group_name(self,g): @@ -305,14 +301,14 @@ class BuildContext(Context.Context): return'' def get_group_idx(self,tg): se=id(tg) - for i in range(len(self.groups)): - for t in self.groups[i]: + for i,tmp in enumerate(self.groups): + for t in tmp: if id(t)==se: return i return None def add_group(self,name=None,move=True): if name and name in self.group_names: - Logs.error('add_group: name %s already present'%name) + raise Errors.WafError('add_group: name %s already present',name) g=[] self.group_names[name]=g self.groups.append(g) @@ -321,8 +317,8 @@ class BuildContext(Context.Context): def set_group(self,idx): if isinstance(idx,str): g=self.group_names[idx] - for i in range(len(self.groups)): - if id(g)==id(self.groups[i]): + for i,tmp in enumerate(self.groups): + if id(g)==id(tmp): self.current_group=i break else: @@ -354,23 +350,20 @@ class BuildContext(Context.Context): lst.extend(g) return lst def post_group(self): + def tgpost(tg): + try: + f=tg.post + 
except AttributeError: + pass + else: + f() if self.targets=='*': - for tg in self.groups[self.cur]: - try: - f=tg.post - except AttributeError: - pass - else: - f() + for tg in self.groups[self.current_group]: + tgpost(tg) elif self.targets: - if self.cur<self._min_grp: - for tg in self.groups[self.cur]: - try: - f=tg.post - except AttributeError: - pass - else: - f() + if self.current_group<self._min_grp: + for tg in self.groups[self.current_group]: + tgpost(tg) else: for tg in self._exact_tg: tg.post() @@ -380,16 +373,28 @@ class BuildContext(Context.Context): Logs.warn('Building from the build directory, forcing --targets=*') ln=self.srcnode elif not ln.is_child_of(self.srcnode): - Logs.warn('CWD %s is not under %s, forcing --targets=* (run distclean?)'%(ln.abspath(),self.srcnode.abspath())) + Logs.warn('CWD %s is not under %s, forcing --targets=* (run distclean?)',ln.abspath(),self.srcnode.abspath()) ln=self.srcnode - for tg in self.groups[self.cur]: + def is_post(tg,ln): try: - f=tg.post + p=tg.path except AttributeError: pass else: - if tg.path.is_child_of(ln): - f() + if p.is_child_of(ln): + return True + def is_post_group(): + for i,g in enumerate(self.groups): + if i>self.current_group: + for tg in g: + if is_post(tg,ln): + return True + if self.post_mode==POST_LAZY and ln!=self.srcnode: + if is_post_group(): + ln=self.srcnode + for tg in self.groups[self.current_group]: + if is_post(tg,ln): + tgpost(tg) def get_tasks_group(self,idx): tasks=[] for tg in self.groups[idx]: @@ -399,99 +404,169 @@ class BuildContext(Context.Context): tasks.append(tg) return tasks def get_build_iterator(self): - self.cur=0 if self.targets and self.targets!='*': (self._min_grp,self._exact_tg)=self.get_targets() - global lazy_post if self.post_mode!=POST_LAZY: - while self.cur<len(self.groups): + for self.current_group,_ in enumerate(self.groups): self.post_group() - self.cur+=1 - self.cur=0 - while self.cur<len(self.groups): + for self.current_group,_ in enumerate(self.groups): 
if self.post_mode!=POST_AT_ONCE: self.post_group() - tasks=self.get_tasks_group(self.cur) + tasks=self.get_tasks_group(self.current_group) Task.set_file_constraints(tasks) Task.set_precedence_constraints(tasks) self.cur_tasks=tasks - self.cur+=1 - if not tasks: - continue - yield tasks + if tasks: + yield tasks while 1: yield[] + def install_files(self,dest,files,**kw): + assert(dest) + tg=self(features='install_task',install_to=dest,install_from=files,**kw) + tg.dest=tg.install_to + tg.type='install_files' + if not kw.get('postpone',True): + tg.post() + return tg + def install_as(self,dest,srcfile,**kw): + assert(dest) + tg=self(features='install_task',install_to=dest,install_from=srcfile,**kw) + tg.dest=tg.install_to + tg.type='install_as' + if not kw.get('postpone',True): + tg.post() + return tg + def symlink_as(self,dest,src,**kw): + assert(dest) + tg=self(features='install_task',install_to=dest,install_from=src,**kw) + tg.dest=tg.install_to + tg.type='symlink_as' + tg.link=src + if not kw.get('postpone',True): + tg.post() + return tg +@TaskGen.feature('install_task') +@TaskGen.before_method('process_rule','process_source') +def process_install_task(self): + self.add_install_task(**self.__dict__) +@TaskGen.taskgen_method +def add_install_task(self,**kw): + if not self.bld.is_install: + return + if not kw['install_to']: + return + if kw['type']=='symlink_as'and Utils.is_win32: + if kw.get('win32_install'): + kw['type']='install_as' + else: + return + tsk=self.install_task=self.create_task('inst') + tsk.chmod=kw.get('chmod',Utils.O644) + tsk.link=kw.get('link','')or kw.get('install_from','') + tsk.relative_trick=kw.get('relative_trick',False) + tsk.type=kw['type'] + tsk.install_to=tsk.dest=kw['install_to'] + tsk.install_from=kw['install_from'] + tsk.relative_base=kw.get('cwd')or kw.get('relative_base',self.path) + tsk.install_user=kw.get('install_user') + tsk.install_group=kw.get('install_group') + tsk.init_files() + if not kw.get('postpone',True): + 
tsk.run_now() + return tsk +@TaskGen.taskgen_method +def add_install_files(self,**kw): + kw['type']='install_files' + return self.add_install_task(**kw) +@TaskGen.taskgen_method +def add_install_as(self,**kw): + kw['type']='install_as' + return self.add_install_task(**kw) +@TaskGen.taskgen_method +def add_symlink_as(self,**kw): + kw['type']='symlink_as' + return self.add_install_task(**kw) class inst(Task.Task): - color='CYAN' + def __str__(self): + return'' def uid(self): - lst=[self.dest,self.path]+self.source - return Utils.h_list(repr(lst)) - def post(self): - buf=[] - for x in self.source: - if isinstance(x,waflib.Node.Node): - y=x - else: - y=self.path.find_resource(x) - if not y: - if os.path.isabs(x): - y=self.bld.root.make_node(x) - else: - y=self.path.make_node(x) - buf.append(y) - self.inputs=buf + lst=self.inputs+self.outputs+[self.link,self.generator.path.abspath()] + return Utils.h_list(lst) + def init_files(self): + if self.type=='symlink_as': + inputs=[] + else: + inputs=self.generator.to_nodes(self.install_from) + if self.type=='install_as': + assert len(inputs)==1 + self.set_inputs(inputs) + dest=self.get_install_path() + outputs=[] + if self.type=='symlink_as': + if self.relative_trick: + self.link=os.path.relpath(self.link,os.path.dirname(dest)) + outputs.append(self.generator.bld.root.make_node(dest)) + elif self.type=='install_as': + outputs.append(self.generator.bld.root.make_node(dest)) + else: + for y in inputs: + if self.relative_trick: + destfile=os.path.join(dest,y.path_from(self.relative_base)) + else: + destfile=os.path.join(dest,y.name) + outputs.append(self.generator.bld.root.make_node(destfile)) + self.set_outputs(outputs) def runnable_status(self): ret=super(inst,self).runnable_status() - if ret==Task.SKIP_ME: + if ret==Task.SKIP_ME and self.generator.bld.is_install: return Task.RUN_ME return ret - def __str__(self): - return'' - def run(self): - return self.generator.exec_task() + def post_run(self): + pass def 
get_install_path(self,destdir=True): - dest=Utils.subst_vars(self.dest,self.env) - dest=dest.replace('/',os.sep) + if isinstance(self.install_to,Node.Node): + dest=self.install_to.abspath() + else: + dest=Utils.subst_vars(self.install_to,self.env) + if not os.path.isabs(dest): + dest=os.path.join(self.env.PREFIX,dest) if destdir and Options.options.destdir: dest=os.path.join(Options.options.destdir,os.path.splitdrive(dest)[1].lstrip(os.sep)) return dest - def exec_install_files(self): - destpath=self.get_install_path() - if not destpath: - raise Errors.WafError('unknown installation path %r'%self.generator) - for x,y in zip(self.source,self.inputs): - if self.relative_trick: - destfile=os.path.join(destpath,y.path_from(self.path)) - else: - destfile=os.path.join(destpath,y.name) - self.generator.bld.do_install(y.abspath(),destfile,chmod=self.chmod,tsk=self) - def exec_install_as(self): - destfile=self.get_install_path() - self.generator.bld.do_install(self.inputs[0].abspath(),destfile,chmod=self.chmod,tsk=self) - def exec_symlink_as(self): - destfile=self.get_install_path() - src=self.link - if self.relative_trick: - src=os.path.relpath(src,os.path.dirname(destfile)) - self.generator.bld.do_link(src,destfile,tsk=self) -class InstallContext(BuildContext): - '''installs the targets on the system''' - cmd='install' - def __init__(self,**kw): - super(InstallContext,self).__init__(**kw) - self.uninstall=[] - self.is_install=INSTALL - def copy_fun(self,src,tgt,**kw): + def copy_fun(self,src,tgt): if Utils.is_win32 and len(tgt)>259 and not tgt.startswith('\\\\?\\'): tgt='\\\\?\\'+tgt shutil.copy2(src,tgt) - os.chmod(tgt,kw.get('chmod',Utils.O644)) - def do_install(self,src,tgt,**kw): - d,_=os.path.split(tgt) - if not d: - raise Errors.WafError('Invalid installation given %r->%r'%(src,tgt)) - Utils.check_dir(d) - srclbl=src.replace(self.srcnode.abspath()+os.sep,'') + self.fix_perms(tgt) + def rm_empty_dirs(self,tgt): + while tgt: + tgt=os.path.dirname(tgt) + try: + 
os.rmdir(tgt) + except OSError: + break + def run(self): + is_install=self.generator.bld.is_install + if not is_install: + return + for x in self.outputs: + if is_install==INSTALL: + x.parent.mkdir() + if self.type=='symlink_as': + fun=is_install==INSTALL and self.do_link or self.do_unlink + fun(self.link,self.outputs[0].abspath()) + else: + fun=is_install==INSTALL and self.do_install or self.do_uninstall + launch_node=self.generator.bld.launch_node() + for x,y in zip(self.inputs,self.outputs): + fun(x.abspath(),y.abspath(),x.path_from(launch_node)) + def run_now(self): + status=self.runnable_status() + if status not in(Task.RUN_ME,Task.SKIP_ME): + raise Errors.TaskNotReady('Could not process %r: status %r'%(self,status)) + self.run() + self.hasrun=Task.SUCCESS + def do_install(self,src,tgt,lbl,**kw): if not Options.options.force: try: st1=os.stat(tgt) @@ -500,11 +575,11 @@ class InstallContext(BuildContext): pass else: if st1.st_mtime+2>=st2.st_mtime and st1.st_size==st2.st_size: - if not self.progress_bar: - Logs.info('- install %s (from %s)'%(tgt,srclbl)) + if not self.generator.bld.progress_bar: + Logs.info('- install %s (from %s)',tgt,lbl) return False - if not self.progress_bar: - Logs.info('+ install %s (from %s)'%(tgt,srclbl)) + if not self.generator.bld.progress_bar: + Logs.info('+ install %s (from %s)',tgt,lbl) try: os.chmod(tgt,Utils.O644|stat.S_IMODE(os.stat(tgt).st_mode)) except EnvironmentError: @@ -514,127 +589,67 @@ class InstallContext(BuildContext): except OSError: pass try: - self.copy_fun(src,tgt,**kw) - except IOError: - try: - os.stat(src) - except EnvironmentError: - Logs.error('File %r does not exist'%src) - raise Errors.WafError('Could not install the file %r'%tgt) - def do_link(self,src,tgt,**kw): - d,_=os.path.split(tgt) - Utils.check_dir(d) - link=False + self.copy_fun(src,tgt) + except EnvironmentError as e: + if not os.path.exists(src): + Logs.error('File %r does not exist',src) + elif not os.path.isfile(src): + Logs.error('Input %r is 
not a file',src) + raise Errors.WafError('Could not install the file %r'%tgt,e) + def fix_perms(self,tgt): + if not Utils.is_win32: + user=getattr(self,'install_user',None)or getattr(self.generator,'install_user',None) + group=getattr(self,'install_group',None)or getattr(self.generator,'install_group',None) + if user or group: + Utils.lchown(tgt,user or-1,group or-1) if not os.path.islink(tgt): - link=True - elif os.readlink(tgt)!=src: - link=True - if link: - try:os.remove(tgt) - except OSError:pass - if not self.progress_bar: - Logs.info('+ symlink %s (to %s)'%(tgt,src)) - os.symlink(src,tgt) - else: - if not self.progress_bar: - Logs.info('- symlink %s (to %s)'%(tgt,src)) - def run_task_now(self,tsk,postpone): - tsk.post() - if not postpone: - if tsk.runnable_status()==Task.ASK_LATER: - raise self.WafError('cannot post the task %r'%tsk) - tsk.run() - tsk.hasrun=True - def install_files(self,dest,files,env=None,chmod=Utils.O644,relative_trick=False,cwd=None,add=True,postpone=True,task=None): - assert(dest) - tsk=inst(env=env or self.env) - tsk.bld=self - tsk.path=cwd or self.path - tsk.chmod=chmod - tsk.task=task - if isinstance(files,waflib.Node.Node): - tsk.source=[files] + os.chmod(tgt,self.chmod) + def do_link(self,src,tgt,**kw): + if os.path.islink(tgt)and os.readlink(tgt)==src: + if not self.generator.bld.progress_bar: + Logs.info('- symlink %s (to %s)',tgt,src) else: - tsk.source=Utils.to_list(files) - tsk.dest=dest - tsk.exec_task=tsk.exec_install_files - tsk.relative_trick=relative_trick - if add:self.add_to_group(tsk) - self.run_task_now(tsk,postpone) - return tsk - def install_as(self,dest,srcfile,env=None,chmod=Utils.O644,cwd=None,add=True,postpone=True,task=None): - assert(dest) - tsk=inst(env=env or self.env) - tsk.bld=self - tsk.path=cwd or self.path - tsk.chmod=chmod - tsk.source=[srcfile] - tsk.task=task - tsk.dest=dest - tsk.exec_task=tsk.exec_install_as - if add:self.add_to_group(tsk) - self.run_task_now(tsk,postpone) - return tsk - def 
symlink_as(self,dest,src,env=None,cwd=None,add=True,postpone=True,relative_trick=False,task=None): - if Utils.is_win32: - return - assert(dest) - tsk=inst(env=env or self.env) - tsk.bld=self - tsk.dest=dest - tsk.path=cwd or self.path - tsk.source=[] - tsk.task=task - tsk.link=src - tsk.relative_trick=relative_trick - tsk.exec_task=tsk.exec_symlink_as - if add:self.add_to_group(tsk) - self.run_task_now(tsk,postpone) - return tsk -class UninstallContext(InstallContext): - '''removes the targets installed''' - cmd='uninstall' - def __init__(self,**kw): - super(UninstallContext,self).__init__(**kw) - self.is_install=UNINSTALL - def rm_empty_dirs(self,tgt): - while tgt: - tgt=os.path.dirname(tgt) try: - os.rmdir(tgt) + os.remove(tgt) except OSError: - break - def do_install(self,src,tgt,**kw): - if not self.progress_bar: - Logs.info('- remove %s'%tgt) - self.uninstall.append(tgt) + pass + if not self.generator.bld.progress_bar: + Logs.info('+ symlink %s (to %s)',tgt,src) + os.symlink(src,tgt) + self.fix_perms(tgt) + def do_uninstall(self,src,tgt,lbl,**kw): + if not self.generator.bld.progress_bar: + Logs.info('- remove %s',tgt) try: os.remove(tgt) - except OSError ,e: + except OSError as e: if e.errno!=errno.ENOENT: if not getattr(self,'uninstall_error',None): self.uninstall_error=True Logs.warn('build: some files could not be uninstalled (retry with -vv to list them)') if Logs.verbose>1: - Logs.warn('Could not remove %s (error code %r)'%(e.filename,e.errno)) + Logs.warn('Could not remove %s (error code %r)',e.filename,e.errno) self.rm_empty_dirs(tgt) - def do_link(self,src,tgt,**kw): + def do_unlink(self,src,tgt,**kw): try: - if not self.progress_bar: - Logs.info('- remove %s'%tgt) + if not self.generator.bld.progress_bar: + Logs.info('- remove %s',tgt) os.remove(tgt) except OSError: pass self.rm_empty_dirs(tgt) - def execute(self): - try: - def runnable_status(self): - return Task.SKIP_ME - setattr(Task.Task,'runnable_status_back',Task.Task.runnable_status) - 
setattr(Task.Task,'runnable_status',runnable_status) - super(UninstallContext,self).execute() - finally: - setattr(Task.Task,'runnable_status',Task.Task.runnable_status_back) +class InstallContext(BuildContext): + '''installs the targets on the system''' + cmd='install' + def __init__(self,**kw): + super(InstallContext,self).__init__(**kw) + self.is_install=INSTALL +class UninstallContext(InstallContext): + '''removes the targets installed''' + cmd='uninstall' + def __init__(self,**kw): + super(UninstallContext,self).__init__(**kw) + self.is_install=UNINSTALL class CleanContext(BuildContext): '''cleans the project''' cmd='clean' @@ -649,16 +664,22 @@ class CleanContext(BuildContext): self.store() def clean(self): Logs.debug('build: clean called') - if self.bldnode!=self.srcnode: + if hasattr(self,'clean_files'): + for n in self.clean_files: + n.delete() + elif self.bldnode!=self.srcnode: lst=[] - for e in self.all_envs.values(): - lst.extend(self.root.find_or_declare(f)for f in e[CFG_FILES]) - for n in self.bldnode.ant_glob('**/*',excl='.lock* *conf_check_*/** config.log c4che/*',quiet=True): + for env in self.all_envs.values(): + lst.extend(self.root.find_or_declare(f)for f in env[CFG_FILES]) + excluded_dirs='.lock* *conf_check_*/** config.log %s/*'%CACHE_DIR + for n in self.bldnode.ant_glob('**/*',excl=excluded_dirs,quiet=True): if n in lst: continue n.delete() self.root.children={} - for v in'node_deps task_sigs raw_deps'.split(): + for v in SAVED_ATTRS: + if v=='root': + continue setattr(self,v,{}) class ListContext(BuildContext): '''lists the targets to execute''' @@ -680,12 +701,17 @@ class ListContext(BuildContext): f() try: self.get_tgen_by_name('') - except Exception: + except Errors.WafError: pass - lst=list(self.task_gen_cache_names.keys()) - lst.sort() - for k in lst: - Logs.pprint('GREEN',k) + targets=sorted(self.task_gen_cache_names) + line_just=max(len(t)for t in targets)if targets else 0 + for target in targets: + 
tgen=self.task_gen_cache_names[target] + descript=getattr(tgen,'description','') + if descript: + target=target.ljust(line_just) + descript=': %s'%descript + Logs.pprint('GREEN',target,label=descript) class StepContext(BuildContext): '''executes tasks in a step-by-step fashion, for debugging''' cmd='step' @@ -697,7 +723,7 @@ class StepContext(BuildContext): Logs.warn('Add a pattern for the debug build, for example "waf step --files=main.c,app"') BuildContext.compile(self) return - targets=None + targets=[] if self.targets and self.targets!='*': targets=self.targets.split(',') for g in self.groups: @@ -713,23 +739,23 @@ class StepContext(BuildContext): for pat in self.files.split(','): matcher=self.get_matcher(pat) for tg in g: - if isinstance(tg,Task.TaskBase): + if isinstance(tg,Task.Task): lst=[tg] else: lst=tg.tasks for tsk in lst: do_exec=False - for node in getattr(tsk,'inputs',[]): + for node in tsk.inputs: if matcher(node,output=False): do_exec=True break - for node in getattr(tsk,'outputs',[]): + for node in tsk.outputs: if matcher(node,output=True): do_exec=True break if do_exec: ret=tsk.run() - Logs.info('%s -> exit %r'%(str(tsk),ret)) + Logs.info('%s -> exit %r',tsk,ret) def get_matcher(self,pat): inn=True out=True @@ -748,12 +774,19 @@ class StepContext(BuildContext): pat='%s$'%pat pattern=re.compile(pat) def match(node,output): - if output==True and not out: + if output and not out: return False - if output==False and not inn: + if not output and not inn: return False if anode: return anode==node else: return pattern.match(node.abspath()) return match +class EnvContext(BuildContext): + fun=cmd=None + def execute(self): + self.restore() + if not self.all_envs: + self.load_envs() + self.recurse([self.run_dir]) diff --git a/waflib/ConfigSet.py b/waflib/ConfigSet.py index 19f8321..8212586 100644 --- a/waflib/ConfigSet.py +++ b/waflib/ConfigSet.py @@ -12,9 +12,12 @@ class ConfigSet(object): if filename: self.load(filename) def __contains__(self,key): - if 
key in self.table:return True - try:return self.parent.__contains__(key) - except AttributeError:return False + if key in self.table: + return True + try: + return self.parent.__contains__(key) + except AttributeError: + return False def keys(self): keys=set() cur=self @@ -24,12 +27,14 @@ class ConfigSet(object): keys=list(keys) keys.sort() return keys + def __iter__(self): + return iter(self.keys()) def __str__(self): return"\n".join(["%r %r"%(x,self.__getitem__(x))for x in self.keys()]) def __getitem__(self,key): try: while 1: - x=self.table.get(key,None) + x=self.table.get(key) if not x is None: return x self=self.parent @@ -41,7 +46,7 @@ class ConfigSet(object): self[key]=[] def __getattr__(self,name): if name in self.__slots__: - return object.__getattr__(self,name) + return object.__getattribute__(self,name) else: return self[name] def __setattr__(self,name,value): @@ -72,22 +77,26 @@ class ConfigSet(object): return self def get_flat(self,key): s=self[key] - if isinstance(s,str):return s + if isinstance(s,str): + return s return' '.join(s) def _get_list_value_for_modification(self,key): try: value=self.table[key] except KeyError: - try:value=self.parent[key] - except AttributeError:value=[] - if isinstance(value,list): - value=value[:] + try: + value=self.parent[key] + except AttributeError: + value=[] else: - value=[value] + if isinstance(value,list): + value=value[:] + else: + value=[value] + self.table[key]=value else: if not isinstance(value,list): - value=[value] - self.table[key]=value + self.table[key]=value=[value] return value def append_value(self,var,val): if isinstance(val,str): @@ -110,8 +119,10 @@ class ConfigSet(object): env=self while 1: table_list.insert(0,env.table) - try:env=env.parent - except AttributeError:break + try: + env=env.parent + except AttributeError: + break merged_table={} for table in table_list: merged_table.update(table) @@ -139,10 +150,9 @@ class ConfigSet(object): for m in re_imp.finditer(code): g=m.group 
tbl[g(2)]=eval(g(3)) - Logs.debug('env: %s'%str(self.table)) + Logs.debug('env: %s',self.table) def update(self,d): - for k,v in d.items(): - self[k]=v + self.table.update(d) def stash(self): orig=self.table tbl=self.table=self.table.copy() diff --git a/waflib/Configure.py b/waflib/Configure.py index 97ce134..a5da91b 100644 --- a/waflib/Configure.py +++ b/waflib/Configure.py @@ -2,10 +2,8 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file -import os,shlex,sys,time,re,shutil +import os,re,shlex,shutil,sys,time,traceback from waflib import ConfigSet,Utils,Options,Logs,Context,Build,Errors -BREAK='break' -CONTINUE='continue' WAF_CONFIG_LOG='config.log' autoconfig=False conf_template='''# project %(app)s configured on %(now)s by @@ -63,7 +61,7 @@ class ConfigurationContext(Context.Context): self.bldnode=(os.path.isabs(out)and self.root or self.path).make_node(out) self.bldnode.mkdir() if not os.path.isdir(self.bldnode.abspath()): - conf.fatal('Could not create the build directory %s'%self.bldnode.abspath()) + self.fatal('Could not create the build directory %s'%self.bldnode.abspath()) def execute(self): self.init_dirs() self.cachedir=self.bldnode.make_node(Build.CACHE_DIR) @@ -80,7 +78,7 @@ class ConfigurationContext(Context.Context): self.msg('Setting top to',self.srcnode.abspath()) self.msg('Setting out to',self.bldnode.abspath()) if id(self.srcnode)==id(self.bldnode): - Logs.warn('Setting top == out (remember to use "update_outputs")') + Logs.warn('Setting top == out') elif id(self.path)!=id(self.srcnode): if self.srcnode.is_child_of(self.path): Logs.warn('Are you certain that you do not want to set top="." 
?') @@ -89,34 +87,36 @@ class ConfigurationContext(Context.Context): Context.top_dir=self.srcnode.abspath() Context.out_dir=self.bldnode.abspath() env=ConfigSet.ConfigSet() - env['argv']=sys.argv - env['options']=Options.options.__dict__ + env.argv=sys.argv + env.options=Options.options.__dict__ + env.config_cmd=self.cmd env.run_dir=Context.run_dir env.top_dir=Context.top_dir env.out_dir=Context.out_dir - env['hash']=self.hash - env['files']=self.files - env['environ']=dict(self.environ) - if not self.env.NO_LOCK_IN_RUN and not getattr(Options.options,'no_lock_in_run'): + env.hash=self.hash + env.files=self.files + env.environ=dict(self.environ) + env.launch_dir=Context.launch_dir + if not(self.env.NO_LOCK_IN_RUN or env.environ.get('NO_LOCK_IN_RUN')or getattr(Options.options,'no_lock_in_run')): env.store(os.path.join(Context.run_dir,Options.lockfile)) - if not self.env.NO_LOCK_IN_TOP and not getattr(Options.options,'no_lock_in_top'): + if not(self.env.NO_LOCK_IN_TOP or env.environ.get('NO_LOCK_IN_TOP')or getattr(Options.options,'no_lock_in_top')): env.store(os.path.join(Context.top_dir,Options.lockfile)) - if not self.env.NO_LOCK_IN_OUT and not getattr(Options.options,'no_lock_in_out'): + if not(self.env.NO_LOCK_IN_OUT or env.environ.get('NO_LOCK_IN_OUT')or getattr(Options.options,'no_lock_in_out')): env.store(os.path.join(Context.out_dir,Options.lockfile)) def prepare_env(self,env): if not env.PREFIX: if Options.options.prefix or Utils.is_win32: - env.PREFIX=Utils.sane_path(Options.options.prefix) + env.PREFIX=Options.options.prefix else: - env.PREFIX='' + env.PREFIX='/' if not env.BINDIR: if Options.options.bindir: - env.BINDIR=Utils.sane_path(Options.options.bindir) + env.BINDIR=Options.options.bindir else: env.BINDIR=Utils.subst_vars('${PREFIX}/bin',env) if not env.LIBDIR: if Options.options.libdir: - env.LIBDIR=Utils.sane_path(Options.options.libdir) + env.LIBDIR=Options.options.libdir else: env.LIBDIR=Utils.subst_vars('${PREFIX}/lib%s'%Utils.lib64(),env) def 
store(self): @@ -127,31 +127,35 @@ class ConfigurationContext(Context.Context): for key in self.all_envs: tmpenv=self.all_envs[key] tmpenv.store(os.path.join(self.cachedir.abspath(),key+Build.CACHE_SUFFIX)) - def load(self,input,tooldir=None,funs=None,with_sys_path=True): - tools=Utils.to_list(input) - if tooldir:tooldir=Utils.to_list(tooldir) + def load(self,tool_list,tooldir=None,funs=None,with_sys_path=True,cache=False): + tools=Utils.to_list(tool_list) + if tooldir: + tooldir=Utils.to_list(tooldir) for tool in tools: - mag=(tool,id(self.env),tooldir,funs) - if mag in self.tool_cache: - self.to_log('(tool %s is already loaded, skipping)'%tool) - continue - self.tool_cache.append(mag) + if cache: + mag=(tool,id(self.env),tooldir,funs) + if mag in self.tool_cache: + self.to_log('(tool %s is already loaded, skipping)'%tool) + continue + self.tool_cache.append(mag) module=None try: module=Context.load_tool(tool,tooldir,ctx=self,with_sys_path=with_sys_path) - except ImportError ,e: - self.fatal('Could not load the Waf tool %r from %r\n%s'%(tool,sys.path,e)) - except Exception ,e: + except ImportError as e: + self.fatal('Could not load the Waf tool %r from %r\n%s'%(tool,getattr(e,'waf_sys_path',sys.path),e)) + except Exception as e: self.to_log('imp %r (%r & %r)'%(tool,tooldir,funs)) - self.to_log(Utils.ex_stack()) + self.to_log(traceback.format_exc()) raise if funs is not None: self.eval_rules(funs) else: func=getattr(module,'configure',None) if func: - if type(func)is type(Utils.readf):func(self) - else:self.eval_rules(func) + if type(func)is type(Utils.readf): + func(self) + else: + self.eval_rules(func) self.tools.append({'tool':tool,'tooldir':tooldir,'funs':funs}) def post_recurse(self,node): super(ConfigurationContext,self).post_recurse(node) @@ -161,25 +165,12 @@ class ConfigurationContext(Context.Context): self.rules=Utils.to_list(rules) for x in self.rules: f=getattr(self,x) - if not f:self.fatal("No such method '%s'."%x) - try: - f() - except Exception ,e: - 
ret=self.err_handler(x,e) - if ret==BREAK: - break - elif ret==CONTINUE: - continue - else: - raise - def err_handler(self,fun,error): - pass + if not f: + self.fatal('No such configuration function %r'%x) + f() def conf(f): def fun(*k,**kw): - mandatory=True - if'mandatory'in kw: - mandatory=kw['mandatory'] - del kw['mandatory'] + mandatory=kw.pop('mandatory',True) try: return f(*k,**kw) except Errors.ConfigurationError: @@ -190,7 +181,7 @@ def conf(f): setattr(Build.BuildContext,f.__name__,fun) return f @conf -def add_os_flags(self,var,dest=None,dup=True): +def add_os_flags(self,var,dest=None,dup=False): try: flags=shlex.split(self.environ[var]) except KeyError: @@ -199,16 +190,19 @@ def add_os_flags(self,var,dest=None,dup=True): self.env.append_value(dest or var,flags) @conf def cmd_to_list(self,cmd): - if isinstance(cmd,str)and cmd.find(' '): - try: - os.stat(cmd) - except OSError: + if isinstance(cmd,str): + if os.path.isfile(cmd): + return[cmd] + if os.sep=='/': return shlex.split(cmd) else: - return[cmd] + try: + return shlex.split(cmd,posix=False) + except TypeError: + return shlex.split(cmd) return cmd @conf -def check_waf_version(self,mini='1.7.99',maxi='1.9.0',**kw): +def check_waf_version(self,mini='1.9.99',maxi='2.1.0',**kw): self.start_msg('Checking for waf version in %s-%s'%(str(mini),str(maxi)),**kw) ver=Context.HEXVERSION if Utils.num2ver(mini)>ver: @@ -239,15 +233,12 @@ def find_program(self,filename,**kw): path_list=Utils.to_list(path_list) else: path_list=environ.get('PATH','').split(os.pathsep) - if var in environ: - filename=environ[var] - if os.path.isfile(filename): - ret=[filename] - else: - ret=self.cmd_to_list(filename) + if kw.get('value'): + ret=self.cmd_to_list(kw['value']) + elif environ.get(var): + ret=self.cmd_to_list(environ[var]) elif self.env[var]: - ret=self.env[var] - ret=self.cmd_to_list(ret) + ret=self.cmd_to_list(self.env[var]) else: if not ret: ret=self.find_binary(filename,exts.split(','),path_list) @@ -263,12 +254,12 @@ 
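The rewritten `conf()` decorator collapses the old four-line `mandatory` handling into a single `kw.pop('mandatory', True)`. A self-contained sketch of the pattern — `ConfigurationError` here stands in for `waflib.Errors.ConfigurationError`, and `find_thing` is a made-up example function:

```python
import functools

class ConfigurationError(Exception):
    pass

def conf(f):
    @functools.wraps(f)
    def fun(*k, **kw):
        # 'mandatory' is consumed here, so wrapped functions never see it
        mandatory = kw.pop('mandatory', True)
        try:
            return f(*k, **kw)
        except ConfigurationError:
            if mandatory:
                raise
    return fun

@conf
def find_thing(name):
    raise ConfigurationError('not found: %s' % name)
```

Calling `find_thing('x', mandatory=False)` swallows the error and returns `None`; without the keyword, the error propagates.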
def find_program(self,filename,**kw): retmsg=ret else: retmsg=False - self.msg("Checking for program '%s'"%msg,retmsg,**kw) - if not kw.get('quiet',None): + self.msg('Checking for program %r'%msg,retmsg,**kw) + if not kw.get('quiet'): self.to_log('find program=%r paths=%r var=%r -> %r'%(filename,path_list,var,ret)) if not ret: self.fatal(kw.get('errmsg','')or'Could not find the program %r'%filename) - interpreter=kw.get('interpreter',None) + interpreter=kw.get('interpreter') if interpreter is None: if not Utils.check_exe(ret[0],env=environ): self.fatal('Program %r is not executable'%ret) @@ -307,9 +298,7 @@ def run_build(self,*k,**kw): if cachemode==1: try: proj=ConfigSet.ConfigSet(os.path.join(dir,'cache_run_build')) - except OSError: - pass - except IOError: + except EnvironmentError: pass else: ret=proj['cache_run_build'] @@ -319,7 +308,8 @@ def run_build(self,*k,**kw): bdir=os.path.join(dir,'testbuild') if not os.path.exists(bdir): os.makedirs(bdir) - self.test_bld=bld=Build.BuildContext(top_dir=dir,out_dir=bdir) + cls_name=kw.get('run_build_cls')or getattr(self,'run_build_cls','build') + self.test_bld=bld=Context.create_context(cls_name,top_dir=dir,out_dir=bdir) bld.init_dirs() bld.progress_bar=0 bld.targets='*' @@ -334,7 +324,7 @@ def run_build(self,*k,**kw): try: bld.compile() except Errors.WafError: - ret='Test does not build: %s'%Utils.ex_stack() + ret='Test does not build: %s'%traceback.format_exc() self.fatal(ret) else: ret=getattr(bld,'retval',0) @@ -355,7 +345,7 @@ def ret_msg(self,msg,args): def test(self,*k,**kw): if not'env'in kw: kw['env']=self.env.derive() - if kw.get('validate',None): + if kw.get('validate'): kw['validate'](kw) self.start_msg(kw['msg'],**kw) ret=None @@ -369,7 +359,7 @@ def test(self,*k,**kw): self.fatal('The configuration failed') else: kw['success']=ret - if kw.get('post_check',None): + if kw.get('post_check'): ret=kw['post_check'](kw) if ret: self.end_msg(kw['errmsg'],'YELLOW',**kw) diff --git a/waflib/Context.py 
b/waflib/Context.py index 55422f3..ab6b154 100644 --- a/waflib/Context.py +++ b/waflib/Context.py @@ -5,10 +5,10 @@ import os,re,imp,sys from waflib import Utils,Errors,Logs import waflib.Node -HEXVERSION=0x1081600 -WAFVERSION="1.8.22" -WAFREVISION="17d4d4faa52c454eb3580e482df69b2a80e19fa7" -ABI=98 +HEXVERSION=0x2000e00 +WAFVERSION="2.0.14" +WAFREVISION="907519cab9c1c8c7e4f7d4e468ed6200b9250d58" +ABI=20 DBFILE='.wafpickle-%s-%d-%d'%(sys.platform,sys.hexversion,ABI) APPNAME='APPNAME' VERSION='VERSION' @@ -20,16 +20,13 @@ run_dir='' top_dir='' out_dir='' waf_dir='' -local_repo='' -remote_repo='https://raw.githubusercontent.com/waf-project/waf/master/' -remote_locs=['waflib/extras','waflib/Tools'] +default_encoding=Utils.console_encoding() g_module=None STDOUT=1 STDERR=-1 BOTH=0 classes=[] def create_context(cmd_name,*k,**kw): - global classes for x in classes: if x.cmd==cmd_name: return x(*k,**kw) @@ -37,10 +34,10 @@ def create_context(cmd_name,*k,**kw): ctx.fun=cmd_name return ctx class store_context(type): - def __init__(cls,name,bases,dict): - super(store_context,cls).__init__(name,bases,dict) + def __init__(cls,name,bases,dct): + super(store_context,cls).__init__(name,bases,dct) name=cls.__name__ - if name=='ctx'or name=='Context': + if name in('ctx','Context'): return try: cls.cmd @@ -48,7 +45,6 @@ class store_context(type): raise Errors.WafError('Missing command for the context class %r (cmd)'%name) if not getattr(cls,'fun',None): cls.fun=cls.cmd - global classes classes.insert(0,cls) ctx=store_context('ctx',(object,),{}) class Context(ctx): @@ -58,10 +54,9 @@ class Context(ctx): try: rd=kw['run_dir'] except KeyError: - global run_dir rd=run_dir - self.node_class=type("Nod3",(waflib.Node.Node,),{}) - self.node_class.__module__="waflib.Node" + self.node_class=type('Nod3',(waflib.Node.Node,),{}) + self.node_class.__module__='waflib.Node' self.node_class.ctx=self self.root=self.node_class('',None) self.cur_script=None @@ -69,8 +64,6 @@ class Context(ctx): 
self.stack_path=[] self.exec_dict={'ctx':self,'conf':self,'bld':self,'opt':self} self.logger=None - def __hash__(self): - return id(self) def finalize(self): try: logger=self.logger @@ -89,7 +82,6 @@ class Context(ctx): if fun: fun(self) def execute(self): - global g_module self.recurse([os.path.dirname(g_module.root_path)]) def pre_recurse(self,node): self.stack_path.append(self.cur_script) @@ -130,7 +122,7 @@ class Context(ctx): if not user_function: if not mandatory: continue - raise Errors.WafError('No function %s defined in %s'%(name or self.fun,node.abspath())) + raise Errors.WafError('No function %r defined in %s'%(name or self.fun,node.abspath())) user_function(self) finally: self.post_recurse(node) @@ -142,11 +134,18 @@ class Context(ctx): except OSError: raise Errors.WafError('Cannot read the folder %r'%d) raise Errors.WafError('No wscript file in directory %s'%d) + def log_command(self,cmd,kw): + if Logs.verbose: + fmt=os.environ.get('WAF_CMD_FORMAT') + if fmt=='string': + if not isinstance(cmd,str): + cmd=Utils.shell_escape(cmd) + Logs.debug('runner: %r',cmd) + Logs.debug('runner_env: kw=%s',kw) def exec_command(self,cmd,**kw): subprocess=Utils.subprocess kw['shell']=isinstance(cmd,str) - Logs.debug('runner: %r'%(cmd,)) - Logs.debug('runner_env: kw=%s'%kw) + self.log_command(cmd,kw) if self.logger: self.logger.info(cmd) if'stdout'not in kw: @@ -154,37 +153,37 @@ class Context(ctx): if'stderr'not in kw: kw['stderr']=subprocess.PIPE if Logs.verbose and not kw['shell']and not Utils.check_exe(cmd[0]): - raise Errors.WafError("Program %s not found!"%cmd[0]) - wargs={} + raise Errors.WafError('Program %s not found!'%cmd[0]) + cargs={} if'timeout'in kw: - if kw['timeout']is not None: - wargs['timeout']=kw['timeout'] + if sys.hexversion>=0x3030000: + cargs['timeout']=kw['timeout'] + if not'start_new_session'in kw: + kw['start_new_session']=True del kw['timeout'] if'input'in kw: if kw['input']: - wargs['input']=kw['input'] + cargs['input']=kw['input'] 
kw['stdin']=subprocess.PIPE del kw['input'] + if'cwd'in kw: + if not isinstance(kw['cwd'],str): + kw['cwd']=kw['cwd'].abspath() + encoding=kw.pop('decode_as',default_encoding) try: - if kw['stdout']or kw['stderr']: - p=subprocess.Popen(cmd,**kw) - (out,err)=p.communicate(**wargs) - ret=p.returncode - else: - out,err=(None,None) - ret=subprocess.Popen(cmd,**kw).wait(**wargs) - except Exception ,e: + ret,out,err=Utils.run_process(cmd,kw,cargs) + except Exception as e: raise Errors.WafError('Execution failure: %s'%str(e),ex=e) if out: if not isinstance(out,str): - out=out.decode(sys.stdout.encoding or'iso8859-1') + out=out.decode(encoding,errors='replace') if self.logger: - self.logger.debug('out: %s'%out) + self.logger.debug('out: %s',out) else: Logs.info(out,extra={'stream':sys.stdout,'c1':''}) if err: if not isinstance(err,str): - err=err.decode(sys.stdout.encoding or'iso8859-1') + err=err.decode(encoding,errors='replace') if self.logger: self.logger.error('err: %s'%err) else: @@ -193,48 +192,45 @@ class Context(ctx): def cmd_and_log(self,cmd,**kw): subprocess=Utils.subprocess kw['shell']=isinstance(cmd,str) - Logs.debug('runner: %r'%(cmd,)) - if'quiet'in kw: - quiet=kw['quiet'] - del kw['quiet'] - else: - quiet=None - if'output'in kw: - to_ret=kw['output'] - del kw['output'] - else: - to_ret=STDOUT + self.log_command(cmd,kw) + quiet=kw.pop('quiet',None) + to_ret=kw.pop('output',STDOUT) if Logs.verbose and not kw['shell']and not Utils.check_exe(cmd[0]): - raise Errors.WafError("Program %s not found!"%cmd[0]) + raise Errors.WafError('Program %r not found!'%cmd[0]) kw['stdout']=kw['stderr']=subprocess.PIPE if quiet is None: self.to_log(cmd) - wargs={} + cargs={} if'timeout'in kw: - if kw['timeout']is not None: - wargs['timeout']=kw['timeout'] + if sys.hexversion>=0x3030000: + cargs['timeout']=kw['timeout'] + if not'start_new_session'in kw: + kw['start_new_session']=True del kw['timeout'] if'input'in kw: if kw['input']: - wargs['input']=kw['input'] + 
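Both `exec_command()` and `cmd_and_log()` above now forward `timeout` to the subprocess layer only on Python >= 3.3 and set `start_new_session=True`, so a timed-out command can be signalled together with its process group. A minimal stand-alone equivalent built on `subprocess.run` (a hypothetical helper, not waf's `Utils.run_process`; `start_new_session` is POSIX-only):

```python
import subprocess
import sys

def run_with_timeout(cmd, timeout=None):
    kwargs = {}
    if timeout is not None:
        kwargs['timeout'] = timeout
        # run the child in its own session so the whole group can be killed
        kwargs['start_new_session'] = True
    proc = subprocess.run(cmd, stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE, **kwargs)
    return proc.returncode, proc.stdout

rc, out = run_with_timeout([sys.executable, '-c', 'print("ok")'], timeout=30)
```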
cargs['input']=kw['input'] kw['stdin']=subprocess.PIPE del kw['input'] + if'cwd'in kw: + if not isinstance(kw['cwd'],str): + kw['cwd']=kw['cwd'].abspath() + encoding=kw.pop('decode_as',default_encoding) try: - p=subprocess.Popen(cmd,**kw) - (out,err)=p.communicate(**wargs) - except Exception ,e: + ret,out,err=Utils.run_process(cmd,kw,cargs) + except Exception as e: raise Errors.WafError('Execution failure: %s'%str(e),ex=e) if not isinstance(out,str): - out=out.decode(sys.stdout.encoding or'iso8859-1') + out=out.decode(encoding,errors='replace') if not isinstance(err,str): - err=err.decode(sys.stdout.encoding or'iso8859-1') + err=err.decode(encoding,errors='replace') if out and quiet!=STDOUT and quiet!=BOTH: self.to_log('out: %s'%out) if err and quiet!=STDERR and quiet!=BOTH: self.to_log('err: %s'%err) - if p.returncode: - e=Errors.WafError('Command %r returned %r'%(cmd,p.returncode)) - e.returncode=p.returncode + if ret: + e=Errors.WafError('Command %r returned %r'%(cmd,ret)) + e.returncode=ret e.stderr=err e.stdout=out raise e @@ -247,9 +243,14 @@ class Context(ctx): if self.logger: self.logger.info('from %s: %s'%(self.path.abspath(),msg)) try: - msg='%s\n(complete log in %s)'%(msg,self.logger.handlers[0].baseFilename) - except Exception: + logfile=self.logger.handlers[0].baseFilename + except AttributeError: pass + else: + if os.environ.get('WAF_PRINT_FAILURE_LOG'): + msg='Log from (%s):\n%s\n'%(logfile,Utils.readf(logfile)) + else: + msg='%s\n(complete log in %s)'%(msg,logfile) raise self.errors.ConfigurationError(msg,ex=ex) def to_log(self,msg): if not msg: @@ -269,14 +270,14 @@ class Context(ctx): result=kw['result'] except KeyError: result=k[1] - color=kw.get('color',None) + color=kw.get('color') if not isinstance(color,str): color=result and'GREEN'or'YELLOW' self.end_msg(result,color,**kw) def start_msg(self,*k,**kw): - if kw.get('quiet',None): + if kw.get('quiet'): return - msg=kw.get('msg',None)or k[0] + msg=kw.get('msg')or k[0] try: if self.in_msg: 
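The decoding hunks above replace `sys.stdout.encoding or 'iso8859-1'` with a configurable encoding (`decode_as`, defaulting to the console encoding) plus `errors='replace'`, so undecodable command output degrades to replacement characters instead of raising `UnicodeDecodeError`. The behavior in isolation:

```python
raw = b'ok \xff done'                        # \xff is not valid UTF-8
text = raw.decode('utf-8', errors='replace')
# the invalid byte becomes U+FFFD rather than raising UnicodeDecodeError
```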
self.in_msg+=1 @@ -292,16 +293,16 @@ class Context(ctx): self.to_log(x) Logs.pprint('NORMAL',"%s :"%msg.ljust(self.line_just),sep='') def end_msg(self,*k,**kw): - if kw.get('quiet',None): + if kw.get('quiet'): return self.in_msg-=1 if self.in_msg: return - result=kw.get('result',None)or k[0] + result=kw.get('result')or k[0] defcolor='GREEN' - if result==True: + if result is True: msg='ok' - elif result==False: + elif not result: msg='not found' defcolor='YELLOW' else: @@ -316,7 +317,6 @@ class Context(ctx): color=defcolor Logs.pprint(color,msg) def load_special_tools(self,var,ban=[]): - global waf_dir if os.path.isdir(waf_dir): lst=self.root.find_node(waf_dir).find_node('waflib/extras').ant_glob(var) for x in lst: @@ -327,12 +327,12 @@ class Context(ctx): waflibs=PyZipFile(waf_dir) lst=waflibs.namelist() for x in lst: - if not re.match("waflib/extras/%s"%var.replace("*",".*"),var): + if not re.match('waflib/extras/%s'%var.replace('*','.*'),var): continue f=os.path.basename(x) doban=False for b in ban: - r=b.replace("*",".*") + r=b.replace('*','.*') if re.match(r,f): doban=True if not doban: @@ -351,8 +351,10 @@ def load_module(path,encoding=None): raise Errors.WafError('Could not read the file %r'%path) module_dir=os.path.dirname(path) sys.path.insert(0,module_dir) - try:exec(compile(code,path,'exec'),module.__dict__) - finally:sys.path.remove(module_dir) + try: + exec(compile(code,path,'exec'),module.__dict__) + finally: + sys.path.remove(module_dir) cache_modules[path]=module return module def load_tool(tool,tooldir=None,ctx=None,with_sys_path=True): @@ -360,14 +362,18 @@ def load_tool(tool,tooldir=None,ctx=None,with_sys_path=True): tool='javaw' else: tool=tool.replace('++','xx') - origSysPath=sys.path - if not with_sys_path:sys.path=[] + if not with_sys_path: + back_path=sys.path + sys.path=[] try: if tooldir: assert isinstance(tooldir,list) sys.path=tooldir+sys.path try: __import__(tool) + except ImportError as e: + e.waf_sys_path=list(sys.path) + raise 
finally: for d in tooldir: sys.path.remove(d) @@ -375,7 +381,8 @@ def load_tool(tool,tooldir=None,ctx=None,with_sys_path=True): Context.tools[tool]=ret return ret else: - if not with_sys_path:sys.path.insert(0,waf_dir) + if not with_sys_path: + sys.path.insert(0,waf_dir) try: for x in('waflib.Tools.%s','waflib.extras.%s','waflib.%s','%s'): try: @@ -383,12 +390,17 @@ def load_tool(tool,tooldir=None,ctx=None,with_sys_path=True): break except ImportError: x=None - if x is None: + else: __import__(tool) + except ImportError as e: + e.waf_sys_path=list(sys.path) + raise finally: - if not with_sys_path:sys.path.remove(waf_dir) + if not with_sys_path: + sys.path.remove(waf_dir) ret=sys.modules[x%tool] Context.tools[tool]=ret return ret finally: - if not with_sys_path:sys.path+=origSysPath + if not with_sys_path: + sys.path+=back_path diff --git a/waflib/Errors.py b/waflib/Errors.py index 3d98c8d..3ef76fc 100644 --- a/waflib/Errors.py +++ b/waflib/Errors.py @@ -5,6 +5,7 @@ import traceback,sys class WafError(Exception): def __init__(self,msg='',ex=None): + Exception.__init__(self) self.msg=msg assert not isinstance(msg,Exception) self.stack=[] @@ -27,7 +28,8 @@ class BuildError(WafError): lst=['Build failed'] for tsk in self.tasks: txt=tsk.format_error() - if txt:lst.append(txt) + if txt: + lst.append(txt) return'\n'.join(lst) class ConfigurationError(WafError): pass diff --git a/waflib/Logs.py b/waflib/Logs.py index 06ac7dd..4a1f7f8 100644 --- a/waflib/Logs.py +++ b/waflib/Logs.py @@ -12,7 +12,7 @@ if not os.environ.get('NOSYNC',False): import logging LOG_FORMAT=os.environ.get('WAF_LOG_FORMAT','%(asctime)s %(c1)s%(zone)s%(c2)s %(message)s') HOUR_FORMAT=os.environ.get('WAF_HOUR_FORMAT','%H:%M:%S') -zones='' +zones=[] verbose=0 colors_lst={'USE':True,'BOLD':'\x1b[01;1m','RED':'\x1b[01;31m','GREEN':'\x1b[32m','YELLOW':'\x1b[33m','PINK':'\x1b[35m','BLUE':'\x1b[01;34m','CYAN':'\x1b[36m','GREY':'\x1b[37m','NORMAL':'\x1b[0m','cursor_on':'\x1b[?25h','cursor_off':'\x1b[?25l',} 
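`load_tool()` above prepends the `tooldir` entries to `sys.path` for the duration of the import and strips them again in a `finally` block, so a failed import cannot leak path entries; it also now tags any `ImportError` with the `sys.path` that was in effect (`waf_sys_path`) for better error reporting. The push/pop discipline on its own, as a hypothetical helper:

```python
import sys

def import_from(tooldir, name):
    # prepend the tool directories for the duration of the import only
    sys.path = list(tooldir) + sys.path
    try:
        return __import__(name)
    finally:
        for d in tooldir:
            sys.path.remove(d)
```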
indicator='\r\x1b[K%s%s%s' @@ -39,14 +39,15 @@ except AttributeError: def get_term_cols(): return 80 get_term_cols.__doc__=""" - Get the console width in characters. + Returns the console width in characters. :return: the number of characters per line :rtype: int """ def get_color(cl): - if not colors_lst['USE']:return'' - return colors_lst.get(cl,'') + if colors_lst['USE']: + return colors_lst.get(cl,'') + return'' class color_dict(object): def __getattr__(self,a): return get_color(a) @@ -55,8 +56,8 @@ class color_dict(object): colors=color_dict() re_log=re.compile(r'(\w+): (.*)',re.M) class log_filter(logging.Filter): - def __init__(self,name=None): - pass + def __init__(self,name=''): + logging.Filter.__init__(self,name) def filter(self,rec): rec.zone=rec.module if rec.levelno>=logging.INFO: @@ -129,6 +130,8 @@ class formatter(logging.Formatter): else: msg=re.sub(r'\r(?!\n)|\x1B\[(K|.*?(m|h|l))','',msg) if rec.levelno>=logging.INFO: + if rec.args: + return msg%rec.args return msg rec.msg=msg rec.c1=colors.PINK @@ -139,10 +142,8 @@ def debug(*k,**kw): if verbose: k=list(k) k[0]=k[0].replace('\n',' ') - global log log.debug(*k,**kw) def error(*k,**kw): - global log log.error(*k,**kw) if verbose>2: st=traceback.extract_stack() @@ -150,15 +151,14 @@ def error(*k,**kw): st=st[:-1] buf=[] for filename,lineno,name,line in st: - buf.append(' File "%s", line %d, in %s'%(filename,lineno,name)) + buf.append(' File %r, line %d, in %s'%(filename,lineno,name)) if line: buf.append(' %s'%line.strip()) - if buf:log.error("\n".join(buf)) + if buf: + log.error('\n'.join(buf)) def warn(*k,**kw): - global log log.warn(*k,**kw) def info(*k,**kw): - global log log.info(*k,**kw) def init_log(): global log @@ -172,7 +172,11 @@ def init_log(): log.setLevel(logging.DEBUG) def make_logger(path,name): logger=logging.getLogger(name) - hdlr=logging.FileHandler(path,'w') + if sys.hexversion>0x3000000: + encoding=sys.stdout.encoding + else: + encoding=None + 
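Several hunks above (e.g. `Logs.debug('runner: %r', cmd)` in `Context.py`, the `rec.args` branch in `formatter`, and the new `pprint`) switch from eager `%`-formatting to passing arguments through to the logging call, so the message string is only built when the record is actually emitted. A generic stdlib illustration of the difference, visible with an expensive `__repr__`:

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger('demo')

class Expensive(object):
    called = False
    def __repr__(self):
        Expensive.called = True
        return 'Expensive()'

obj = Expensive()
# DEBUG is below the WARNING threshold: the record is filtered out
# before formatting, so __repr__ never runs
log.debug('value: %r', obj)
```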
hdlr=logging.FileHandler(path,'w',encoding=encoding) formatter=logging.Formatter('%(message)s') hdlr.setFormatter(formatter) logger.addHandler(hdlr) @@ -196,4 +200,4 @@ def free_logger(logger): except Exception: pass def pprint(col,msg,label='',sep='\n'): - info("%s%s%s %s"%(colors(col),msg,colors.NORMAL,label),extra={'terminator':sep}) + info('%s%s%s %s',colors(col),msg,colors.NORMAL,label,extra={'terminator':sep}) diff --git a/waflib/Node.py b/waflib/Node.py index 85e3910..dc979d6 100644 --- a/waflib/Node.py +++ b/waflib/Node.py @@ -10,6 +10,7 @@ exclude_regs=''' **/.#* **/%*% **/._* +**/*.swp **/CVS **/CVS/** **/.cvsignore @@ -35,13 +36,52 @@ exclude_regs=''' **/_darcs/** **/.intlcache **/.DS_Store''' -split_path=Utils.split_path -split_path_unix=Utils.split_path_unix -split_path_cygwin=Utils.split_path_cygwin -split_path_win32=Utils.split_path_win32 +def ant_matcher(s,ignorecase): + reflags=re.I if ignorecase else 0 + ret=[] + for x in Utils.to_list(s): + x=x.replace('\\','/').replace('//','/') + if x.endswith('/'): + x+='**' + accu=[] + for k in x.split('/'): + if k=='**': + accu.append(k) + else: + k=k.replace('.','[.]').replace('*','.*').replace('?','.').replace('+','\\+') + k='^%s$'%k + try: + exp=re.compile(k,flags=reflags) + except Exception as e: + raise Errors.WafError('Invalid pattern: %s'%k,e) + else: + accu.append(exp) + ret.append(accu) + return ret +def ant_sub_filter(name,nn): + ret=[] + for lst in nn: + if not lst: + pass + elif lst[0]=='**': + ret.append(lst) + if len(lst)>1: + if lst[1].match(name): + ret.append(lst[2:]) + else: + ret.append([]) + elif lst[0].match(name): + ret.append(lst[1:]) + return ret +def ant_sub_matcher(name,pats): + nacc=ant_sub_filter(name,pats[0]) + nrej=ant_sub_filter(name,pats[1]) + if[]in nrej: + nacc=[] + return[nacc,nrej] class Node(object): dict_class=dict - __slots__=('name','sig','children','parent','cache_abspath','cache_isdir','cache_sig') + 
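The new module-level `ant_matcher()` in `Node.py` compiles each path segment of an ant-style pattern into an anchored regex (`*` → `.*`, `?` → `.`, `.` escaped), keeping literal `**` segments for the recursive directory walk. The per-segment translation extracted into a small function:

```python
import re

def segment_to_regex(seg, ignorecase=False):
    # same substitutions as ant_matcher(), applied to one path segment
    flags = re.I if ignorecase else 0
    seg = seg.replace('.', '[.]').replace('*', '.*') \
             .replace('?', '.').replace('+', '\\+')
    return re.compile('^%s$' % seg, flags=flags)
```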
__slots__=('name','parent','children','cache_abspath','cache_isdir') def __init__(self,name,parent): self.name=name self.parent=parent @@ -54,23 +94,17 @@ class Node(object): self.parent=data[1] if data[2]is not None: self.children=self.dict_class(data[2]) - if data[3]is not None: - self.sig=data[3] def __getstate__(self): - return(self.name,self.parent,getattr(self,'children',None),getattr(self,'sig',None)) + return(self.name,self.parent,getattr(self,'children',None)) def __str__(self): - return self.name + return self.abspath() def __repr__(self): return self.abspath() - def __hash__(self): - return id(self) - def __eq__(self,node): - return id(self)==id(node) def __copy__(self): raise Errors.WafError('nodes are not supposed to be copied') - def read(self,flags='r',encoding='ISO8859-1'): + def read(self,flags='r',encoding='latin-1'): return Utils.readf(self.abspath(),flags,encoding) - def write(self,data,flags='w',encoding='ISO8859-1'): + def write(self,data,flags='w',encoding='latin-1'): Utils.writef(self.abspath(),data,flags,encoding) def read_json(self,convert=True,encoding='utf-8'): import json @@ -103,20 +137,25 @@ class Node(object): newline='' output=json.dumps(data,indent=indent,separators=separators,sort_keys=sort_keys)+newline self.write(output,encoding='utf-8') + def exists(self): + return os.path.exists(self.abspath()) + def isdir(self): + return os.path.isdir(self.abspath()) def chmod(self,val): os.chmod(self.abspath(),val) - def delete(self): + def delete(self,evict=True): try: try: - if hasattr(self,'children'): + if os.path.isdir(self.abspath()): shutil.rmtree(self.abspath()) else: os.remove(self.abspath()) - except OSError ,e: + except OSError: if os.path.exists(self.abspath()): - raise e + raise finally: - self.evict() + if evict: + self.evict() def evict(self): del self.parent.children[self.name] def suffix(self): @@ -134,7 +173,7 @@ class Node(object): lst.sort() return lst def mkdir(self): - if getattr(self,'cache_isdir',None): + if 
self.isdir(): return try: self.parent.mkdir() @@ -145,16 +184,19 @@ class Node(object): os.makedirs(self.abspath()) except OSError: pass - if not os.path.isdir(self.abspath()): - raise Errors.WafError('Could not create the directory %s'%self.abspath()) + if not self.isdir(): + raise Errors.WafError('Could not create the directory %r'%self) try: self.children except AttributeError: self.children=self.dict_class() - self.cache_isdir=True def find_node(self,lst): if isinstance(lst,str): - lst=[x for x in split_path(lst)if x and x!='.'] + lst=[x for x in Utils.split_path(lst)if x and x!='.'] + if lst and lst[0].startswith('\\\\')and not self.parent: + node=self.ctx.root.make_node(lst[0]) + node.cache_isdir=True + return node.find_node(lst[1:]) cur=self for x in lst: if x=='..': @@ -171,43 +213,34 @@ class Node(object): except KeyError: pass cur=self.__class__(x,cur) - try: - os.stat(cur.abspath()) - except OSError: + if not cur.exists(): cur.evict() return None - ret=cur - try: - os.stat(ret.abspath()) - except OSError: - ret.evict() + if not cur.exists(): + cur.evict() return None - try: - while not getattr(cur.parent,'cache_isdir',None): - cur=cur.parent - cur.cache_isdir=True - except AttributeError: - pass - return ret + return cur def make_node(self,lst): if isinstance(lst,str): - lst=[x for x in split_path(lst)if x and x!='.'] + lst=[x for x in Utils.split_path(lst)if x and x!='.'] cur=self for x in lst: if x=='..': cur=cur.parent or cur continue - if getattr(cur,'children',{}): - if x in cur.children: - cur=cur.children[x] - continue - else: + try: + cur=cur.children[x] + except AttributeError: cur.children=self.dict_class() + except KeyError: + pass + else: + continue cur=self.__class__(x,cur) return cur def search_node(self,lst): if isinstance(lst,str): - lst=[x for x in split_path(lst)if x and x!='.'] + lst=[x for x in Utils.split_path(lst)if x and x!='.'] cur=self for x in lst: if x=='..': @@ -233,19 +266,17 @@ class Node(object): up+=1 c2=c2.parent c2h-=1 - 
while id(c1)!=id(c2): + while not c1 is c2: lst.append(c1.name) up+=1 c1=c1.parent c2=c2.parent if c1.parent: - for i in range(up): - lst.append('..') + lst.extend(['..']*up) + lst.reverse() + return os.sep.join(lst)or'.' else: - if lst and not Utils.is_win32: - lst.append('') - lst.reverse() - return os.sep.join(lst)or'.' + return self.abspath() def abspath(self): try: return self.cache_abspath @@ -279,8 +310,8 @@ class Node(object): while diff>0: diff-=1 p=p.parent - return id(p)==id(node) - def ant_iter(self,accept=None,maxdepth=25,pats=[],dir=False,src=True,remove=True): + return p is node + def ant_iter(self,accept=None,maxdepth=25,pats=[],dir=False,src=True,remove=True,quiet=False): dircont=self.listdir() dircont.sort() try: @@ -296,114 +327,76 @@ class Node(object): if npats and npats[0]: accepted=[]in npats[0] node=self.make_node([name]) - isdir=os.path.isdir(node.abspath()) + isdir=node.isdir() if accepted: if isdir: if dir: yield node - else: - if src: - yield node - if getattr(node,'cache_isdir',None)or isdir: + elif src: + yield node + if isdir: node.cache_isdir=True if maxdepth: - for k in node.ant_iter(accept=accept,maxdepth=maxdepth-1,pats=npats,dir=dir,src=src,remove=remove): + for k in node.ant_iter(accept=accept,maxdepth=maxdepth-1,pats=npats,dir=dir,src=src,remove=remove,quiet=quiet): yield k - raise StopIteration def ant_glob(self,*k,**kw): src=kw.get('src',True) - dir=kw.get('dir',False) + dir=kw.get('dir') excl=kw.get('excl',exclude_regs) incl=k and k[0]or kw.get('incl','**') - reflags=kw.get('ignorecase',0)and re.I - def to_pat(s): - lst=Utils.to_list(s) - ret=[] - for x in lst: - x=x.replace('\\','/').replace('//','/') - if x.endswith('/'): - x+='**' - lst2=x.split('/') - accu=[] - for k in lst2: - if k=='**': - accu.append(k) - else: - k=k.replace('.','[.]').replace('*','.*').replace('?','.').replace('+','\\+') - k='^%s$'%k - try: - accu.append(re.compile(k,flags=reflags)) - except Exception ,e: - raise Errors.WafError("Invalid pattern: 
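The `path_from()` rewrite above climbs both nodes to their common ancestor and emits one `..` per remaining level on the base side (`lst.extend(['..']*up)`), falling back to `abspath()` when there is no common root. The same idea sketched over plain path strings (simplified; waf walks `Node.parent` links and uses `os.sep`):

```python
def path_from(base, target):
    base_parts = [x for x in base.split('/') if x]
    tgt_parts = [x for x in target.split('/') if x]
    i = 0
    # advance past the common ancestor
    while i < min(len(base_parts), len(tgt_parts)) and base_parts[i] == tgt_parts[i]:
        i += 1
    lst = ['..'] * (len(base_parts) - i) + tgt_parts[i:]
    return '/'.join(lst) or '.'
```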
%s"%k,e) - ret.append(accu) - return ret - def filtre(name,nn): - ret=[] - for lst in nn: - if not lst: - pass - elif lst[0]=='**': - ret.append(lst) - if len(lst)>1: - if lst[1].match(name): - ret.append(lst[2:]) - else: - ret.append([]) - elif lst[0].match(name): - ret.append(lst[1:]) - return ret - def accept(name,pats): - nacc=filtre(name,pats[0]) - nrej=filtre(name,pats[1]) - if[]in nrej: - nacc=[] - return[nacc,nrej] - ret=[x for x in self.ant_iter(accept=accept,pats=[to_pat(incl),to_pat(excl)],maxdepth=kw.get('maxdepth',25),dir=dir,src=src,remove=kw.get('remove',True))] - if kw.get('flat',False): - return' '.join([x.path_from(self)for x in ret]) - return ret + remove=kw.get('remove',True) + maxdepth=kw.get('maxdepth',25) + ignorecase=kw.get('ignorecase',False) + quiet=kw.get('quiet',False) + pats=(ant_matcher(incl,ignorecase),ant_matcher(excl,ignorecase)) + if kw.get('generator'): + return Utils.lazy_generator(self.ant_iter,(ant_sub_matcher,maxdepth,pats,dir,src,remove,quiet)) + it=self.ant_iter(ant_sub_matcher,maxdepth,pats,dir,src,remove,quiet) + if kw.get('flat'): + return' '.join(x.path_from(self)for x in it) + return list(it) def is_src(self): cur=self - x=id(self.ctx.srcnode) - y=id(self.ctx.bldnode) + x=self.ctx.srcnode + y=self.ctx.bldnode while cur.parent: - if id(cur)==y: + if cur is y: return False - if id(cur)==x: + if cur is x: return True cur=cur.parent return False def is_bld(self): cur=self - y=id(self.ctx.bldnode) + y=self.ctx.bldnode while cur.parent: - if id(cur)==y: + if cur is y: return True cur=cur.parent return False def get_src(self): cur=self - x=id(self.ctx.srcnode) - y=id(self.ctx.bldnode) + x=self.ctx.srcnode + y=self.ctx.bldnode lst=[] while cur.parent: - if id(cur)==y: + if cur is y: lst.reverse() - return self.ctx.srcnode.make_node(lst) - if id(cur)==x: + return x.make_node(lst) + if cur is x: return self lst.append(cur.name) cur=cur.parent return self def get_bld(self): cur=self - x=id(self.ctx.srcnode) - 
y=id(self.ctx.bldnode) + x=self.ctx.srcnode + y=self.ctx.bldnode lst=[] while cur.parent: - if id(cur)==y: + if cur is y: return self - if id(cur)==x: + if cur is x: lst.reverse() return self.ctx.bldnode.make_node(lst) lst.append(cur.name) @@ -414,42 +407,25 @@ class Node(object): return self.ctx.bldnode.make_node(['__root__']+lst) def find_resource(self,lst): if isinstance(lst,str): - lst=[x for x in split_path(lst)if x and x!='.'] + lst=[x for x in Utils.split_path(lst)if x and x!='.'] node=self.get_bld().search_node(lst) if not node: - self=self.get_src() - node=self.find_node(lst) - if node: - if os.path.isdir(node.abspath()): - return None + node=self.get_src().find_node(lst) + if node and node.isdir(): + return None return node def find_or_declare(self,lst): - if isinstance(lst,str): - lst=[x for x in split_path(lst)if x and x!='.'] - node=self.get_bld().search_node(lst) - if node: - if not os.path.isfile(node.abspath()): - node.sig=None - node.parent.mkdir() - return node - self=self.get_src() - node=self.find_node(lst) - if node: - if not os.path.isfile(node.abspath()): - node.sig=None - node.parent.mkdir() - return node - node=self.get_bld().make_node(lst) + if isinstance(lst,str)and os.path.isabs(lst): + node=self.ctx.root.make_node(lst) + else: + node=self.get_bld().make_node(lst) node.parent.mkdir() return node def find_dir(self,lst): if isinstance(lst,str): - lst=[x for x in split_path(lst)if x and x!='.'] + lst=[x for x in Utils.split_path(lst)if x and x!='.'] node=self.find_node(lst) - try: - if not os.path.isdir(node.abspath()): - return None - except(OSError,AttributeError): + if node and not node.isdir(): return None return node def change_ext(self,ext,ext_in=None): @@ -469,22 +445,33 @@ class Node(object): return self.path_from(self.ctx.srcnode) def relpath(self): cur=self - x=id(self.ctx.bldnode) + x=self.ctx.bldnode while cur.parent: - if id(cur)==x: + if cur is x: return self.bldpath() cur=cur.parent return self.srcpath() def bld_dir(self): 
return self.parent.bldpath() + def h_file(self): + return Utils.h_file(self.abspath()) def get_bld_sig(self): try: - return self.cache_sig + cache=self.ctx.cache_sig except AttributeError: - pass - if not self.is_bld()or self.ctx.bldnode is self.ctx.srcnode: - self.sig=Utils.h_file(self.abspath()) - self.cache_sig=ret=self.sig + cache=self.ctx.cache_sig={} + try: + ret=cache[self] + except KeyError: + p=self.abspath() + try: + ret=cache[self]=self.h_file() + except EnvironmentError: + if self.isdir(): + st=os.stat(p) + ret=cache[self]=Utils.h_list([p,st.st_ino,st.st_mode]) + return ret + raise return ret pickle_lock=Utils.threading.Lock() class Nod3(Node): diff --git a/waflib/Options.py b/waflib/Options.py index 5101f5f..b61c60a 100644 --- a/waflib/Options.py +++ b/waflib/Options.py @@ -3,18 +3,26 @@ # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file import os,tempfile,optparse,sys,re -from waflib import Logs,Utils,Context -cmds='distclean configure build install clean uninstall check dist distcheck'.split() -options={} +from waflib import Logs,Utils,Context,Errors +options=optparse.Values() commands=[] envvars=[] lockfile=os.environ.get('WAFLOCK','.lock-waf_%s_build'%sys.platform) -platform=Utils.unversioned_sys_platform() class opt_parser(optparse.OptionParser): - def __init__(self,ctx): - optparse.OptionParser.__init__(self,conflict_handler="resolve",version='waf %s (%s)'%(Context.WAFVERSION,Context.WAFREVISION)) + def __init__(self,ctx,allow_unknown=False): + optparse.OptionParser.__init__(self,conflict_handler='resolve',add_help_option=False,version='waf %s (%s)'%(Context.WAFVERSION,Context.WAFREVISION)) self.formatter.width=Logs.get_term_cols() self.ctx=ctx + self.allow_unknown=allow_unknown + def _process_args(self,largs,rargs,values): + while rargs: + try: + optparse.OptionParser._process_args(self,largs,rargs,values) + except(optparse.BadOptionError,optparse.AmbiguousOptionError)as e: + if self.allow_unknown: + 
largs.append(e.opt_str) + else: + self.error(str(e)) def print_usage(self,file=None): return self.print_help(file) def get_usage(self): @@ -52,11 +60,18 @@ class OptionsContext(Context.Context): jobs=self.jobs() p=self.add_option color=os.environ.get('NOCOLOR','')and'no'or'auto' + if os.environ.get('CLICOLOR','')=='0': + color='no' + elif os.environ.get('CLICOLOR_FORCE','')=='1': + color='yes' p('-c','--color',dest='colors',default=color,action='store',help='whether to use colors (yes/no/auto) [default: auto]',choices=('yes','no','auto')) p('-j','--jobs',dest='jobs',default=jobs,type='int',help='amount of parallel jobs (%r)'%jobs) p('-k','--keep',dest='keep',default=0,action='count',help='continue despite errors (-kk to try harder)') p('-v','--verbose',dest='verbose',default=0,action='count',help='verbosity level -v -vv or -vvv [default: 0]') p('--zones',dest='zones',default='',action='store',help='debugging zones (task_gen, deps, tasks, etc)') + p('--profile',dest='profile',default=0,action='store_true',help=optparse.SUPPRESS_HELP) + p('--pdb',dest='pdb',default=0,action='store_true',help=optparse.SUPPRESS_HELP) + p('-h','--help',dest='whelp',default=0,action='store_true',help="show this help message and exit") gr=self.add_option_group('Configuration options') self.option_groups['configure options']=gr gr.add_option('-o','--out',action='store',default='',help='build dir for the project',dest='out') @@ -66,7 +81,7 @@ class OptionsContext(Context.Context): gr.add_option('--no-lock-in-top',action='store_true',default='',help=optparse.SUPPRESS_HELP,dest='no_lock_in_top') default_prefix=getattr(Context.g_module,'default_prefix',os.environ.get('PREFIX')) if not default_prefix: - if platform=='win32': + if Utils.unversioned_sys_platform()=='win32': d=tempfile.gettempdir() default_prefix=d[0].upper()+d[1:] else: @@ -101,7 +116,7 @@ class OptionsContext(Context.Context): if not count and os.name not in('nt','java'): try: 
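The `opt_parser._process_args` override shown above lets waf collect unknown options into the leftover arguments when `allow_unknown` is set, instead of aborting with a usage error. A trimmed-down, runnable version of that subclass:

```python
import optparse

class TolerantParser(optparse.OptionParser):
    def __init__(self, allow_unknown=False, **kw):
        optparse.OptionParser.__init__(self, **kw)
        self.allow_unknown = allow_unknown

    def _process_args(self, largs, rargs, values):
        # keep consuming args; unknown options become positional leftovers
        while rargs:
            try:
                optparse.OptionParser._process_args(self, largs, rargs, values)
            except (optparse.BadOptionError, optparse.AmbiguousOptionError) as e:
                if self.allow_unknown:
                    largs.append(e.opt_str)
                else:
                    self.error(str(e))
```

With `allow_unknown=True`, `--some-future-flag` ends up in the leftover list rather than killing the parse, which is what allows a second, command-specific parsing pass.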
 				tmp=self.cmd_and_log(['sysctl','-n','hw.ncpu'],quiet=0)
-			except Exception:
+			except Errors.WafError:
 				pass
 			else:
 				if re.match('^[0-9]+$',tmp):
@@ -128,20 +143,58 @@
 			if group.title==opt_str:
 				return group
 		return None
-	def parse_args(self,_args=None):
-		global options,commands,envvars
+	def sanitize_path(self,path,cwd=None):
+		if not cwd:
+			cwd=Context.launch_dir
+		p=os.path.expanduser(path)
+		p=os.path.join(cwd,p)
+		p=os.path.normpath(p)
+		p=os.path.abspath(p)
+		return p
+	def parse_cmd_args(self,_args=None,cwd=None,allow_unknown=False):
+		self.parser.allow_unknown=allow_unknown
 		(options,leftover_args)=self.parser.parse_args(args=_args)
+		envvars=[]
+		commands=[]
 		for arg in leftover_args:
 			if'='in arg:
 				envvars.append(arg)
-			else:
+			elif arg!='options':
 				commands.append(arg)
-		if options.destdir:
-			options.destdir=Utils.sane_path(options.destdir)
+		for name in'top out destdir prefix bindir libdir'.split():
+			if getattr(options,name,None):
+				path=self.sanitize_path(getattr(options,name),cwd)
+				setattr(options,name,path)
+		return options,commands,envvars
+	def init_module_vars(self,arg_options,arg_commands,arg_envvars):
+		options.__dict__.clear()
+		del commands[:]
+		del envvars[:]
+		options.__dict__.update(arg_options.__dict__)
+		commands.extend(arg_commands)
+		envvars.extend(arg_envvars)
+		for var in envvars:
+			(name,value)=var.split('=',1)
+			os.environ[name.strip()]=value
+	def init_logs(self,options,commands,envvars):
+		Logs.verbose=options.verbose
 		if options.verbose>=1:
 			self.load('errcheck')
 		colors={'yes':2,'auto':1,'no':0}[options.colors]
 		Logs.enable_colors(colors)
+		if options.zones:
+			Logs.zones=options.zones.split(',')
+			if not Logs.verbose:
+				Logs.verbose=1
+		elif Logs.verbose>0:
+			Logs.zones=['runner']
+		if Logs.verbose>2:
+			Logs.zones=['*']
+	def parse_args(self,_args=None):
+		options,commands,envvars=self.parse_cmd_args()
+		self.init_logs(options,commands,envvars)
+		self.init_module_vars(options,commands,envvars)
 	def execute(self):
 		super(OptionsContext,self).execute()
 		self.parse_args()
+		Utils.alloc_process_pool(options.jobs)
diff --git a/waflib/Runner.py b/waflib/Runner.py
index c22503e..e443021 100644
--- a/waflib/Runner.py
+++ b/waflib/Runner.py
@@ -2,17 +2,72 @@
 # encoding: utf-8
 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
-import random,atexit
+import heapq,traceback
 try:
-	from queue import Queue
+	from queue import Queue,PriorityQueue
 except ImportError:
 	from Queue import Queue
+	try:
+		from Queue import PriorityQueue
+	except ImportError:
+		class PriorityQueue(Queue):
+			def _init(self,maxsize):
+				self.maxsize=maxsize
+				self.queue=[]
+			def _put(self,item):
+				heapq.heappush(self.queue,item)
+			def _get(self):
+				return heapq.heappop(self.queue)
 from waflib import Utils,Task,Errors,Logs
-GAP=10
-class TaskConsumer(Utils.threading.Thread):
+GAP=5
+class PriorityTasks(object):
 	def __init__(self):
+		self.lst=[]
+	def __len__(self):
+		return len(self.lst)
+	def __iter__(self):
+		return iter(self.lst)
+	def __str__(self):
+		return'PriorityTasks: [%s]'%'\n  '.join(str(x)for x in self.lst)
+	def clear(self):
+		self.lst=[]
+	def append(self,task):
+		heapq.heappush(self.lst,task)
+	def appendleft(self,task):
+		heapq.heappush(self.lst,task)
+	def pop(self):
+		return heapq.heappop(self.lst)
+	def extend(self,lst):
+		if self.lst:
+			for x in lst:
+				self.append(x)
+		else:
+			if isinstance(lst,list):
+				self.lst=lst
+				heapq.heapify(lst)
+			else:
+				self.lst=lst.lst
+class Consumer(Utils.threading.Thread):
+	def __init__(self,spawner,task):
+		Utils.threading.Thread.__init__(self)
+		self.task=task
+		self.spawner=spawner
+		self.setDaemon(1)
+		self.start()
+	def run(self):
+		try:
+			if not self.spawner.master.stop:
+				self.spawner.master.process_task(self.task)
+		finally:
+			self.spawner.sem.release()
+			self.spawner.master.out.put(self.task)
+			self.task=None
+			self.spawner=None
+class Spawner(Utils.threading.Thread):
+	def __init__(self,master):
 		Utils.threading.Thread.__init__(self)
-		self.ready=Queue()
+		self.master=master
+		self.sem=Utils.threading.Semaphore(master.numjobs)
 		self.setDaemon(1)
 		self.start()
 	def run(self):
@@ -21,133 +76,173 @@
 		except Exception:
 			pass
 	def loop(self):
+		master=self.master
 		while 1:
-			tsk=self.ready.get()
-			if not isinstance(tsk,Task.TaskBase):
-				tsk(self)
-			else:
-				tsk.process()
-pool=Queue()
-def get_pool():
-	try:
-		return pool.get(False)
-	except Exception:
-		return TaskConsumer()
-def put_pool(x):
-	pool.put(x)
-def _free_resources():
-	global pool
-	lst=[]
-	while pool.qsize():
-		lst.append(pool.get())
-	for x in lst:
-		x.ready.put(None)
-	for x in lst:
-		x.join()
-	pool=None
-atexit.register(_free_resources)
+			task=master.ready.get()
+			self.sem.acquire()
+			if not master.stop:
+				task.log_display(task.generator.bld)
+			Consumer(self,task)
 class Parallel(object):
 	def __init__(self,bld,j=2):
 		self.numjobs=j
 		self.bld=bld
-		self.outstanding=[]
-		self.frozen=[]
+		self.outstanding=PriorityTasks()
+		self.postponed=PriorityTasks()
+		self.incomplete=set()
+		self.ready=PriorityQueue(0)
 		self.out=Queue(0)
 		self.count=0
-		self.processed=1
+		self.processed=0
 		self.stop=False
 		self.error=[]
 		self.biter=None
 		self.dirty=False
+		self.revdeps=Utils.defaultdict(set)
+		self.spawner=None
+		if self.numjobs>1:
+			self.spawner=Spawner(self)
 	def get_next_task(self):
 		if not self.outstanding:
 			return None
-		return self.outstanding.pop(0)
+		return self.outstanding.pop()
 	def postpone(self,tsk):
-		if random.randint(0,1):
-			self.frozen.insert(0,tsk)
-		else:
-			self.frozen.append(tsk)
+		self.postponed.append(tsk)
 	def refill_task_list(self):
 		while self.count>self.numjobs*GAP:
 			self.get_out()
 		while not self.outstanding:
 			if self.count:
 				self.get_out()
-			elif self.frozen:
+				if self.outstanding:
+					break
+			elif self.postponed:
 				try:
 					cond=self.deadlock==self.processed
 				except AttributeError:
 					pass
 				else:
 					if cond:
-						msg='check the build order for the tasks'
-						for tsk in self.frozen:
-							if not tsk.run_after:
-								msg='check the methods runnable_status'
-								break
 						lst=[]
-						for tsk in self.frozen:
-							lst.append('%s\t-> %r'%(repr(tsk),[id(x)for x in tsk.run_after]))
-						raise Errors.WafError('Deadlock detected: %s%s'%(msg,''.join(lst)))
+						for tsk in self.postponed:
+							deps=[id(x)for x in tsk.run_after if not x.hasrun]
+							lst.append('%s\t-> %r'%(repr(tsk),deps))
+							if not deps:
+								lst.append('\n  task %r dependencies are done, check its *runnable_status*?'%id(tsk))
+						raise Errors.WafError('Deadlock detected: check the task build order%s'%''.join(lst))
 				self.deadlock=self.processed
-			if self.frozen:
-				self.outstanding+=self.frozen
-				self.frozen=[]
+			if self.postponed:
+				self.outstanding.extend(self.postponed)
+				self.postponed.clear()
 			elif not self.count:
-				self.outstanding.extend(self.biter.next())
-				self.total=self.bld.total()
-				break
+				if self.incomplete:
+					for x in self.incomplete:
+						for k in x.run_after:
+							if not k.hasrun:
+								break
+						else:
+							self.incomplete.remove(x)
+							self.outstanding.append(x)
+							break
+					else:
+						if self.stop or self.error:
+							break
+						raise Errors.WafError('Broken revdeps detected on %r'%self.incomplete)
+				else:
+					tasks=next(self.biter)
+					ready,waiting=self.prio_and_split(tasks)
+					self.outstanding.extend(ready)
+					self.incomplete.update(waiting)
+					self.total=self.bld.total()
+					break
 	def add_more_tasks(self,tsk):
 		if getattr(tsk,'more_tasks',None):
-			self.outstanding+=tsk.more_tasks
+			more=set(tsk.more_tasks)
+			groups_done=set()
+			def iteri(a,b):
+				for x in a:
+					yield x
+				for x in b:
+					yield x
+			for x in iteri(self.outstanding,self.incomplete):
+				for k in x.run_after:
+					if isinstance(k,Task.TaskGroup):
+						if k not in groups_done:
+							groups_done.add(k)
+							for j in k.prev&more:
+								self.revdeps[j].add(k)
+					elif k in more:
+						self.revdeps[k].add(x)
+			ready,waiting=self.prio_and_split(tsk.more_tasks)
+			self.outstanding.extend(ready)
+			self.incomplete.update(waiting)
 			self.total+=len(tsk.more_tasks)
+	def mark_finished(self,tsk):
+		def try_unfreeze(x):
+			if x in self.incomplete:
+				for k in x.run_after:
+					if not k.hasrun:
+						break
+				else:
+					self.incomplete.remove(x)
+					self.outstanding.append(x)
+		if tsk in self.revdeps:
+			for x in self.revdeps[tsk]:
+				if isinstance(x,Task.TaskGroup):
+					x.prev.remove(tsk)
+					if not x.prev:
+						for k in x.next:
+							k.run_after.remove(x)
+							try_unfreeze(k)
+						x.next=[]
+				else:
+					try_unfreeze(x)
+			del self.revdeps[tsk]
+		if hasattr(tsk,'semaphore'):
+			sem=tsk.semaphore
+			sem.release(tsk)
+			while sem.waiting and not sem.is_locked():
+				x=sem.waiting.pop()
+				self._add_task(x)
 	def get_out(self):
 		tsk=self.out.get()
 		if not self.stop:
 			self.add_more_tasks(tsk)
+		self.mark_finished(tsk)
 		self.count-=1
 		self.dirty=True
 		return tsk
 	def add_task(self,tsk):
-		try:
-			self.pool
-		except AttributeError:
-			self.init_task_pool()
 		self.ready.put(tsk)
-	def init_task_pool(self):
-		pool=self.pool=[get_pool()for i in range(self.numjobs)]
-		self.ready=Queue(0)
-		def setq(consumer):
-			consumer.ready=self.ready
-		for x in pool:
-			x.ready.put(setq)
-		return pool
-	def free_task_pool(self):
-		def setq(consumer):
-			consumer.ready=Queue(0)
-			self.out.put(self)
-		try:
-			pool=self.pool
-		except AttributeError:
-			pass
+	def _add_task(self,tsk):
+		if hasattr(tsk,'semaphore'):
+			sem=tsk.semaphore
+			try:
+				sem.acquire(tsk)
+			except IndexError:
+				sem.waiting.add(tsk)
+				return
+		self.count+=1
+		self.processed+=1
+		if self.numjobs==1:
+			tsk.log_display(tsk.generator.bld)
+			try:
+				self.process_task(tsk)
+			finally:
+				self.out.put(tsk)
 		else:
-			for x in pool:
-				self.ready.put(setq)
-			for x in pool:
-				self.get_out()
-			for x in pool:
-				put_pool(x)
-			self.pool=[]
+			self.add_task(tsk)
+	def process_task(self,tsk):
+		tsk.process()
+		if tsk.hasrun!=Task.SUCCESS:
+			self.error_handler(tsk)
 	def skip(self,tsk):
 		tsk.hasrun=Task.SKIPPED
+		self.mark_finished(tsk)
+	def cancel(self,tsk):
+		tsk.hasrun=Task.CANCELED
+		self.mark_finished(tsk)
 	def error_handler(self,tsk):
-		if hasattr(tsk,'scan')and hasattr(tsk,'uid'):
-			key=(tsk.uid(),'imp')
-			try:
-				del self.bld.task_sigs[key]
-			except KeyError:
-				pass
 		if not self.bld.keep:
 			self.stop=True
 		self.error.append(tsk)
@@ -156,7 +251,7 @@ class Parallel(object):
 			return tsk.runnable_status()
 		except Exception:
 			self.processed+=1
-			tsk.err_msg=Utils.ex_stack()
+			tsk.err_msg=traceback.format_exc()
 			if not self.stop and self.bld.keep:
 				self.skip(tsk)
 				if self.bld.keep==1:
@@ -187,21 +282,90 @@ class Parallel(object):
 				break
 			st=self.task_status(tsk)
 			if st==Task.RUN_ME:
-				tsk.position=(self.processed,self.total)
-				self.count+=1
-				tsk.master=self
-				self.processed+=1
-				if self.numjobs==1:
-					tsk.process()
-				else:
-					self.add_task(tsk)
-			if st==Task.ASK_LATER:
+				self._add_task(tsk)
+			elif st==Task.ASK_LATER:
 				self.postpone(tsk)
 			elif st==Task.SKIP_ME:
 				self.processed+=1
 				self.skip(tsk)
 				self.add_more_tasks(tsk)
+			elif st==Task.CANCEL_ME:
+				if Logs.verbose>1:
+					self.error.append(tsk)
+				self.processed+=1
+				self.cancel(tsk)
 		while self.error and self.count:
 			self.get_out()
-		assert(self.count==0 or self.stop)
-		self.free_task_pool()
+		self.ready.put(None)
+		if not self.stop:
+			assert not self.count
+			assert not self.postponed
+			assert not self.incomplete
+	def prio_and_split(self,tasks):
+		for x in tasks:
+			x.visited=0
+		reverse=self.revdeps
+		groups_done=set()
+		for x in tasks:
+			for k in x.run_after:
+				if isinstance(k,Task.TaskGroup):
+					if k not in groups_done:
+						groups_done.add(k)
+						for j in k.prev:
+							reverse[j].add(k)
+				else:
+					reverse[k].add(x)
+		def visit(n):
+			if isinstance(n,Task.TaskGroup):
+				return sum(visit(k)for k in n.next)
+			if n.visited==0:
+				n.visited=1
+				if n in reverse:
+					rev=reverse[n]
+					n.prio_order=n.tree_weight+len(rev)+sum(visit(k)for k in rev)
+				else:
+					n.prio_order=n.tree_weight
+				n.visited=2
+			elif n.visited==1:
+				raise Errors.WafError('Dependency cycle found!')
+			return n.prio_order
+		for x in tasks:
+			if x.visited!=0:
+				continue
+			try:
+				visit(x)
+			except Errors.WafError:
+				self.debug_cycles(tasks,reverse)
+		ready=[]
+		waiting=[]
+		for x in tasks:
+			for k in x.run_after:
+				if not k.hasrun:
+					waiting.append(x)
+					break
+			else:
+				ready.append(x)
+		return(ready,waiting)
+	def debug_cycles(self,tasks,reverse):
+		tmp={}
+		for x in tasks:
+			tmp[x]=0
+		def visit(n,acc):
+			if isinstance(n,Task.TaskGroup):
+				for k in n.next:
+					visit(k,acc)
+				return
+			if tmp[n]==0:
+				tmp[n]=1
+				for k in reverse.get(n,[]):
+					visit(k,[n]+acc)
+				tmp[n]=2
+			elif tmp[n]==1:
+				lst=[]
+				for tsk in acc:
+					lst.append(repr(tsk))
+					if tsk is n:
+						break
+				raise Errors.WafError('Task dependency cycle in "run_after" constraints: %s'%''.join(lst))
+		for x in tasks:
+			visit(x,[])
diff --git a/waflib/Scripting.py b/waflib/Scripting.py
index 7798bb0..7db951d 100644
--- a/waflib/Scripting.py
+++ b/waflib/Scripting.py
@@ -2,6 +2,7 @@
 # encoding: utf-8
 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
+from __future__ import with_statement
 import os,shlex,shutil,traceback,errno,sys,stat
 from waflib import Utils,Configure,Logs,Options,ConfigSet,Context,Errors,Build,Node
 build_dir_override=None
@@ -10,51 +11,50 @@ default_cmd="build"
 def waf_entry_point(current_directory,version,wafdir):
 	Logs.init_log()
 	if Context.WAFVERSION!=version:
-		Logs.error('Waf script %r and library %r do not match (directory %r)'%(version,Context.WAFVERSION,wafdir))
+		Logs.error('Waf script %r and library %r do not match (directory %r)',version,Context.WAFVERSION,wafdir)
 		sys.exit(1)
-	if'--version'in sys.argv:
-		Context.run_dir=current_directory
-		ctx=Context.create_context('options')
-		ctx.curdir=current_directory
-		ctx.parse_args()
-		sys.exit(0)
+	Context.waf_dir=wafdir
+	Context.run_dir=Context.launch_dir=current_directory
+	start_dir=current_directory
+	no_climb=os.environ.get('NOCLIMB')
 	if len(sys.argv)>1:
 		potential_wscript=os.path.join(current_directory,sys.argv[1])
-		if os.path.basename(potential_wscript)=='wscript'and os.path.isfile(potential_wscript):
-			current_directory=os.path.normpath(os.path.dirname(potential_wscript))
+		if os.path.basename(potential_wscript)==Context.WSCRIPT_FILE and os.path.isfile(potential_wscript):
+			path=os.path.normpath(os.path.dirname(potential_wscript))
+			start_dir=os.path.abspath(path)
+			no_climb=True
 			sys.argv.pop(1)
-	Context.waf_dir=wafdir
-	Context.launch_dir=current_directory
-	no_climb=os.environ.get('NOCLIMB',None)
+	ctx=Context.create_context('options')
+	(options,commands,env)=ctx.parse_cmd_args(allow_unknown=True)
+	if options.top:
+		start_dir=Context.run_dir=Context.top_dir=options.top
+		no_climb=True
+	if options.out:
+		Context.out_dir=options.out
 	if not no_climb:
 		for k in no_climb_commands:
-			for y in sys.argv:
+			for y in commands:
 				if y.startswith(k):
 					no_climb=True
 					break
-	for i,x in enumerate(sys.argv):
-		if x.startswith('--top='):
-			Context.run_dir=Context.top_dir=Utils.sane_path(x[6:])
-			sys.argv[i]='--top='+Context.run_dir
-		if x.startswith('--out='):
-			Context.out_dir=Utils.sane_path(x[6:])
-			sys.argv[i]='--out='+Context.out_dir
-	cur=current_directory
-	while cur and not Context.top_dir:
+	cur=start_dir
+	while cur:
 		try:
 			lst=os.listdir(cur)
 		except OSError:
 			lst=[]
-			Logs.error('Directory %r is unreadable!'%cur)
+			Logs.error('Directory %r is unreadable!',cur)
 		if Options.lockfile in lst:
 			env=ConfigSet.ConfigSet()
 			try:
 				env.load(os.path.join(cur,Options.lockfile))
 				ino=os.stat(cur)[stat.ST_INO]
-			except Exception:
+			except EnvironmentError:
 				pass
 			else:
 				for x in(env.run_dir,env.top_dir,env.out_dir):
+					if not x:
+						continue
 					if Utils.is_win32:
 						if cur==x:
 							load=True
@@ -69,7 +69,7 @@ def waf_entry_point(current_directory,version,wafdir):
 						load=True
 						break
 				else:
-					Logs.warn('invalid lock file in %s'%cur)
+					Logs.warn('invalid lock file in %s',cur)
 					load=False
 				if load:
 					Context.run_dir=env.run_dir
@@ -85,46 +85,59 @@ def waf_entry_point(current_directory,version,wafdir):
 		cur=next
 		if no_climb:
 			break
-	if not Context.run_dir:
-		if'-h'in sys.argv or'--help'in sys.argv:
-			Logs.warn('No wscript file found: the help message may be incomplete')
-			Context.run_dir=current_directory
-			ctx=Context.create_context('options')
-			ctx.curdir=current_directory
-			ctx.parse_args()
+	wscript=os.path.normpath(os.path.join(Context.run_dir,Context.WSCRIPT_FILE))
+	if not os.path.exists(wscript):
+		if options.whelp:
+			Logs.warn('These are the generic options (no wscript/project found)')
+			ctx.parser.print_help()
 			sys.exit(0)
-		Logs.error('Waf: Run from a directory containing a file named %r'%Context.WSCRIPT_FILE)
+		Logs.error('Waf: Run from a folder containing a %r file (or try -h for the generic options)',Context.WSCRIPT_FILE)
 		sys.exit(1)
 	try:
 		os.chdir(Context.run_dir)
 	except OSError:
-		Logs.error('Waf: The folder %r is unreadable'%Context.run_dir)
+		Logs.error('Waf: The folder %r is unreadable',Context.run_dir)
 		sys.exit(1)
 	try:
-		set_main_module(os.path.normpath(os.path.join(Context.run_dir,Context.WSCRIPT_FILE)))
-	except Errors.WafError ,e:
+		set_main_module(wscript)
+	except Errors.WafError as e:
 		Logs.pprint('RED',e.verbose_msg)
 		Logs.error(str(e))
 		sys.exit(1)
-	except Exception ,e:
-		Logs.error('Waf: The wscript in %r is unreadable'%Context.run_dir,e)
+	except Exception as e:
+		Logs.error('Waf: The wscript in %r is unreadable',Context.run_dir)
 		traceback.print_exc(file=sys.stdout)
 		sys.exit(2)
-	try:
-		run_commands()
-	except Errors.WafError ,e:
-		if Logs.verbose>1:
-			Logs.pprint('RED',e.verbose_msg)
-		Logs.error(e.msg)
-		sys.exit(1)
-	except SystemExit:
-		raise
-	except Exception ,e:
-		traceback.print_exc(file=sys.stdout)
-		sys.exit(2)
-	except KeyboardInterrupt:
-		Logs.pprint('RED','Interrupted')
-		sys.exit(68)
+	if options.profile:
+		import cProfile,pstats
+		cProfile.runctx('from waflib import Scripting; Scripting.run_commands()',{},{},'profi.txt')
+		p=pstats.Stats('profi.txt')
+		p.sort_stats('time').print_stats(75)
+	else:
+		try:
+			try:
+				run_commands()
+			except:
+				if options.pdb:
+					import pdb
+					type,value,tb=sys.exc_info()
+					traceback.print_exc()
+					pdb.post_mortem(tb)
+				else:
+					raise
+		except Errors.WafError as e:
+			if Logs.verbose>1:
+				Logs.pprint('RED',e.verbose_msg)
+			Logs.error(e.msg)
+			sys.exit(1)
+		except SystemExit:
+			raise
+		except Exception as e:
+			traceback.print_exc(file=sys.stdout)
+			sys.exit(2)
+		except KeyboardInterrupt:
+			Logs.pprint('RED','Interrupted')
+			sys.exit(68)
 def set_main_module(file_path):
 	Context.g_module=Context.load_module(file_path)
 	Context.g_module.root_path=file_path
@@ -132,7 +145,7 @@ def set_main_module(file_path):
 			name=obj.__name__
 			if not name in Context.g_module.__dict__:
 				setattr(Context.g_module,name,obj)
-	for k in(update,dist,distclean,distcheck):
+	for k in(dist,distclean,distcheck):
 		set_def(k)
 	if not'init'in Context.g_module.__dict__:
 		Context.g_module.init=Utils.nada
@@ -141,22 +154,13 @@ def set_main_module(file_path):
 	if not'options'in Context.g_module.__dict__:
 		Context.g_module.options=Utils.nada
 def parse_options():
-	Context.create_context('options').execute()
-	for var in Options.envvars:
-		(name,value)=var.split('=',1)
-		os.environ[name.strip()]=value
+	ctx=Context.create_context('options')
+	ctx.execute()
 	if not Options.commands:
-		Options.commands=[default_cmd]
-	Options.commands=[x for x in Options.commands if x!='options']
-	Logs.verbose=Options.options.verbose
-	if Options.options.zones:
-		Logs.zones=Options.options.zones.split(',')
-		if not Logs.verbose:
-			Logs.verbose=1
-	elif Logs.verbose>0:
-		Logs.zones=['runner']
-	if Logs.verbose>2:
-		Logs.zones=['*']
+		Options.commands.append(default_cmd)
+	if Options.options.whelp:
+		ctx.parser.print_help()
+		sys.exit(0)
 def run_command(cmd_name):
 	ctx=Context.create_context(cmd_name)
 	ctx.log_timer=Utils.Timer()
@@ -173,62 +177,64 @@ def run_commands():
 	while Options.commands:
 		cmd_name=Options.commands.pop(0)
 		ctx=run_command(cmd_name)
-		Logs.info('%r finished successfully (%s)'%(cmd_name,str(ctx.log_timer)))
+		Logs.info('%r finished successfully (%s)',cmd_name,ctx.log_timer)
 	run_command('shutdown')
-def _can_distclean(name):
-	for k in'.o .moc .exe'.split():
-		if name.endswith(k):
-			return True
-	return False
 def distclean_dir(dirname):
 	for(root,dirs,files)in os.walk(dirname):
 		for f in files:
-			if _can_distclean(f):
+			if f.endswith(('.o','.moc','.exe')):
 				fname=os.path.join(root,f)
 				try:
 					os.remove(fname)
 				except OSError:
-					Logs.warn('Could not remove %r'%fname)
+					Logs.warn('Could not remove %r',fname)
 	for x in(Context.DBFILE,'config.log'):
 		try:
 			os.remove(x)
 		except OSError:
 			pass
 	try:
-		shutil.rmtree('c4che')
+		shutil.rmtree(Build.CACHE_DIR)
 	except OSError:
 		pass
 def distclean(ctx):
-	'''removes the build directory'''
-	lst=os.listdir('.')
-	for f in lst:
-		if f==Options.lockfile:
-			try:
-				proj=ConfigSet.ConfigSet(f)
-			except IOError:
-				Logs.warn('Could not read %r'%f)
-				continue
-			if proj['out_dir']!=proj['top_dir']:
-				try:
-					shutil.rmtree(proj['out_dir'])
-				except IOError:
-					pass
-				except OSError ,e:
-					if e.errno!=errno.ENOENT:
-						Logs.warn('Could not remove %r'%proj['out_dir'])
-			else:
-				distclean_dir(proj['out_dir'])
-			for k in(proj['out_dir'],proj['top_dir'],proj['run_dir']):
-				p=os.path.join(k,Options.lockfile)
-				try:
-					os.remove(p)
-				except OSError ,e:
-					if e.errno!=errno.ENOENT:
-						Logs.warn('Could not remove %r'%p)
-		if not Options.commands:
-			for x in'.waf-1. waf-1. .waf3-1. waf3-1.'.split():
-				if f.startswith(x):
-					shutil.rmtree(f,ignore_errors=True)
+	'''removes build folders and data'''
+	def remove_and_log(k,fun):
+		try:
+			fun(k)
+		except EnvironmentError as e:
+			if e.errno!=errno.ENOENT:
+				Logs.warn('Could not remove %r',k)
+	if not Options.commands:
+		for k in os.listdir('.'):
+			for x in'.waf-2 waf-2 .waf3-2 waf3-2'.split():
+				if k.startswith(x):
+					remove_and_log(k,shutil.rmtree)
+	cur='.'
+	if ctx.options.no_lock_in_top:
+		cur=ctx.options.out
+	try:
+		lst=os.listdir(cur)
+	except OSError:
+		Logs.warn('Could not read %r',cur)
+		return
+	if Options.lockfile in lst:
+		f=os.path.join(cur,Options.lockfile)
+		try:
+			env=ConfigSet.ConfigSet(f)
+		except EnvironmentError:
+			Logs.warn('Could not read %r',f)
+			return
+		if not env.out_dir or not env.top_dir:
+			Logs.warn('Invalid lock file %r',f)
+			return
+		if env.out_dir==env.top_dir:
+			distclean_dir(env.out_dir)
+		else:
+			remove_and_log(env.out_dir,shutil.rmtree)
+		for k in(env.out_dir,env.top_dir,env.run_dir):
+			p=os.path.join(k,Options.lockfile)
+			remove_and_log(p,os.remove)
 class Dist(Context.Context):
 	'''creates an archive containing the project source code'''
 	cmd='dist'
@@ -252,13 +258,13 @@ class Dist(Context.Context):
 			pass
 		files=self.get_files()
 		if self.algo.startswith('tar.'):
-			tar=tarfile.open(arch_name,'w:'+self.algo.replace('tar.',''))
+			tar=tarfile.open(node.abspath(),'w:'+self.algo.replace('tar.',''))
 			for x in files:
 				self.add_tar_file(x,tar)
 			tar.close()
 		elif self.algo=='zip':
 			import zipfile
-			zip=zipfile.ZipFile(arch_name,'w',compression=zipfile.ZIP_DEFLATED)
+			zip=zipfile.ZipFile(node.abspath(),'w',compression=zipfile.ZIP_DEFLATED)
 			for x in files:
 				archive_name=self.get_base_name()+'/'+x.path_from(self.base_path)
 				zip.write(x.abspath(),archive_name,zipfile.ZIP_DEFLATED)
@@ -266,14 +272,12 @@
 		else:
 			self.fatal('Valid algo types are tar.bz2, tar.gz, tar.xz or zip')
 		try:
-			from hashlib import sha1 as sha
+			from hashlib import sha256
 		except ImportError:
-			from sha import sha
-		try:
-			digest=" (sha=%r)"%sha(node.read()).hexdigest()
-		except Exception:
 			digest=''
-		Logs.info('New archive created: %s%s'%(self.arch_name,digest))
+		else:
+			digest=' (sha256=%r)'%sha256(node.read(flags='rb')).hexdigest()
+		Logs.info('New archive created: %s%s',self.arch_name,digest)
 	def get_tar_path(self,node):
 		return node.abspath()
 	def add_tar_file(self,x,tar):
@@ -283,13 +287,11 @@ class Dist(Context.Context):
 		tinfo.gid=0
 		tinfo.uname='root'
 		tinfo.gname='root'
-		fu=None
-		try:
-			fu=open(p,'rb')
-			tar.addfile(tinfo,fileobj=fu)
-		finally:
-			if fu:
-				fu.close()
+		if os.path.isfile(p):
+			with open(p,'rb')as f:
+				tar.addfile(tinfo,fileobj=f)
+		else:
+			tar.addfile(tinfo)
 	def get_tar_prefix(self):
 		try:
 			return self.tar_prefix
@@ -313,7 +315,7 @@ class Dist(Context.Context):
 		try:
 			return self.excl
 		except AttributeError:
-			self.excl=Node.exclude_regs+' **/waf-1.8.* **/.waf-1.8* **/waf3-1.8.* **/.waf3-1.8* **/*~ **/*.rej **/*.orig **/*.pyc **/*.pyo **/*.bak **/*.swp **/.lock-w*'
+			self.excl=Node.exclude_regs+' **/waf-2.* **/.waf-2.* **/waf3-2.* **/.waf3-2.* **/*~ **/*.rej **/*.orig **/*.pyc **/*.pyo **/*.bak **/*.swp **/.lock-w*'
 			if Context.out_dir:
 				nd=self.root.find_node(Context.out_dir)
 				if nd:
@@ -335,52 +337,30 @@ class DistCheck(Dist):
 		self.recurse([os.path.dirname(Context.g_module.root_path)])
 		self.archive()
 		self.check()
-	def check(self):
-		import tempfile,tarfile
-		t=None
-		try:
-			t=tarfile.open(self.get_arch_name())
-			for x in t:
-				t.extract(x)
-		finally:
-			if t:
-				t.close()
+	def make_distcheck_cmd(self,tmpdir):
 		cfg=[]
 		if Options.options.distcheck_args:
 			cfg=shlex.split(Options.options.distcheck_args)
 		else:
 			cfg=[x for x in sys.argv if x.startswith('-')]
+		cmd=[sys.executable,sys.argv[0],'configure','build','install','uninstall','--destdir='+tmpdir]+cfg
+		return cmd
+	def check(self):
+		import tempfile,tarfile
+		with tarfile.open(self.get_arch_name())as t:
+			for x in t:
+				t.extract(x)
 		instdir=tempfile.mkdtemp('.inst',self.get_base_name())
-		ret=Utils.subprocess.Popen([sys.executable,sys.argv[0],'configure','install','uninstall','--destdir='+instdir]+cfg,cwd=self.get_base_name()).wait()
+		cmd=self.make_distcheck_cmd(instdir)
+		ret=Utils.subprocess.Popen(cmd,cwd=self.get_base_name()).wait()
 		if ret:
-			raise Errors.WafError('distcheck failed with code %i'%ret)
+			raise Errors.WafError('distcheck failed with code %r'%ret)
 		if os.path.exists(instdir):
 			raise Errors.WafError('distcheck succeeded, but files were left in %s'%instdir)
 		shutil.rmtree(self.get_base_name())
 def distcheck(ctx):
 	'''checks if the project compiles (tarball from 'dist')'''
 	pass
-def update(ctx):
-	lst=Options.options.files
-	if lst:
-		lst=lst.split(',')
-	else:
-		path=os.path.join(Context.waf_dir,'waflib','extras')
-		lst=[x for x in Utils.listdir(path)if x.endswith('.py')]
-	for x in lst:
-		tool=x.replace('.py','')
-		if not tool:
-			continue
-		try:
-			dl=Configure.download_tool
-		except AttributeError:
-			ctx.fatal('The command "update" is dangerous; include the tool "use_config" in your project!')
-		try:
-			dl(tool,force=True,ctx=ctx)
-		except Errors.WafError:
-			Logs.error('Could not find the tool %r in the remote repository'%x)
-		else:
-			Logs.warn('Updated %r'%tool)
 def autoconfigure(execute_method):
 	def execute(self):
 		if not Configure.autoconfig:
@@ -389,7 +369,7 @@ def autoconfigure(execute_method):
 		do_config=False
 		try:
 			env.load(os.path.join(Context.top_dir,Options.lockfile))
-		except Exception:
+		except EnvironmentError:
 			Logs.warn('Configuring the project')
 			do_config=True
 		else:
@@ -397,18 +377,27 @@ def autoconfigure(execute_method):
 				do_config=True
 			else:
 				h=0
-				for f in env['files']:
-					h=Utils.h_list((h,Utils.readf(f,'rb')))
-				do_config=h!=env.hash
+				for f in env.files:
+					try:
+						h=Utils.h_list((h,Utils.readf(f,'rb')))
+					except EnvironmentError:
+						do_config=True
+						break
+				else:
+					do_config=h!=env.hash
 		if do_config:
-			cmd=env['config_cmd']or'configure'
+			cmd=env.config_cmd or'configure'
 			if Configure.autoconfig=='clobber':
 				tmp=Options.options.__dict__
-				Options.options.__dict__=env.options
+				launch_dir_tmp=Context.launch_dir
+				if env.options:
+					Options.options.__dict__=env.options
+				Context.launch_dir=env.launch_dir
 				try:
 					run_command(cmd)
 				finally:
 					Options.options.__dict__=tmp
+					Context.launch_dir=launch_dir_tmp
 			else:
 				run_command(cmd)
 		run_command(self.cmd)
diff --git a/waflib/Task.py b/waflib/Task.py
index 6070f68..f0e2397 100644
--- a/waflib/Task.py
+++ b/waflib/Task.py
@@ -2,52 +2,71 @@
 # encoding: utf-8
 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
-import os,re,sys
+import os,re,sys,tempfile,traceback
 from waflib import Utils,Logs,Errors
 NOT_RUN=0
 MISSING=1
 CRASHED=2
 EXCEPTION=3
+CANCELED=4
 SKIPPED=8
 SUCCESS=9
 ASK_LATER=-1
 SKIP_ME=-2
 RUN_ME=-3
+CANCEL_ME=-4
 COMPILE_TEMPLATE_SHELL='''
 def f(tsk):
 	env = tsk.env
 	gen = tsk.generator
 	bld = gen.bld
-	cwdx = getattr(bld, 'cwdx', bld.bldnode) # TODO single cwd value in waf 1.9
-	wd = getattr(tsk, 'cwd', None)
+	cwdx = tsk.get_cwd()
 	p = env.get_flat
+	def to_list(xx):
+		if isinstance(xx, str): return [xx]
+		return xx
 	tsk.last_cmd = cmd = \'\'\' %s \'\'\' % s
-	return tsk.exec_command(cmd, cwd=wd, env=env.env or None)
+	return tsk.exec_command(cmd, cwd=cwdx, env=env.env or None)
 '''
 COMPILE_TEMPLATE_NOSHELL='''
 def f(tsk):
 	env = tsk.env
 	gen = tsk.generator
 	bld = gen.bld
-	cwdx = getattr(bld, 'cwdx', bld.bldnode) # TODO single cwd value in waf 1.9
-	wd = getattr(tsk, 'cwd', None)
+	cwdx = tsk.get_cwd()
 	def to_list(xx):
 		if isinstance(xx, str): return [xx]
 		return xx
-	tsk.last_cmd = lst = []
+	def merge(lst1, lst2):
+		if lst1 and lst2:
+			return lst1[:-1] + [lst1[-1] + lst2[0]] + lst2[1:]
+		return lst1 + lst2
+	lst = []
+	%s
+	if '' in lst:
+		lst = [x for x in lst if x]
+	tsk.last_cmd = lst
+	return tsk.exec_command(lst, cwd=cwdx, env=env.env or None)
+'''
+COMPILE_TEMPLATE_SIG_VARS='''
+def f(tsk):
+	sig = tsk.generator.bld.hash_env_vars(tsk.env, tsk.vars)
+	tsk.m.update(sig)
+	env = tsk.env
+	gen = tsk.generator
+	bld = gen.bld
+	cwdx = tsk.get_cwd()
+	p = env.get_flat
+	buf = []
 	%s
-	lst = [x for x in lst if x]
-	return tsk.exec_command(lst, cwd=wd, env=env.env or None)
+	tsk.m.update(repr(buf).encode())
 '''
 classes={}
 class store_task_type(type):
 	def __init__(cls,name,bases,dict):
 		super(store_task_type,cls).__init__(name,bases,dict)
 		name=cls.__name__
-		if name.endswith('_task'):
-			name=name.replace('_task','')
-		if name!='evil'and name!='TaskBase':
-			global classes
+		if name!='evil'and name!='Task':
 			if getattr(cls,'run_str',None):
 				(f,dvars)=compile_fun(cls.run_str,cls.shell)
 				cls.hcode=Utils.h_cmd(cls.run_str)
@@ -56,86 +75,127 @@
 				cls.run=f
 				cls.vars=list(set(cls.vars+dvars))
 				cls.vars.sort()
+				if cls.vars:
+					fun=compile_sig_vars(cls.vars)
+					if fun:
+						cls.sig_vars=fun
 			elif getattr(cls,'run',None)and not'hcode'in cls.__dict__:
 				cls.hcode=Utils.h_cmd(cls.run)
 			getattr(cls,'register',classes)[name]=cls
 evil=store_task_type('evil',(object,),{})
-class TaskBase(evil):
+class Task(evil):
+	vars=[]
+	always_run=False
+	shell=False
 	color='GREEN'
 	ext_in=[]
 	ext_out=[]
 	before=[]
 	after=[]
-	hcode=''
+	hcode=Utils.SIG_NIL
+	keep_last_cmd=False
+	weight=0
+	tree_weight=0
+	prio_order=0
+	__slots__=('hasrun','generator','env','inputs','outputs','dep_nodes','run_after')
 	def __init__(self,*k,**kw):
 		self.hasrun=NOT_RUN
 		try:
 			self.generator=kw['generator']
 		except KeyError:
 			self.generator=self
-	def __repr__(self):
-		return'\n\t{task %r: %s %s}'%(self.__class__.__name__,id(self),str(getattr(self,'fun','')))
-	def __str__(self):
-		if hasattr(self,'fun'):
-			return self.fun.__name__
-		return self.__class__.__name__
-	def __hash__(self):
-		return id(self)
-	def keyword(self):
-		if hasattr(self,'fun'):
-			return'Function'
-		return'Processing'
-	def exec_command(self,cmd,**kw):
+		self.env=kw['env']
+		self.inputs=[]
+		self.outputs=[]
+		self.dep_nodes=[]
+		self.run_after=set()
+	def __lt__(self,other):
+		return self.priority()>other.priority()
+	def __le__(self,other):
+		return self.priority()>=other.priority()
+	def __gt__(self,other):
+		return self.priority()<other.priority()
+	def __ge__(self,other):
+		return self.priority()<=other.priority()
+	def get_cwd(self):
 		bld=self.generator.bld
-		try:
-			if not kw.get('cwd',None):
-				kw['cwd']=bld.cwd
-		except AttributeError:
-			bld.cwd=kw['cwd']=bld.variant_dir
-		return bld.exec_command(cmd,**kw)
-	def runnable_status(self):
-		return RUN_ME
+		ret=getattr(self,'cwd',None)or getattr(bld,'cwd',bld.bldnode)
+		if isinstance(ret,str):
+			if os.path.isabs(ret):
+				ret=bld.root.make_node(ret)
+			else:
+				ret=self.generator.path.make_node(ret)
+		return ret
+	def quote_flag(self,x):
+		old=x
+		if'\\'in x:
+			x=x.replace('\\','\\\\')
+		if'"'in x:
+			x=x.replace('"','\\"')
+		if old!=x or' 'in x or'\t'in x or"'"in x:
+			x='"%s"'%x
+		return x
+	def priority(self):
+		return(self.weight+self.prio_order,-getattr(self.generator,'tg_idx_count',0))
+	def split_argfile(self,cmd):
+		return([cmd[0]],[self.quote_flag(x)for x in cmd[1:]])
+	def exec_command(self,cmd,**kw):
+		if not'cwd'in kw:
+			kw['cwd']=self.get_cwd()
+		if hasattr(self,'timeout'):
+			kw['timeout']=self.timeout
+		if self.env.PATH:
+			env=kw['env']=dict(kw.get('env')or self.env.env or os.environ)
+			env['PATH']=self.env.PATH if isinstance(self.env.PATH,str)else os.pathsep.join(self.env.PATH)
+		if hasattr(self,'stdout'):
+			kw['stdout']=self.stdout
+		if hasattr(self,'stderr'):
+			kw['stderr']=self.stderr
+		if not isinstance(cmd,str)and(len(repr(cmd))>=8192 if Utils.is_win32 else len(cmd)>200000):
+			cmd,args=self.split_argfile(cmd)
+			try:
+				(fd,tmp)=tempfile.mkstemp()
+				os.write(fd,'\r\n'.join(args).encode())
+				os.close(fd)
+				if Logs.verbose:
+					Logs.debug('argfile: @%r -> %r',tmp,args)
+				return self.generator.bld.exec_command(cmd+['@'+tmp],**kw)
+			finally:
+				try:
+					os.remove(tmp)
+				except OSError:
+					pass
+		else:
+			return self.generator.bld.exec_command(cmd,**kw)
 	def process(self):
-		m=self.master
-		if m.stop:
-			m.out.put(self)
-			return
 		try:
 			del self.generator.bld.task_sigs[self.uid()]
 		except KeyError:
 			pass
 		try:
-			self.generator.bld.returned_tasks.append(self)
-			self.log_display(self.generator.bld)
 			ret=self.run()
 		except Exception:
-			self.err_msg=Utils.ex_stack()
+			self.err_msg=traceback.format_exc()
 			self.hasrun=EXCEPTION
-			m.error_handler(self)
-			m.out.put(self)
-			return
-		if ret:
-			self.err_code=ret
-			self.hasrun=CRASHED
 		else:
+			if ret:
+				self.err_code=ret
+				self.hasrun=CRASHED
+			else:
+				try:
+					self.post_run()
+				except Errors.WafError:
+					pass
+				except Exception:
+					self.err_msg=traceback.format_exc()
+					self.hasrun=EXCEPTION
+				else:
+					self.hasrun=SUCCESS
+		if self.hasrun!=SUCCESS and self.scan:
 			try:
-				self.post_run()
-			except Errors.WafError:
+				del self.generator.bld.imp_sigs[self.uid()]
+			except KeyError:
 				pass
-			except Exception:
-				self.err_msg=Utils.ex_stack()
-				self.hasrun=EXCEPTION
-			else:
-				self.hasrun=SUCCESS
-		if self.hasrun!=SUCCESS:
-			m.error_handler(self)
-		m.out.put(self)
-	def run(self):
-		if hasattr(self,'fun'):
-			return self.fun(self)
-		return 0
-	def post_run(self):
-		pass
 	def log_display(self,bld):
 		if self.generator.bld.progress_bar==3:
 			return
@@ -154,12 +214,9 @@ class TaskBase(evil):
 	def display(self):
 		col1=Logs.colors(self.color)
 		col2=Logs.colors.NORMAL
-		master=self.master
+		master=self.generator.bld.producer
 		def cur():
-			tmp=-1
-			if hasattr(master,'ready'):
-				tmp-=master.ready.qsize()
-			return master.processed+tmp
+			return master.processed-master.ready.qsize()
 		if self.generator.bld.progress_bar==1:
 			return self.generator.bld.progress_line(cur(),master.total,col1,col2)
 		if self.generator.bld.progress_bar==2:
@@ -183,17 +240,13 @@ class TaskBase(evil):
 		if kw:
 			kw+=' '
 		return fs%(cur(),total,kw,col1,s,col2)
-	def attr(self,att,default=None):
-		ret=getattr(self,att,self)
-		if ret is self:return getattr(self.__class__,att,default)
-		return ret
 	def hash_constraints(self):
-		cls=self.__class__
-		tup=(str(cls.before),str(cls.after),str(cls.ext_in),str(cls.ext_out),cls.__name__,cls.hcode)
-		h=hash(tup)
-		return h
+		return(tuple(self.before),tuple(self.after),tuple(self.ext_in),tuple(self.ext_out),self.__class__.__name__,self.hcode)
 	def format_error(self):
-		msg=getattr(self,'last_cmd','')
+		if Logs.verbose:
+			msg=': %r\n%r'%(self,getattr(self,'last_cmd',''))
+		else:
+			msg=' (run with -v to display more information)'
 		name=getattr(self.generator,'name','')
 		if getattr(self,"err_msg",None):
 			return self.err_msg
@@ -201,11 +254,13 @@ class TaskBase(evil):
 			return'task in %r was not executed for some reason: %r'%(name,self)
 		elif self.hasrun==CRASHED:
 			try:
-				return' -> task in %r failed (exit status %r): %r\n%r'%(name,self.err_code,self,msg)
+				return' -> task in %r failed with exit status %r%s'%(name,self.err_code,msg)
 			except AttributeError:
-				return' -> task in %r failed: %r\n%r'%(name,self,msg)
+				return' -> task in %r failed%s'%(name,msg)
 		elif self.hasrun==MISSING:
-			return' -> missing files in %r: %r\n%r'%(name,self,msg)
+			return' -> missing files in %r%s'%(name,msg)
+		elif self.hasrun==CANCELED:
+			return' -> %r canceled because of missing dependencies'%name
 		else:
 			return'invalid status for task in %r: %r'%(name,self.hasrun)
 	def colon(self,var1,var2):
@@ -224,20 +279,10 @@ class TaskBase(evil):
 			lst.extend(tmp)
 		lst.append(y)
 		return lst
-class Task(TaskBase):
-	vars=[]
-	shell=False
-	def __init__(self,*k,**kw):
-		TaskBase.__init__(self,*k,**kw)
-		self.env=kw['env']
-		self.inputs=[]
-		self.outputs=[]
-		self.dep_nodes=[]
-		self.run_after=set([])
 	def __str__(self):
 		name=self.__class__.__name__
 		if self.outputs:
-			if(name.endswith('lib')or name.endswith('program'))or not self.inputs:
+			if name.endswith(('lib','program'))or not self.inputs:
 				node=self.outputs[0]
 				return node.path_from(node.ctx.launch_node())
 		if not(self.inputs or self.outputs):
@@ -247,12 +292,14 @@ class Task(TaskBase):
 			return node.path_from(node.ctx.launch_node())
 		src_str=' '.join([a.path_from(a.ctx.launch_node())for a in self.inputs])
 		tgt_str=' '.join([a.path_from(a.ctx.launch_node())for a in self.outputs])
-		if self.outputs:sep=' -> '
-		else:sep=''
-		return'%s: %s%s%s'%(self.__class__.__name__.replace('_task',''),src_str,sep,tgt_str)
+		if self.outputs:
+			sep=' -> '
+		else:
+			sep=''
+		return'%s: %s%s%s'%(self.__class__.__name__,src_str,sep,tgt_str)
 	def keyword(self):
 		name=self.__class__.__name__
-		if name.endswith('lib')or name.endswith('program'):
+		if name.endswith(('lib','program')):
 			return'Linking'
 		if
len(self.inputs)==1 and len(self.outputs)==1: return'Compiling' @@ -274,27 +321,31 @@ class Task(TaskBase): try: return self.uid_ except AttributeError: - m=Utils.md5() + m=Utils.md5(self.__class__.__name__) up=m.update - up(self.__class__.__name__) for x in self.inputs+self.outputs: up(x.abspath()) self.uid_=m.digest() return self.uid_ def set_inputs(self,inp): - if isinstance(inp,list):self.inputs+=inp - else:self.inputs.append(inp) + if isinstance(inp,list): + self.inputs+=inp + else: + self.inputs.append(inp) def set_outputs(self,out): - if isinstance(out,list):self.outputs+=out - else:self.outputs.append(out) + if isinstance(out,list): + self.outputs+=out + else: + self.outputs.append(out) def set_run_after(self,task): - assert isinstance(task,TaskBase) + assert isinstance(task,Task) self.run_after.add(task) def signature(self): - try:return self.cache_sig - except AttributeError:pass - self.m=Utils.md5() - self.m.update(self.hcode) + try: + return self.cache_sig + except AttributeError: + pass + self.m=Utils.md5(self.hcode) self.sig_explicit_deps() self.sig_vars() if self.scan: @@ -305,10 +356,14 @@ class Task(TaskBase): ret=self.cache_sig=self.m.digest() return ret def runnable_status(self): + bld=self.generator.bld + if bld.is_install<0: + return SKIP_ME for t in self.run_after: if not t.hasrun: return ASK_LATER - bld=self.generator.bld + elif t.hasrun<SKIPPED: + return CANCEL_ME try: new_sig=self.signature() except Errors.TaskNotReady: @@ -317,70 +372,68 @@ class Task(TaskBase): try: prev_sig=bld.task_sigs[key] except KeyError: - Logs.debug("task: task %r must run as it was never run before or the task code changed"%self) + Logs.debug('task: task %r must run: it was never run before or the task code changed',self) return RUN_ME - for node in self.outputs: - try: - if node.sig!=new_sig: - return RUN_ME - except AttributeError: - Logs.debug("task: task %r must run as the output nodes do not exist"%self) - return RUN_ME if new_sig!=prev_sig: + 
Logs.debug('task: task %r must run: the task signature changed',self) return RUN_ME - return SKIP_ME + for node in self.outputs: + sig=bld.node_sigs.get(node) + if not sig: + Logs.debug('task: task %r must run: an output node has no signature',self) + return RUN_ME + if sig!=key: + Logs.debug('task: task %r must run: an output node was produced by another task',self) + return RUN_ME + if not node.exists(): + Logs.debug('task: task %r must run: an output node does not exist',self) + return RUN_ME + return(self.always_run and RUN_ME)or SKIP_ME def post_run(self): bld=self.generator.bld - sig=self.signature() for node in self.outputs: - try: - os.stat(node.abspath()) - except OSError: + if not node.exists(): self.hasrun=MISSING self.err_msg='-> missing file: %r'%node.abspath() raise Errors.WafError(self.err_msg) - node.sig=node.cache_sig=sig - bld.task_sigs[self.uid()]=self.cache_sig + bld.node_sigs[node]=self.uid() + bld.task_sigs[self.uid()]=self.signature() + if not self.keep_last_cmd: + try: + del self.last_cmd + except AttributeError: + pass def sig_explicit_deps(self): bld=self.generator.bld upd=self.m.update for x in self.inputs+self.dep_nodes: - try: - upd(x.get_bld_sig()) - except(AttributeError,TypeError): - raise Errors.WafError('Missing node signature for %r (required by %r)'%(x,self)) + upd(x.get_bld_sig()) if bld.deps_man: additional_deps=bld.deps_man for x in self.inputs+self.outputs: try: - d=additional_deps[id(x)] + d=additional_deps[x] except KeyError: continue for v in d: - if isinstance(v,bld.root.__class__): - try: - v=v.get_bld_sig() - except AttributeError: - raise Errors.WafError('Missing node signature for %r (required by %r)'%(v,self)) - elif hasattr(v,'__call__'): - v=v() + try: + v=v.get_bld_sig() + except AttributeError: + if hasattr(v,'__call__'): + v=v() upd(v) - return self.m.digest() - def sig_vars(self): + def sig_deep_inputs(self): bld=self.generator.bld - env=self.env - upd=self.m.update - 
act_sig=bld.hash_env_vars(env,self.__class__.vars) - upd(act_sig) - dep_vars=getattr(self,'dep_vars',None) - if dep_vars: - upd(bld.hash_env_vars(env,dep_vars)) - return self.m.digest() + lst=[bld.task_sigs[bld.node_sigs[node]]for node in(self.inputs+self.dep_nodes)if node.is_bld()] + self.m.update(Utils.h_list(lst)) + def sig_vars(self): + sig=self.generator.bld.hash_env_vars(self.env,self.vars) + self.m.update(sig) scan=None def sig_implicit_deps(self): bld=self.generator.bld key=self.uid() - prev=bld.task_sigs.get((key,'imp'),[]) + prev=bld.imp_sigs.get(key,[]) if prev: try: if prev==self.compute_sig_implicit_deps(): @@ -389,38 +442,27 @@ class Task(TaskBase): raise except EnvironmentError: for x in bld.node_deps.get(self.uid(),[]): - if not x.is_bld(): + if not x.is_bld()and not x.exists(): try: - os.stat(x.abspath()) - except OSError: - try: - del x.parent.children[x.name] - except KeyError: - pass - del bld.task_sigs[(key,'imp')] + del x.parent.children[x.name] + except KeyError: + pass + del bld.imp_sigs[key] raise Errors.TaskRescan('rescan') - (nodes,names)=self.scan() + (bld.node_deps[key],bld.raw_deps[key])=self.scan() if Logs.verbose: - Logs.debug('deps: scanner for %s returned %s %s'%(str(self),str(nodes),str(names))) - bld.node_deps[key]=nodes - bld.raw_deps[key]=names - self.are_implicit_nodes_ready() + Logs.debug('deps: scanner for %s: %r; unresolved: %r',self,bld.node_deps[key],bld.raw_deps[key]) try: - bld.task_sigs[(key,'imp')]=sig=self.compute_sig_implicit_deps() - except Exception: - if Logs.verbose: - for k in bld.node_deps.get(self.uid(),[]): - try: - k.get_bld_sig() - except Exception: - Logs.warn('Missing signature for node %r (may cause rebuilds)'%k) - else: - return sig + bld.imp_sigs[key]=self.compute_sig_implicit_deps() + except EnvironmentError: + for k in bld.node_deps.get(self.uid(),[]): + if not k.exists(): + Logs.warn('Dependency %r for %r is missing: check the task declaration and the build order!',k,self) + raise def 
compute_sig_implicit_deps(self): upd=self.m.update - bld=self.generator.bld self.are_implicit_nodes_ready() - for k in bld.node_deps.get(self.uid(),[]): + for k in self.generator.bld.node_deps.get(self.uid(),[]): upd(k.get_bld_sig()) return self.m.digest() def are_implicit_nodes_ready(self): @@ -430,9 +472,9 @@ class Task(TaskBase): except AttributeError: bld.dct_implicit_nodes=cache={} try: - dct=cache[bld.cur] + dct=cache[bld.current_group] except KeyError: - dct=cache[bld.cur]={} + dct=cache[bld.current_group]={} for tsk in bld.cur_tasks: for x in tsk.outputs: dct[x]=tsk @@ -450,11 +492,10 @@ if sys.hexversion>0x3000000: try: return self.uid_ except AttributeError: - m=Utils.md5() + m=Utils.md5(self.__class__.__name__.encode('latin-1','xmlcharrefreplace')) up=m.update - up(self.__class__.__name__.encode('iso8859-1','xmlcharrefreplace')) for x in self.inputs+self.outputs: - up(x.abspath().encode('iso8859-1','xmlcharrefreplace')) + up(x.abspath().encode('latin-1','xmlcharrefreplace')) self.uid_=m.digest() return self.uid_ uid.__doc__=Task.uid.__doc__ @@ -473,14 +514,27 @@ def set_file_constraints(tasks): ins=Utils.defaultdict(set) outs=Utils.defaultdict(set) for x in tasks: - for a in getattr(x,'inputs',[])+getattr(x,'dep_nodes',[]): - ins[id(a)].add(x) - for a in getattr(x,'outputs',[]): - outs[id(a)].add(x) + for a in x.inputs: + ins[a].add(x) + for a in x.dep_nodes: + ins[a].add(x) + for a in x.outputs: + outs[a].add(x) links=set(ins.keys()).intersection(outs.keys()) for k in links: for a in ins[k]: a.run_after.update(outs[k]) +class TaskGroup(object): + def __init__(self,prev,next): + self.prev=prev + self.next=next + self.done=False + def get_hasrun(self): + for k in self.prev: + if not k.hasrun: + return NOT_RUN + return SUCCESS + hasrun=property(get_hasrun,None) def set_precedence_constraints(tasks): cstr_groups=Utils.defaultdict(list) for x in tasks: @@ -500,38 +554,64 @@ def set_precedence_constraints(tasks): b=i else: continue - 
aval=set(cstr_groups[keys[a]]) - for x in cstr_groups[keys[b]]: - x.run_after.update(aval) + a=cstr_groups[keys[a]] + b=cstr_groups[keys[b]] + if len(a)<2 or len(b)<2: + for x in b: + x.run_after.update(a) + else: + group=TaskGroup(set(a),set(b)) + for x in b: + x.run_after.add(group) def funex(c): dc={} exec(c,dc) return dc['f'] -re_novar=re.compile(r"^(SRC|TGT)\W+.*?$") -reg_act=re.compile(r"(?P<backslash>\\)|(?P<dollar>\$\$)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})",re.M) +re_cond=re.compile('(?P<var>\w+)|(?P<or>\|)|(?P<and>&)') +re_novar=re.compile(r'^(SRC|TGT)\W+.*?$') +reg_act=re.compile(r'(?P<backslash>\\)|(?P<dollar>\$\$)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})',re.M) def compile_fun_shell(line): extr=[] def repl(match): g=match.group - if g('dollar'):return"$" - elif g('backslash'):return'\\\\' - elif g('subst'):extr.append((g('var'),g('code')));return"%s" + if g('dollar'): + return"$" + elif g('backslash'): + return'\\\\' + elif g('subst'): + extr.append((g('var'),g('code'))) + return"%s" return None line=reg_act.sub(repl,line)or line - parm=[] dvars=[] + def add_dvar(x): + if x not in dvars: + dvars.append(x) + def replc(m): + if m.group('and'): + return' and ' + elif m.group('or'): + return' or ' + else: + x=m.group('var') + add_dvar(x) + return'env[%r]'%x + parm=[] app=parm.append for(var,meth)in extr: if var=='SRC': - if meth:app('tsk.inputs%s'%meth) - else:app('" ".join([a.path_from(cwdx) for a in tsk.inputs])') + if meth: + app('tsk.inputs%s'%meth) + else: + app('" ".join([a.path_from(cwdx) for a in tsk.inputs])') elif var=='TGT': - if meth:app('tsk.outputs%s'%meth) - else:app('" ".join([a.path_from(cwdx) for a in tsk.outputs])') + if meth: + app('tsk.outputs%s'%meth) + else: + app('" ".join([a.path_from(cwdx) for a in tsk.outputs])') elif meth: if meth.startswith(':'): - if var not in dvars: - dvars.append(var) + add_dvar(var) m=meth[1:] if m=='SRC': m='[a.path_from(cwdx) for a in tsk.inputs]' @@ -541,74 +621,100 @@ def 
compile_fun_shell(line): m='[tsk.inputs%s]'%m[3:] elif re_novar.match(m): m='[tsk.outputs%s]'%m[3:] - elif m[:3]not in('tsk','gen','bld'): - dvars.append(meth[1:]) - m='%r'%m + else: + add_dvar(m) + if m[:3]not in('tsk','gen','bld'): + m='%r'%m app('" ".join(tsk.colon(%r, %s))'%(var,m)) + elif meth.startswith('?'): + expr=re_cond.sub(replc,meth[1:]) + app('p(%r) if (%s) else ""'%(var,expr)) else: - app('%s%s'%(var,meth)) + call='%s%s'%(var,meth) + add_dvar(call) + app(call) else: - if var not in dvars: - dvars.append(var) + add_dvar(var) app("p('%s')"%var) - if parm:parm="%% (%s) "%(',\n\t\t'.join(parm)) - else:parm='' + if parm: + parm="%% (%s) "%(',\n\t\t'.join(parm)) + else: + parm='' c=COMPILE_TEMPLATE_SHELL%(line,parm) - Logs.debug('action: %s'%c.strip().splitlines()) + Logs.debug('action: %s',c.strip().splitlines()) return(funex(c),dvars) +reg_act_noshell=re.compile(r"(?P<space>\s+)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})|(?P<text>([^$ \t\n\r\f\v]|\$\$)+)",re.M) def compile_fun_noshell(line): - extr=[] - def repl(match): - g=match.group - if g('dollar'):return"$" - elif g('backslash'):return'\\' - elif g('subst'):extr.append((g('var'),g('code')));return"<<|@|>>" - return None - line2=reg_act.sub(repl,line) - params=line2.split('<<|@|>>') - assert(extr) buf=[] dvars=[] + merge=False app=buf.append - for x in range(len(extr)): - params[x]=params[x].strip() - if params[x]: - app("lst.extend(%r)"%params[x].split()) - (var,meth)=extr[x] - if var=='SRC': - if meth:app('lst.append(tsk.inputs%s)'%meth) - else:app("lst.extend([a.path_from(cwdx) for a in tsk.inputs])") - elif var=='TGT': - if meth:app('lst.append(tsk.outputs%s)'%meth) - else:app("lst.extend([a.path_from(cwdx) for a in tsk.outputs])") - elif meth: - if meth.startswith(':'): - if not var in dvars: - dvars.append(var) - m=meth[1:] - if m=='SRC': - m='[a.path_from(cwdx) for a in tsk.inputs]' - elif m=='TGT': - m='[a.path_from(cwdx) for a in tsk.outputs]' - elif re_novar.match(m): - 
m='[tsk.inputs%s]'%m[3:] - elif re_novar.match(m): - m='[tsk.outputs%s]'%m[3:] - elif m[:3]not in('tsk','gen','bld'): - dvars.append(m) - m='%r'%m - app('lst.extend(tsk.colon(%r, %s))'%(var,m)) - else: - app('lst.extend(gen.to_list(%s%s))'%(var,meth)) + def add_dvar(x): + if x not in dvars: + dvars.append(x) + def replc(m): + if m.group('and'): + return' and ' + elif m.group('or'): + return' or ' else: - app('lst.extend(to_list(env[%r]))'%var) - if not var in dvars: - dvars.append(var) - if extr: - if params[-1]: - app("lst.extend(%r)"%params[-1].split()) + x=m.group('var') + add_dvar(x) + return'env[%r]'%x + for m in reg_act_noshell.finditer(line): + if m.group('space'): + merge=False + continue + elif m.group('text'): + app('[%r]'%m.group('text').replace('$$','$')) + elif m.group('subst'): + var=m.group('var') + code=m.group('code') + if var=='SRC': + if code: + app('[tsk.inputs%s]'%code) + else: + app('[a.path_from(cwdx) for a in tsk.inputs]') + elif var=='TGT': + if code: + app('[tsk.outputs%s]'%code) + else: + app('[a.path_from(cwdx) for a in tsk.outputs]') + elif code: + if code.startswith(':'): + add_dvar(var) + m=code[1:] + if m=='SRC': + m='[a.path_from(cwdx) for a in tsk.inputs]' + elif m=='TGT': + m='[a.path_from(cwdx) for a in tsk.outputs]' + elif re_novar.match(m): + m='[tsk.inputs%s]'%m[3:] + elif re_novar.match(m): + m='[tsk.outputs%s]'%m[3:] + else: + add_dvar(m) + if m[:3]not in('tsk','gen','bld'): + m='%r'%m + app('tsk.colon(%r, %s)'%(var,m)) + elif code.startswith('?'): + expr=re_cond.sub(replc,code[1:]) + app('to_list(env[%r] if (%s) else [])'%(var,expr)) + else: + call='%s%s'%(var,code) + add_dvar(call) + app('to_list(%s)'%call) + else: + app('to_list(env[%r])'%var) + add_dvar(var) + if merge: + tmp='merge(%s, %s)'%(buf[-2],buf[-1]) + del buf[-1] + buf[-1]=tmp + merge=True + buf=['lst.extend(%s)'%x for x in buf] fun=COMPILE_TEMPLATE_NOSHELL%"\n\t".join(buf) - Logs.debug('action: %s'%fun.strip().splitlines()) + Logs.debug('action: 
%s',fun.strip().splitlines()) return(funex(fun),dvars) def compile_fun(line,shell=False): if isinstance(line,str): @@ -630,63 +736,53 @@ def compile_fun(line,shell=False): if ret: return ret return None - return composed_fun,dvars + return composed_fun,dvars_lst if shell: return compile_fun_shell(line) else: return compile_fun_noshell(line) +def compile_sig_vars(vars): + buf=[] + for x in sorted(vars): + if x[:3]in('tsk','gen','bld'): + buf.append('buf.append(%s)'%x) + if buf: + return funex(COMPILE_TEMPLATE_SIG_VARS%'\n\t'.join(buf)) + return None def task_factory(name,func=None,vars=None,color='GREEN',ext_in=[],ext_out=[],before=[],after=[],shell=False,scan=None): - params={'vars':vars or[],'color':color,'name':name,'ext_in':Utils.to_list(ext_in),'ext_out':Utils.to_list(ext_out),'before':Utils.to_list(before),'after':Utils.to_list(after),'shell':shell,'scan':scan,} + params={'vars':vars or[],'color':color,'name':name,'shell':shell,'scan':scan,} if isinstance(func,str)or isinstance(func,tuple): params['run_str']=func else: params['run']=func cls=type(Task)(name,(Task,),params) - global classes classes[name]=cls + if ext_in: + cls.ext_in=Utils.to_list(ext_in) + if ext_out: + cls.ext_out=Utils.to_list(ext_out) + if before: + cls.before=Utils.to_list(before) + if after: + cls.after=Utils.to_list(after) return cls -def always_run(cls): - old=cls.runnable_status - def always(self): - ret=old(self) - if ret==SKIP_ME: - ret=RUN_ME - return ret - cls.runnable_status=always - return cls -def update_outputs(cls): - old_post_run=cls.post_run - def post_run(self): - old_post_run(self) - for node in self.outputs: - node.sig=node.cache_sig=Utils.h_file(node.abspath()) - self.generator.bld.task_sigs[node.abspath()]=self.uid() - cls.post_run=post_run - old_runnable_status=cls.runnable_status - def runnable_status(self): - status=old_runnable_status(self) - if status!=RUN_ME: - return status - try: - bld=self.generator.bld - prev_sig=bld.task_sigs[self.uid()] - if 
prev_sig==self.signature(): - for x in self.outputs: - if not x.is_child_of(bld.bldnode): - x.sig=Utils.h_file(x.abspath()) - if not x.sig or bld.task_sigs[x.abspath()]!=self.uid(): - return RUN_ME - return SKIP_ME - except OSError: - pass - except IOError: - pass - except KeyError: - pass - except IndexError: - pass - except AttributeError: - pass - return RUN_ME - cls.runnable_status=runnable_status +def deep_inputs(cls): + def sig_explicit_deps(self): + Task.sig_explicit_deps(self) + Task.sig_deep_inputs(self) + cls.sig_explicit_deps=sig_explicit_deps return cls +TaskBase=Task +class TaskSemaphore(object): + def __init__(self,num): + self.num=num + self.locking=set() + self.waiting=set() + def is_locked(self): + return len(self.locking)>=self.num + def acquire(self,tsk): + if self.is_locked(): + raise IndexError('Cannot lock more %r'%self.locking) + self.locking.add(tsk) + def release(self,tsk): + self.locking.remove(tsk) diff --git a/waflib/TaskGen.py b/waflib/TaskGen.py index 046f1f9..ce4f43a 100644 --- a/waflib/TaskGen.py +++ b/waflib/TaskGen.py @@ -2,19 +2,17 @@ # encoding: utf-8 # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file -import copy,re,os +import copy,re,os,functools from waflib import Task,Utils,Logs,Errors,ConfigSet,Node feats=Utils.defaultdict(set) HEADER_EXTS=['.h','.hpp','.hxx','.hh'] class task_gen(object): mappings=Utils.ordered_iter_dict() - prec=Utils.defaultdict(list) + prec=Utils.defaultdict(set) def __init__(self,*k,**kw): - self.source='' + self.source=[] self.target='' self.meths=[] - self.prec=Utils.defaultdict(list) - self.mappings={} self.features=[] self.tasks=[] if not'bld'in kw: @@ -24,22 +22,29 @@ class task_gen(object): else: self.bld=kw['bld'] self.env=self.bld.env.derive() - self.path=self.bld.path + self.path=kw.get('path',self.bld.path) + path=self.path.abspath() try: - self.idx=self.bld.idx[id(self.path)]=self.bld.idx.get(id(self.path),0)+1 + self.idx=self.bld.idx[path]=self.bld.idx.get(path,0)+1 except AttributeError: self.bld.idx={} - self.idx=self.bld.idx[id(self.path)]=1 + self.idx=self.bld.idx[path]=1 + try: + self.tg_idx_count=self.bld.tg_idx_count=self.bld.tg_idx_count+1 + except AttributeError: + self.tg_idx_count=self.bld.tg_idx_count=1 for key,val in kw.items(): setattr(self,key,val) def __str__(self): return"<task_gen %r declared in %s>"%(self.name,self.path.abspath()) def __repr__(self): lst=[] - for x in self.__dict__.keys(): + for x in self.__dict__: if x not in('env','bld','compiled_tasks','tasks'): lst.append("%s=%s"%(x,repr(getattr(self,x)))) return"bld(%s) in %s"%(", ".join(lst),self.path.abspath()) + def get_cwd(self): + return self.bld.bldnode def get_name(self): try: return self._name @@ -54,36 +59,41 @@ class task_gen(object): self._name=name name=property(get_name,set_name) def to_list(self,val): - if isinstance(val,str):return val.split() - else:return val + if isinstance(val,str): + return val.split() + else: + return val def post(self): if getattr(self,'posted',None): return False self.posted=True keys=set(self.meths) + keys.update(feats['*']) 
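The `task_gen` hunk above replaces the per-instance precedence lists with a class-level `prec=Utils.defaultdict(set)`, which `post()` then uses to derive an execution order for the feature methods. A minimal sketch of that idea follows; the function name and the exact reading of the table (method name mapped to the set of methods that must run before it) are assumptions for illustration, not waf's literal implementation:

```python
def order_methods(meths, prec):
    """Return meths ordered so that each method comes after all of its
    prerequisites; prec maps a method name to the set of method names
    assumed to run before it."""
    out = []
    remaining = set(meths)
    while remaining:
        # methods whose prerequisites have all been scheduled already
        ready = sorted(m for m in remaining
                       if not (prec.get(m, set()) & remaining))
        if not ready:
            raise ValueError('cycle detected in the method execution: %r' % prec)
        out.append(ready[0])
        remaining.remove(ready[0])
    return out
```

For example, `order_methods(['a', 'b', 'c'], {'c': {'a', 'b'}, 'b': {'a'}})` returns `['a', 'b', 'c']`, and a mutual constraint between two methods raises the cycle error, mirroring the "Cycle detected" message in the hunk above.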
self.features=Utils.to_list(self.features) - for x in self.features+['*']: + for x in self.features: st=feats[x] - if not st: - if not x in Task.classes: - Logs.warn('feature %r does not exist - bind at least one method to it'%x) - keys.update(list(st)) + if st: + keys.update(st) + elif not x in Task.classes: + Logs.warn('feature %r does not exist - bind at least one method to it?',x) prec={} - prec_tbl=self.prec or task_gen.prec + prec_tbl=self.prec for x in prec_tbl: if x in keys: prec[x]=prec_tbl[x] tmp=[] for a in keys: for x in prec.values(): - if a in x:break + if a in x: + break else: tmp.append(a) - tmp.sort() + tmp.sort(reverse=True) out=[] while tmp: e=tmp.pop() - if e in keys:out.append(e) + if e in keys: + out.append(e) try: nlst=prec[e] except KeyError: @@ -96,31 +106,34 @@ class task_gen(object): break else: tmp.append(x) + tmp.sort(reverse=True) if prec: - txt='\n'.join(['- %s after %s'%(k,repr(v))for k,v in prec.items()]) - raise Errors.WafError('Cycle detected in the method execution\n%s'%txt) - out.reverse() + buf=['Cycle detected in the method execution:'] + for k,v in prec.items(): + buf.append('- %s after %s'%(k,[x for x in v if x in prec])) + raise Errors.WafError('\n'.join(buf)) self.meths=out - Logs.debug('task_gen: posting %s %d'%(self,id(self))) + Logs.debug('task_gen: posting %s %d',self,id(self)) for x in out: try: v=getattr(self,x) except AttributeError: raise Errors.WafError('%r is not a valid task generator method'%x) - Logs.debug('task_gen: -> %s (%d)'%(x,id(self))) + Logs.debug('task_gen: -> %s (%d)',x,id(self)) v() - Logs.debug('task_gen: posted %s'%self.name) + Logs.debug('task_gen: posted %s',self.name) return True def get_hook(self,node): name=node.name - if self.mappings: - for k in self.mappings: + for k in self.mappings: + try: if name.endswith(k): return self.mappings[k] - for k in task_gen.mappings: - if name.endswith(k): - return task_gen.mappings[k] - raise Errors.WafError("File %r has no mapping in %r (have you forgotten 
to load a waf tool?)"%(node,task_gen.mappings.keys())) + except TypeError: + if k.match(name): + return self.mappings[k] + keys=list(self.mappings.keys()) + raise Errors.WafError("File %r has no mapping in %r (load a waf tool?)"%(node,keys)) def create_task(self,name,src=None,tgt=None,**kw): task=Task.classes[name](env=self.env.derive(),generator=self) if src: @@ -152,12 +165,11 @@ def declare_chain(name='',rule=None,reentrant=None,color='BLUE',ext_in=[],ext_ou name=rule cls=Task.task_factory(name,rule,color=color,ext_in=ext_in,ext_out=ext_out,before=before,after=after,scan=scan,shell=shell) def x_file(self,node): - ext=decider and decider(self,node)or cls.ext_out if ext_in: _ext_in=ext_in[0] tsk=self.create_task(name,node) cnt=0 - keys=set(self.mappings.keys())|set(self.__class__.mappings.keys()) + ext=decider(self,node)if decider else cls.ext_out for x in ext: k=node.change_ext(x,ext_in=_ext_in) tsk.outputs.append(k) @@ -165,13 +177,13 @@ def declare_chain(name='',rule=None,reentrant=None,color='BLUE',ext_in=[],ext_ou if cnt<int(reentrant): self.source.append(k) else: - for y in keys: + for y in self.mappings: if k.name.endswith(y): self.source.append(k) break cnt+=1 if install_path: - self.bld.install_files(install_path,tsk.outputs) + self.install_task=self.add_install_files(install_to=install_path,install_from=tsk.outputs) return tsk for x in cls.ext_in: task_gen.mappings[x]=x_file @@ -190,8 +202,7 @@ def before_method(*k): def deco(func): setattr(task_gen,func.__name__,func) for fun_name in k: - if not func.__name__ in task_gen.prec[fun_name]: - task_gen.prec[fun_name].append(func.__name__) + task_gen.prec[func.__name__].add(fun_name) return func return deco before=before_method @@ -199,8 +210,7 @@ def after_method(*k): def deco(func): setattr(task_gen,func.__name__,func) for fun_name in k: - if not fun_name in task_gen.prec[func.__name__]: - task_gen.prec[func.__name__].append(fun_name) + task_gen.prec[fun_name].add(func.__name__) return func return deco 
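The `get_hook` change above lets `task_gen.mappings` keys be either plain extension strings or compiled regular expressions: `str.endswith` raises `TypeError` on a non-string key, and the handler falls back to `key.match(name)`. A standalone sketch of the same dispatch trick — the handler values here are illustrative placeholders, not real task methods:

```python
import re

def get_hook(name, mappings):
    """Pick a handler for a file name; mapping keys may be plain
    extension strings or compiled regular expressions."""
    for key, handler in mappings.items():
        try:
            if name.endswith(key):
                return handler
        except TypeError:
            # str.endswith() rejects non-string keys; treat the key
            # as a compiled regex instead
            if key.match(name):
                return handler
    raise KeyError('file %r has no mapping in %r' % (name, list(mappings)))

# placeholder handler names for demonstration only
mappings = {'.c': 'c_hook', re.compile(r'.*\.pc\.in$'): 'add_pcfile'}
```

With this table, `get_hook('main.c', mappings)` returns `'c_hook'` via the suffix test, while `get_hook('x.pc.in', mappings)` falls through to the regex branch.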
after=after_method @@ -221,10 +231,13 @@ def to_nodes(self,lst,path=None): for x in Utils.to_list(lst): if isinstance(x,str): node=find(x) - else: + elif hasattr(x,'name'): node=x + else: + tmp.extend(self.to_nodes(x)) + continue if not node: - raise Errors.WafError("source not found: %r in %r"%(x,self)) + raise Errors.WafError('source not found: %r in %r'%(x,self)) tmp.append(node) return tmp @feature('*') @@ -242,23 +255,47 @@ def process_rule(self): cache=self.bld.cache_rule_attr except AttributeError: cache=self.bld.cache_rule_attr={} + chmod=getattr(self,'chmod',None) + shell=getattr(self,'shell',True) + color=getattr(self,'color','BLUE') + scan=getattr(self,'scan',None) + _vars=getattr(self,'vars',[]) + cls_str=getattr(self,'cls_str',None) + cls_keyword=getattr(self,'cls_keyword',None) + use_cache=getattr(self,'cache_rule','True') + deep_inputs=getattr(self,'deep_inputs',False) + scan_val=has_deps=hasattr(self,'deps') + if scan: + scan_val=id(scan) + key=Utils.h_list((name,self.rule,chmod,shell,color,cls_str,cls_keyword,scan_val,_vars,deep_inputs)) cls=None - if getattr(self,'cache_rule','True'): + if use_cache: try: - cls=cache[(name,self.rule)] + cls=cache[key] except KeyError: pass if not cls: rule=self.rule - if hasattr(self,'chmod'): + if chmod is not None: def chmod_fun(tsk): for x in tsk.outputs: - os.chmod(x.abspath(),self.chmod) - rule=(self.rule,chmod_fun) - cls=Task.task_factory(name,rule,getattr(self,'vars',[]),shell=getattr(self,'shell',True),color=getattr(self,'color','BLUE'),scan=getattr(self,'scan',None)) - if getattr(self,'scan',None): + os.chmod(x.abspath(),tsk.generator.chmod) + if isinstance(rule,tuple): + rule=list(rule) + rule.append(chmod_fun) + rule=tuple(rule) + else: + rule=(rule,chmod_fun) + cls=Task.task_factory(name,rule,_vars,shell=shell,color=color) + if cls_str: + setattr(cls,'__str__',self.cls_str) + if cls_keyword: + setattr(cls,'keyword',self.cls_keyword) + if deep_inputs: + Task.deep_inputs(cls) + if scan: 
cls.scan=self.scan - elif getattr(self,'deps',None): + elif has_deps: def scan(self): nodes=[] for x in self.generator.to_list(getattr(self.generator,'deps',None)): @@ -268,19 +305,19 @@ def process_rule(self): nodes.append(node) return[nodes,[]] cls.scan=scan - if getattr(self,'update_outputs',None): - Task.update_outputs(cls) - if getattr(self,'always',None): - Task.always_run(cls) - for x in('after','before','ext_in','ext_out'): - setattr(cls,x,getattr(self,x,[])) - if getattr(self,'cache_rule','True'): - cache[(name,self.rule)]=cls - if getattr(self,'cls_str',None): - setattr(cls,'__str__',self.cls_str) - if getattr(self,'cls_keyword',None): - setattr(cls,'keyword',self.cls_keyword) + if use_cache: + cache[key]=cls tsk=self.create_task(name) + for x in('after','before','ext_in','ext_out'): + setattr(tsk,x,getattr(self,x,[])) + if hasattr(self,'stdout'): + tsk.stdout=self.stdout + if hasattr(self,'stderr'): + tsk.stderr=self.stderr + if getattr(self,'timeout',None): + tsk.timeout=self.timeout + if getattr(self,'always',None): + tsk.always_run=True if getattr(self,'target',None): if isinstance(self.target,str): self.target=self.target.split() @@ -293,12 +330,14 @@ def process_rule(self): x.parent.mkdir() tsk.outputs.append(x) if getattr(self,'install_path',None): - self.bld.install_files(self.install_path,tsk.outputs,chmod=getattr(self,'chmod',Utils.O644)) + self.install_task=self.add_install_files(install_to=self.install_path,install_from=tsk.outputs,chmod=getattr(self,'chmod',Utils.O644)) if getattr(self,'source',None): tsk.inputs=self.to_nodes(self.source) self.source=[] if getattr(self,'cwd',None): tsk.cwd=self.cwd + if isinstance(tsk.run,functools.partial): + tsk.run=functools.partial(tsk.run,tsk) @feature('seq') def sequence_order(self): if self.meths and self.meths[-1]!='sequence_order': @@ -322,6 +361,8 @@ class subst_pc(Task.Task): if getattr(self.generator,'is_copy',None): for i,x in enumerate(self.outputs): x.write(self.inputs[i].read('rb'),'wb') + 
stat=os.stat(self.inputs[i].abspath()) + os.utime(self.outputs[i].abspath(),(stat.st_atime,stat.st_mtime)) self.force_permissions() return None if getattr(self.generator,'fun',None): @@ -329,11 +370,11 @@ class subst_pc(Task.Task): if not ret: self.force_permissions() return ret - code=self.inputs[0].read(encoding=getattr(self.generator,'encoding','ISO8859-1')) + code=self.inputs[0].read(encoding=getattr(self.generator,'encoding','latin-1')) if getattr(self.generator,'subst_fun',None): code=self.generator.subst_fun(self,code) if code is not None: - self.outputs[0].write(code,encoding=getattr(self.generator,'encoding','ISO8859-1')) + self.outputs[0].write(code,encoding=getattr(self.generator,'encoding','latin-1')) self.force_permissions() return None code=code.replace('%','%%') @@ -344,7 +385,6 @@ class subst_pc(Task.Task): lst.append(g(1)) return"%%(%s)s"%g(1) return'' - global re_m4 code=getattr(self.generator,'re_m4',re_m4).sub(repl,code) try: d=self.generator.dct @@ -358,19 +398,21 @@ class subst_pc(Task.Task): tmp=str(tmp) d[x]=tmp code=code%d - self.outputs[0].write(code,encoding=getattr(self.generator,'encoding','ISO8859-1')) - self.generator.bld.raw_deps[self.uid()]=self.dep_vars=lst - try:delattr(self,'cache_sig') - except AttributeError:pass + self.outputs[0].write(code,encoding=getattr(self.generator,'encoding','latin-1')) + self.generator.bld.raw_deps[self.uid()]=lst + try: + delattr(self,'cache_sig') + except AttributeError: + pass self.force_permissions() def sig_vars(self): bld=self.generator.bld env=self.env upd=self.m.update if getattr(self.generator,'fun',None): - upd(Utils.h_fun(self.generator.fun)) + upd(Utils.h_fun(self.generator.fun).encode()) if getattr(self.generator,'subst_fun',None): - upd(Utils.h_fun(self.generator.subst_fun)) + upd(Utils.h_fun(self.generator.subst_fun).encode()) vars=self.generator.bld.raw_deps.get(self.uid(),[]) act_sig=bld.hash_env_vars(env,vars) upd(act_sig) @@ -380,7 +422,7 @@ class subst_pc(Task.Task): 
@extension('.pc.in') def add_pcfile(self,node): tsk=self.create_task('subst_pc',node,node.change_ext('.pc','.pc.in')) - self.bld.install_files(getattr(self,'install_path','${LIBDIR}/pkgconfig/'),tsk.outputs) + self.install_task=self.add_install_files(install_to=getattr(self,'install_path','${LIBDIR}/pkgconfig/'),install_from=tsk.outputs) class subst(subst_pc): pass @feature('subst') @@ -402,7 +444,6 @@ def process_subst(self): a=self.path.find_node(x) b=self.path.get_bld().make_node(y) if not os.path.isfile(b.abspath()): - b.sig=None b.parent.mkdir() else: if isinstance(x,str): @@ -415,20 +456,16 @@ def process_subst(self): b=y if not a: raise Errors.WafError('could not find %r for %r'%(x,self)) - has_constraints=False tsk=self.create_task('subst',a,b) for k in('after','before','ext_in','ext_out'): val=getattr(self,k,None) if val: - has_constraints=True setattr(tsk,k,val) - if not has_constraints: - global HEADER_EXTS - for xt in HEADER_EXTS: - if b.name.endswith(xt): - tsk.before=[k for k in('c','cxx')if k in Task.classes] - break + for xt in HEADER_EXTS: + if b.name.endswith(xt): + tsk.ext_in=tsk.ext_in+['.h'] + break inst_to=getattr(self,'install_path',None) if inst_to: - self.bld.install_files(inst_to,b,chmod=getattr(self,'chmod',Utils.O644)) + self.install_task=self.add_install_files(install_to=inst_to,install_from=b,chmod=getattr(self,'chmod',Utils.O644)) self.source=[] diff --git a/waflib/Tools/asm.py b/waflib/Tools/asm.py index 3f1e135..d6a6d45 100644 --- a/waflib/Tools/asm.py +++ b/waflib/Tools/asm.py @@ -3,7 +3,6 @@ # WARNING! Do not edit! 
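The `subst_pc`/`subst` tasks above escape literal `%`, rewrite `@VAR@` tokens into `%(VAR)s` placeholders, interpolate a dict, and record the variable names as dependencies in `raw_deps`. A condensed sketch of that substitution (happy path only, with a simplified `re_m4` pattern):

```python
import re

re_m4 = re.compile(r'@(\w+)@')  # simplified form of waf's re_m4

def subst(code, d):
    code = code.replace('%', '%%')   # protect literal percent signs
    deps = []
    def repl(m):
        deps.append(m.group(1))      # remember which variables were used
        return '%%(%s)s' % m.group(1)
    code = re_m4.sub(repl, code)
    return code % d, deps

out, deps = subst('prefix=@prefix@\nVersion: @VERSION@',
                  {'prefix': '/usr', 'VERSION': '0.4.9'})
```

This is exactly the shape a `.pc.in` pkg-config template needs: tokens become values, and the build can re-run the task when one of the recorded variables changes.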
https://waf.io/book/index.html#_obtaining_the_waf_file from waflib import Task -import waflib.Task from waflib.Tools.ccroot import link_task,stlink_task from waflib.TaskGen import extension class asm(Task.Task): @@ -21,4 +20,4 @@ class asmshlib(asmprogram): class asmstlib(stlink_task): pass def configure(conf): - conf.env['ASMPATH_ST']='-I%s' + conf.env.ASMPATH_ST='-I%s' diff --git a/waflib/Tools/bison.py b/waflib/Tools/bison.py index c85c8d6..146921f 100644 --- a/waflib/Tools/bison.py +++ b/waflib/Tools/bison.py @@ -10,7 +10,7 @@ class bison(Task.Task): ext_out=['.h'] @extension('.y','.yc','.yy') def big_bison(self,node): - has_h='-d'in self.env['BISONFLAGS'] + has_h='-d'in self.env.BISONFLAGS outs=[] if node.name.endswith('.yc'): outs.append(node.change_ext('.tab.cc')) @@ -21,7 +21,7 @@ def big_bison(self,node): if has_h: outs.append(node.change_ext('.tab.h')) tsk=self.create_task('bison',node,outs) - tsk.cwd=node.parent.get_bld().abspath() + tsk.cwd=node.parent.get_bld() self.source.append(outs[0]) def configure(conf): conf.find_program('bison',var='BISON') diff --git a/waflib/Tools/c.py b/waflib/Tools/c.py index e3e7665..7c794f1 100644 --- a/waflib/Tools/c.py +++ b/waflib/Tools/c.py @@ -11,7 +11,7 @@ def c_hook(self,node): return self.create_compiled_task('cxx',node) return self.create_compiled_task('c',node) class c(Task.Task): - run_str='${CC} ${ARCH_ST:ARCH} ${CFLAGS} ${CPPFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CC_SRC_F}${SRC} ${CC_TGT_F}${TGT[0].abspath()}' + run_str='${CC} ${ARCH_ST:ARCH} ${CFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CC_SRC_F}${SRC} ${CC_TGT_F}${TGT[0].abspath()} ${CPPFLAGS}' vars=['CCDEPS'] ext_in=['.h'] scan=c_preproc.scan diff --git a/waflib/Tools/c_aliases.py b/waflib/Tools/c_aliases.py index e947f0d..b1c1031 100644 --- a/waflib/Tools/c_aliases.py +++ b/waflib/Tools/c_aliases.py @@ -7,16 +7,13 @@ from waflib.Configure import conf def 
get_extensions(lst): ret=[] for x in Utils.to_list(lst): - try: - if not isinstance(x,str): - x=x.name - ret.append(x[x.rfind('.')+1:]) - except Exception: - pass + if not isinstance(x,str): + x=x.name + ret.append(x[x.rfind('.')+1:]) return ret def sniff_features(**kw): exts=get_extensions(kw['source']) - type=kw['_type'] + typ=kw['typ'] feats=[] for x in'cxx cpp c++ cc C'.split(): if x in exts: @@ -33,17 +30,17 @@ def sniff_features(**kw): if'java'in exts: feats.append('java') return'java' - if type in('program','shlib','stlib'): + if typ in('program','shlib','stlib'): will_link=False for x in feats: if x in('cxx','d','fc','c'): - feats.append(x+type) + feats.append(x+typ) will_link=True if not will_link and not kw.get('features',[]): raise Errors.WafError('Cannot link from %r, try passing eg: features="c cprogram"?'%kw) return feats -def set_features(kw,_type): - kw['_type']=_type +def set_features(kw,typ): + kw['typ']=typ kw['features']=Utils.to_list(kw.get('features',[]))+Utils.to_list(sniff_features(**kw)) @conf def program(bld,*k,**kw): diff --git a/waflib/Tools/c_config.py b/waflib/Tools/c_config.py index 5f4e308..c8350c3 100644 --- a/waflib/Tools/c_config.py +++ b/waflib/Tools/c_config.py @@ -2,6 +2,7 @@ # encoding: utf-8 # WARNING! Do not edit! 
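`sniff_features` above infers task generator features from source extensions, then appends link features such as `cprogram` depending on the target type. A reduced sketch (the full version also handles d, fortran and java sources):

```python
def get_extensions(lst):
    # mirror the hunk above: keep everything after the last dot
    return [x[x.rfind('.') + 1:] for x in lst]

def sniff_features(sources, typ):
    # hypothetical reduced version of waflib's sniff_features
    exts = get_extensions(sources)
    feats = []
    if any(x in exts for x in ('cxx', 'cpp', 'c++', 'cc', 'C')):
        feats.append('cxx')
    if 'c' in exts:
        feats.append('c')
    if typ in ('program', 'shlib', 'stlib'):
        # each compiled language contributes a matching link feature
        feats += [x + typ for x in feats if x in ('c', 'cxx')]
    return feats
```

So `bld.program(source='main.cpp util.c', ...)` ends up with both compile and link features without the user naming them.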
https://waf.io/book/index.html#_obtaining_the_waf_file +from __future__ import with_statement import os,re,shlex from waflib import Build,Utils,Task,Options,Logs,Errors,Runner from waflib.TaskGen import after_method,feature @@ -9,37 +10,12 @@ from waflib.Configure import conf WAF_CONFIG_H='config.h' DEFKEYS='define_key' INCKEYS='include_key' -cfg_ver={'atleast-version':'>=','exact-version':'==','max-version':'<=',} -SNIP_FUNCTION=''' -int main(int argc, char **argv) { - void (*p)(); - (void)argc; (void)argv; - p=(void(*)())(%s); - return !p; -} -''' -SNIP_TYPE=''' -int main(int argc, char **argv) { - (void)argc; (void)argv; - if ((%(type_name)s *) 0) return 0; - if (sizeof (%(type_name)s)) return 0; - return 1; -} -''' SNIP_EMPTY_PROGRAM=''' int main(int argc, char **argv) { (void)argc; (void)argv; return 0; } ''' -SNIP_FIELD=''' -int main(int argc, char **argv) { - char *off; - (void)argc; (void)argv; - off = (char*) &((%(type_name)s*)0)->%(field_name)s; - return (size_t) off < sizeof(%(type_name)s); -} -''' MACRO_TO_DESTOS={'__linux__':'linux','__GNU__':'gnu','__FreeBSD__':'freebsd','__NetBSD__':'netbsd','__OpenBSD__':'openbsd','__sun':'sunos','__hpux':'hpux','__sgi':'irix','_AIX':'aix','__CYGWIN__':'cygwin','__MSYS__':'cygwin','_UWIN':'uwin','_WIN64':'win32','_WIN32':'win32','__ENVIRONMENT_MAC_OS_X_VERSION_MIN_REQUIRED__':'darwin','__ENVIRONMENT_IPHONE_OS_VERSION_MIN_REQUIRED__':'darwin','__QNX__':'qnx','__native_client__':'nacl'} MACRO_TO_DEST_CPU={'__x86_64__':'x86_64','__amd64__':'x86_64','__i386__':'x86','__ia64__':'ia','__mips__':'mips','__sparc__':'sparc','__alpha__':'alpha','__aarch64__':'aarch64','__thumb__':'thumb','__arm__':'arm','__hppa__':'hppa','__powerpc__':'powerpc','__ppc__':'powerpc','__convex__':'convex','__m68k__':'m68k','__s390x__':'s390x','__s390__':'s390','__sh__':'sh','__xtensa__':'xtensa',} @conf @@ -54,141 +30,143 @@ def parse_flags(self,line,uselib_store,env=None,force_static=False,posix=None): lex.whitespace_split=True 
lex.commenters='' lst=list(lex) - app=env.append_value - appu=env.append_unique uselib=uselib_store + def app(var,val): + env.append_value('%s_%s'%(var,uselib),val) + def appu(var,val): + env.append_unique('%s_%s'%(var,uselib),val) static=False while lst: x=lst.pop(0) st=x[:2] ot=x[2:] if st=='-I'or st=='/I': - if not ot:ot=lst.pop(0) - appu('INCLUDES_'+uselib,[ot]) + if not ot: + ot=lst.pop(0) + appu('INCLUDES',ot) elif st=='-i': tmp=[x,lst.pop(0)] app('CFLAGS',tmp) app('CXXFLAGS',tmp) elif st=='-D'or(env.CXX_NAME=='msvc'and st=='/D'): - if not ot:ot=lst.pop(0) - app('DEFINES_'+uselib,[ot]) + if not ot: + ot=lst.pop(0) + app('DEFINES',ot) elif st=='-l': - if not ot:ot=lst.pop(0) - prefix=(force_static or static)and'STLIB_'or'LIB_' - appu(prefix+uselib,[ot]) + if not ot: + ot=lst.pop(0) + prefix='STLIB'if(force_static or static)else'LIB' + app(prefix,ot) elif st=='-L': - if not ot:ot=lst.pop(0) - prefix=(force_static or static)and'STLIBPATH_'or'LIBPATH_' - appu(prefix+uselib,[ot]) + if not ot: + ot=lst.pop(0) + prefix='STLIBPATH'if(force_static or static)else'LIBPATH' + appu(prefix,ot) elif x.startswith('/LIBPATH:'): - prefix=(force_static or static)and'STLIBPATH_'or'LIBPATH_' - appu(prefix+uselib,[x.replace('/LIBPATH:','')]) + prefix='STLIBPATH'if(force_static or static)else'LIBPATH' + appu(prefix,x.replace('/LIBPATH:','')) elif x.startswith('-std='): - if'++'in x: - app('CXXFLAGS_'+uselib,[x]) - else: - app('CFLAGS_'+uselib,[x]) - elif x=='-pthread'or x.startswith('+'): - app('CFLAGS_'+uselib,[x]) - app('CXXFLAGS_'+uselib,[x]) - app('LINKFLAGS_'+uselib,[x]) + prefix='CXXFLAGS'if'++'in x else'CFLAGS' + app(prefix,x) + elif x.startswith('+')or x in('-pthread','-fPIC','-fpic','-fPIE','-fpie'): + app('CFLAGS',x) + app('CXXFLAGS',x) + app('LINKFLAGS',x) elif x=='-framework': - appu('FRAMEWORK_'+uselib,[lst.pop(0)]) + appu('FRAMEWORK',lst.pop(0)) elif x.startswith('-F'): - appu('FRAMEWORKPATH_'+uselib,[x[2:]]) + appu('FRAMEWORKPATH',x[2:]) elif x=='-Wl,-rpath'or 
x=='-Wl,-R': - app('RPATH_'+uselib,lst.pop(0).lstrip('-Wl,')) + app('RPATH',lst.pop(0).lstrip('-Wl,')) elif x.startswith('-Wl,-R,'): - app('RPATH_'+uselib,x[7:]) + app('RPATH',x[7:]) elif x.startswith('-Wl,-R'): - app('RPATH_'+uselib,x[6:]) + app('RPATH',x[6:]) elif x.startswith('-Wl,-rpath,'): - app('RPATH_'+uselib,x[11:]) + app('RPATH',x[11:]) elif x=='-Wl,-Bstatic'or x=='-Bstatic': static=True elif x=='-Wl,-Bdynamic'or x=='-Bdynamic': static=False - elif x.startswith('-Wl'): - app('LINKFLAGS_'+uselib,[x]) - elif x.startswith('-m')or x.startswith('-f')or x.startswith('-dynamic'): - app('CFLAGS_'+uselib,[x]) - app('CXXFLAGS_'+uselib,[x]) + elif x.startswith('-Wl')or x in('-rdynamic','-pie'): + app('LINKFLAGS',x) + elif x.startswith(('-m','-f','-dynamic','-O','-g')): + app('CFLAGS',x) + app('CXXFLAGS',x) elif x.startswith('-bundle'): - app('LINKFLAGS_'+uselib,[x]) - elif x.startswith('-undefined')or x.startswith('-Xlinker'): + app('LINKFLAGS',x) + elif x.startswith(('-undefined','-Xlinker')): arg=lst.pop(0) - app('LINKFLAGS_'+uselib,[x,arg]) - elif x.startswith('-arch')or x.startswith('-isysroot'): + app('LINKFLAGS',[x,arg]) + elif x.startswith(('-arch','-isysroot')): tmp=[x,lst.pop(0)] - app('CFLAGS_'+uselib,tmp) - app('CXXFLAGS_'+uselib,tmp) - app('LINKFLAGS_'+uselib,tmp) - elif x.endswith('.a')or x.endswith('.so')or x.endswith('.dylib')or x.endswith('.lib'): - appu('LINKFLAGS_'+uselib,[x]) + app('CFLAGS',tmp) + app('CXXFLAGS',tmp) + app('LINKFLAGS',tmp) + elif x.endswith(('.a','.so','.dylib','.lib')): + appu('LINKFLAGS',x) + else: + self.to_log('Unhandled flag %r'%x) @conf def validate_cfg(self,kw): if not'path'in kw: if not self.env.PKGCONFIG: self.find_program('pkg-config',var='PKGCONFIG') kw['path']=self.env.PKGCONFIG - if'atleast_pkgconfig_version'in kw: - if not'msg'in kw: + s=('atleast_pkgconfig_version'in kw)+('modversion'in kw)+('package'in kw) + if s!=1: + raise ValueError('exactly one of atleast_pkgconfig_version, modversion and package must be set') + 
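The `parse_flags` rewrite above dispatches each compiler/linker flag into a per-uselib environment variable (`INCLUDES_X`, `DEFINES_X`, `LIB_X`, ...). A heavily condensed sketch of that dispatch — the real function handles many more flag families and the static/dynamic toggle:

```python
def parse_flags(line):
    # hypothetical condensed version of waflib's parse_flags dispatch
    env = {}
    def app(var, val):
        env.setdefault(var, []).append(val)
    lst = line.split()
    while lst:
        x = lst.pop(0)
        st, ot = x[:2], x[2:]
        if st == '-I':
            app('INCLUDES', ot or lst.pop(0))
        elif st == '-D':
            app('DEFINES', ot or lst.pop(0))
        elif st == '-l':
            app('LIB', ot or lst.pop(0))
        elif st == '-L':
            app('LIBPATH', ot or lst.pop(0))
        else:
            # the real code routes these to CFLAGS/CXXFLAGS/LINKFLAGS by family
            app('LINKFLAGS', x)
    return env

env = parse_flags('-I/usr/include/glib-2.0 -DNDEBUG -lglib-2.0 -L/usr/lib -pthread')
```

The `ot or lst.pop(0)` pattern covers both joined (`-I/path`) and separated (`-I /path`) forms, just as the hunk above does.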
if not'msg'in kw: + if'atleast_pkgconfig_version'in kw: kw['msg']='Checking for pkg-config version >= %r'%kw['atleast_pkgconfig_version'] - return - if not'okmsg'in kw: + elif'modversion'in kw: + kw['msg']='Checking for %r version'%kw['modversion'] + else: + kw['msg']='Checking for %r'%(kw['package']) + if not'okmsg'in kw and not'modversion'in kw: kw['okmsg']='yes' if not'errmsg'in kw: kw['errmsg']='not found' - if'modversion'in kw: - if not'msg'in kw: - kw['msg']='Checking for %r version'%kw['modversion'] - return - for x in cfg_ver.keys(): - y=x.replace('-','_') - if y in kw: - if not'package'in kw: - raise ValueError('%s requires a package'%x) - if not'msg'in kw: - kw['msg']='Checking for %r %s %s'%(kw['package'],cfg_ver[x],kw[y]) - return - if not'define_name'in kw: - pkgname=kw.get('uselib_store',kw['package'].upper()) - kw['define_name']=self.have_define(pkgname) - if not'uselib_store'in kw: - self.undefine(kw['define_name']) - if not'msg'in kw: - kw['msg']='Checking for %r'%(kw['package']or kw['path']) + if'atleast_pkgconfig_version'in kw: + pass + elif'modversion'in kw: + if not'uselib_store'in kw: + kw['uselib_store']=kw['modversion'] + if not'define_name'in kw: + kw['define_name']='%s_VERSION'%Utils.quote_define_name(kw['uselib_store']) + else: + if not'uselib_store'in kw: + kw['uselib_store']=Utils.to_list(kw['package'])[0].upper() + if not'define_name'in kw: + kw['define_name']=self.have_define(kw['uselib_store']) @conf def exec_cfg(self,kw): path=Utils.to_list(kw['path']) env=self.env.env or None + if kw.get('pkg_config_path'): + if not env: + env=dict(self.environ) + env['PKG_CONFIG_PATH']=kw['pkg_config_path'] def define_it(): - pkgname=kw.get('uselib_store',kw['package'].upper()) - if kw.get('global_define'): - self.define(self.have_define(kw['package']),1,False) + define_name=kw['define_name'] + if kw.get('global_define',1): + self.define(define_name,1,False) else: - self.env.append_unique('DEFINES_%s'%pkgname,"%s=1"%self.have_define(pkgname)) - 
self.env[self.have_define(pkgname)]=1 + self.env.append_unique('DEFINES_%s'%kw['uselib_store'],"%s=1"%define_name) + if kw.get('add_have_to_env',1): + self.env[define_name]=1 if'atleast_pkgconfig_version'in kw: cmd=path+['--atleast-pkgconfig-version=%s'%kw['atleast_pkgconfig_version']] self.cmd_and_log(cmd,env=env) - if not'okmsg'in kw: - kw['okmsg']='yes' return - for x in cfg_ver: - y=x.replace('-','_') - if y in kw: - self.cmd_and_log(path+['--%s=%s'%(x,kw[y]),kw['package']],env=env) - if not'okmsg'in kw: - kw['okmsg']='yes' - define_it() - break if'modversion'in kw: version=self.cmd_and_log(path+['--modversion',kw['modversion']],env=env).strip() - self.define('%s_VERSION'%Utils.quote_define_name(kw.get('uselib_store',kw['modversion'])),version) + if not'okmsg'in kw: + kw['okmsg']=version + self.define(kw['define_name'],version) return version lst=[]+path - defi=kw.get('define_variable',None) + defi=kw.get('define_variable') if not defi: defi=self.env.PKG_CONFIG_DEFINES or{} for key,val in defi.items(): @@ -202,40 +180,30 @@ def exec_cfg(self,kw): lst.extend(Utils.to_list(kw['package'])) if'variables'in kw: v_env=kw.get('env',self.env) - uselib=kw.get('uselib_store',kw['package'].upper()) vars=Utils.to_list(kw['variables']) for v in vars: val=self.cmd_and_log(lst+['--variable='+v],env=env).strip() - var='%s_%s'%(uselib,v) + var='%s_%s'%(kw['uselib_store'],v) v_env[var]=val - if not'okmsg'in kw: - kw['okmsg']='yes' return ret=self.cmd_and_log(lst,env=env) - if not'okmsg'in kw: - kw['okmsg']='yes' define_it() - self.parse_flags(ret,kw.get('uselib_store',kw['package'].upper()),kw.get('env',self.env),force_static=static,posix=kw.get('posix',None)) + self.parse_flags(ret,kw['uselib_store'],kw.get('env',self.env),force_static=static,posix=kw.get('posix')) return ret @conf def check_cfg(self,*k,**kw): - if k: - lst=k[0].split() - kw['package']=lst[0] - kw['args']=' '.join(lst[1:]) self.validate_cfg(kw) if'msg'in kw: self.start_msg(kw['msg'],**kw) ret=None try: 
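`exec_cfg` above builds one of three pkg-config invocations depending on which keyword was passed: a tool version check, a module version query, or a full flags query. A sketch of just the command assembly (no subprocess call, so it runs without pkg-config installed):

```python
def pkg_config_cmd(kw, path=('pkg-config',)):
    # hypothetical sketch of how exec_cfg assembles the pkg-config command
    lst = list(path)
    if 'atleast_pkgconfig_version' in kw:
        return lst + ['--atleast-pkgconfig-version=%s' % kw['atleast_pkgconfig_version']]
    if 'modversion' in kw:
        return lst + ['--modversion', kw['modversion']]
    lst += kw.get('args', '').split()
    lst += kw['package'].split()
    return lst

cmd = pkg_config_cmd({'package': 'glib-2.0', 'args': '--cflags --libs'})
```

In waf the resulting output of the third form is fed to `parse_flags` under the computed `uselib_store`.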
ret=self.exec_cfg(kw) - except self.errors.WafError: + except self.errors.WafError as e: if'errmsg'in kw: self.end_msg(kw['errmsg'],'YELLOW',**kw) if Logs.verbose>1: - raise - else: - self.fatal('The configuration failed') + self.to_log('Command failure: %s'%e) + self.fatal('The configuration failed') else: if not ret: ret=True @@ -250,10 +218,13 @@ def build_fun(bld): o=bld(features=bld.kw['features'],source=bld.kw['compile_filename'],target='testprog') for k,v in bld.kw.items(): setattr(o,k,v) - if not bld.kw.get('quiet',None): + if not bld.kw.get('quiet'): bld.conf.to_log("==>\n%s\n<=="%bld.kw['code']) @conf def validate_c(self,kw): + for x in('type_name','field_name','function_name'): + if x in kw: + Logs.warn('Invalid argument %r in test'%x) if not'build_fun'in kw: kw['build_fun']=build_fun if not'env'in kw: @@ -261,16 +232,16 @@ def validate_c(self,kw): env=kw['env'] if not'compiler'in kw and not'features'in kw: kw['compiler']='c' - if env['CXX_NAME']and Task.classes.get('cxx',None): + if env.CXX_NAME and Task.classes.get('cxx'): kw['compiler']='cxx' - if not self.env['CXX']: + if not self.env.CXX: self.fatal('a c++ compiler is required') else: - if not self.env['CC']: + if not self.env.CC: self.fatal('a c compiler is required') if not'compile_mode'in kw: kw['compile_mode']='c' - if'cxx'in Utils.to_list(kw.get('features',[]))or kw.get('compiler','')=='cxx': + if'cxx'in Utils.to_list(kw.get('features',[]))or kw.get('compiler')=='cxx': kw['compile_mode']='cxx' if not'type'in kw: kw['type']='cprogram' @@ -292,46 +263,19 @@ def validate_c(self,kw): fwkname=kw['framework_name'] if not'uselib_store'in kw: kw['uselib_store']=fwkname.upper() - if not kw.get('no_header',False): - if not'header_name'in kw: - kw['header_name']=[] + if not kw.get('no_header'): fwk='%s/%s.h'%(fwkname,fwkname) - if kw.get('remove_dot_h',None): + if kw.get('remove_dot_h'): fwk=fwk[:-2] - kw['header_name']=Utils.to_list(kw['header_name'])+[fwk] + val=kw.get('header_name',[]) + 
kw['header_name']=Utils.to_list(val)+[fwk] kw['msg']='Checking for framework %s'%fwkname kw['framework']=fwkname - if'function_name'in kw: - fu=kw['function_name'] - if not'msg'in kw: - kw['msg']='Checking for function %s'%fu - kw['code']=to_header(kw)+SNIP_FUNCTION%fu - if not'uselib_store'in kw: - kw['uselib_store']=fu.upper() - if not'define_name'in kw: - kw['define_name']=self.have_define(fu) - elif'type_name'in kw: - tu=kw['type_name'] - if not'header_name'in kw: - kw['header_name']='stdint.h' - if'field_name'in kw: - field=kw['field_name'] - kw['code']=to_header(kw)+SNIP_FIELD%{'type_name':tu,'field_name':field} - if not'msg'in kw: - kw['msg']='Checking for field %s in %s'%(field,tu) - if not'define_name'in kw: - kw['define_name']=self.have_define((tu+'_'+field).upper()) - else: - kw['code']=to_header(kw)+SNIP_TYPE%{'type_name':tu} - if not'msg'in kw: - kw['msg']='Checking for type %s'%tu - if not'define_name'in kw: - kw['define_name']=self.have_define(tu.upper()) elif'header_name'in kw: if not'msg'in kw: kw['msg']='Checking for header %s'%kw['header_name'] l=Utils.to_list(kw['header_name']) - assert len(l)>0,'list of headers in header_name is empty' + assert len(l),'list of headers in header_name is empty' kw['code']=to_header(kw)+SNIP_EMPTY_PROGRAM if not'uselib_store'in kw: kw['uselib_store']=l[0].upper() @@ -363,7 +307,7 @@ def validate_c(self,kw): kw['execute']=False if kw['execute']: kw['features'].append('test_exec') - kw['chmod']=493 + kw['chmod']=Utils.O755 if not'errmsg'in kw: kw['errmsg']='not found' if not'okmsg'in kw: @@ -372,10 +316,11 @@ def validate_c(self,kw): kw['code']=SNIP_EMPTY_PROGRAM if self.env[INCKEYS]: kw['code']='\n'.join(['#include <%s>'%x for x in self.env[INCKEYS]])+'\n'+kw['code'] - if kw.get('merge_config_header',False)or env.merge_config_header: + if kw.get('merge_config_header')or env.merge_config_header: kw['code']='%s\n\n%s'%(self.get_config_header(),kw['code']) env.DEFINES=[] - if not kw.get('success'):kw['success']=None + 
if not kw.get('success'): + kw['success']=None if'define_name'in kw: self.undefine(kw['define_name']) if not'msg'in kw: @@ -385,40 +330,52 @@ def post_check(self,*k,**kw): is_success=0 if kw['execute']: if kw['success']is not None: - if kw.get('define_ret',False): + if kw.get('define_ret'): is_success=kw['success'] else: is_success=(kw['success']==0) else: is_success=(kw['success']==0) - if'define_name'in kw: + if kw.get('define_name'): comment=kw.get('comment','') define_name=kw['define_name'] - if'header_name'in kw or'function_name'in kw or'type_name'in kw or'fragment'in kw: - if kw['execute']and kw.get('define_ret',None)and isinstance(is_success,str): + if kw['execute']and kw.get('define_ret')and isinstance(is_success,str): + if kw.get('global_define',1): self.define(define_name,is_success,quote=kw.get('quote',1),comment=comment) else: - self.define_cond(define_name,is_success,comment=comment) + if kw.get('quote',1): + succ='"%s"'%is_success + else: + succ=int(is_success) + val='%s=%s'%(define_name,succ) + var='DEFINES_%s'%kw['uselib_store'] + self.env.append_value(var,val) else: - self.define_cond(define_name,is_success,comment=comment) - if kw.get('global_define',None): - self.env[kw['define_name']]=is_success + if kw.get('global_define',1): + self.define_cond(define_name,is_success,comment=comment) + else: + var='DEFINES_%s'%kw['uselib_store'] + self.env.append_value(var,'%s=%s'%(define_name,int(is_success))) + if kw.get('add_have_to_env',1): + if kw.get('uselib_store'): + self.env[self.have_define(kw['uselib_store'])]=1 + elif kw['execute']and kw.get('define_ret'): + self.env[define_name]=is_success + else: + self.env[define_name]=int(is_success) if'header_name'in kw: - if kw.get('auto_add_header_name',False): + if kw.get('auto_add_header_name'): self.env.append_value(INCKEYS,Utils.to_list(kw['header_name'])) if is_success and'uselib_store'in kw: from waflib.Tools import ccroot - _vars=set([]) + _vars=set() for x in kw['features']: if x in 
ccroot.USELIB_VARS: _vars|=ccroot.USELIB_VARS[x] for k in _vars: - lk=k.lower() - if lk in kw: - val=kw[lk] - if isinstance(val,str): - val=val.rstrip(os.path.sep) - self.env.append_unique(k+'_'+kw['uselib_store'],Utils.to_list(val)) + x=k.lower() + if x in kw: + self.env.append_value(k+'_'+kw['uselib_store'],kw[x]) return is_success @conf def check(self,*k,**kw): @@ -483,7 +440,9 @@ def get_define_comment(self,key): return coms.get(key,'') @conf def define(self,key,val,quote=True,comment=''): - assert key and isinstance(key,str) + assert isinstance(key,str) + if not key: + return if val is True: val=1 elif val in(False,None): @@ -494,7 +453,7 @@ def define(self,key,val,quote=True,comment=''): s=quote and'%s="%s"'or'%s=%s' app=s%(key,str(val)) ban=key+'=' - lst=self.env['DEFINES'] + lst=self.env.DEFINES for x in lst: if x.startswith(ban): lst[lst.index(x)]=app @@ -505,15 +464,19 @@ def define(self,key,val,quote=True,comment=''): self.set_define_comment(key,comment) @conf def undefine(self,key,comment=''): - assert key and isinstance(key,str) + assert isinstance(key,str) + if not key: + return ban=key+'=' - lst=[x for x in self.env['DEFINES']if not x.startswith(ban)] - self.env['DEFINES']=lst + lst=[x for x in self.env.DEFINES if not x.startswith(ban)] + self.env.DEFINES=lst self.env.append_unique(DEFKEYS,key) self.set_define_comment(key,comment) @conf def define_cond(self,key,val,comment=''): - assert key and isinstance(key,str) + assert isinstance(key,str) + if not key: + return if val: self.define(key,1,comment=comment) else: @@ -522,7 +485,7 @@ def define_cond(self,key,val,comment=''): def is_defined(self,key): assert key and isinstance(key,str) ban=key+'=' - for x in self.env['DEFINES']: + for x in self.env.DEFINES: if x.startswith(ban): return True return False @@ -530,7 +493,7 @@ def is_defined(self,key): def get_define(self,key): assert key and isinstance(key,str) ban=key+'=' - for x in self.env['DEFINES']: + for x in self.env.DEFINES: if x.startswith(ban): 
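The `define` method above stores each configuration define as a `KEY=val` string in the `DEFINES` list, replacing an existing entry for the same key rather than appending a duplicate. A minimal sketch of that replace-or-append behavior:

```python
def define(env, key, val, quote=True):
    # DEFINES holds 'KEY=val' strings, at most one entry per key
    app = ('%s="%s"' if quote else '%s=%s') % (key, val)
    ban = key + '='
    lst = env.setdefault('DEFINES', [])
    for i, x in enumerate(lst):
        if x.startswith(ban):
            lst[i] = app        # redefine in place
            return
    lst.append(app)

env = {}
define(env, 'HAVE_GLIB', 1, quote=False)
define(env, 'VERSION', '0.4.9')
define(env, 'HAVE_GLIB', 0, quote=False)   # overwrites the first entry
```

Keeping the list ordered and de-duplicated is what makes the later `write_config_header` output stable.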
return x[len(ban):] return None @@ -539,7 +502,8 @@ def have_define(self,key): return(self.env.HAVE_PAT or'HAVE_%s')%Utils.quote_define_name(key) @conf def write_config_header(self,configfile='',guard='',top=False,defines=True,headers=False,remove=True,define_prefix=''): - if not configfile:configfile=WAF_CONFIG_H + if not configfile: + configfile=WAF_CONFIG_H waf_guard=guard or'W_%s_WAF'%Utils.quote_define_name(configfile) node=top and self.bldnode or self.path.get_bld() node=node.make_node(configfile) @@ -564,7 +528,7 @@ def get_config_header(self,defines=True,headers=False,define_prefix=''): lst.append('#include <%s>'%x) if defines: tbl={} - for k in self.env['DEFINES']: + for k in self.env.DEFINES: a,_,b=k.partition('=') tbl[a]=b for k in self.env[DEFKEYS]: @@ -604,8 +568,8 @@ def get_cc_version(conf,cc,gcc=False,icc=False,clang=False): cmd=cc+['-dM','-E','-'] env=conf.env.env or None try: - out,err=conf.cmd_and_log(cmd,output=0,input='\n',env=env) - except Exception: + out,err=conf.cmd_and_log(cmd,output=0,input='\n'.encode(),env=env) + except Errors.WafError: conf.fatal('Could not determine the compiler version %r'%cmd) if gcc: if out.find('__INTEL_COMPILER')>=0: @@ -644,6 +608,8 @@ def get_cc_version(conf,cc,gcc=False,icc=False,clang=False): conf.env.DEST_BINFMT='elf' elif isD('__WINNT__')or isD('__CYGWIN__')or isD('_WIN32'): conf.env.DEST_BINFMT='pe' + if not conf.env.IMPLIBDIR: + conf.env.IMPLIBDIR=conf.env.LIBDIR conf.env.LIBDIR=conf.env.BINDIR elif isD('__APPLE__'): conf.env.DEST_BINFMT='mac-o' @@ -656,12 +622,12 @@ def get_cc_version(conf,cc,gcc=False,icc=False,clang=False): Logs.debug('ccroot: dest platform: '+' '.join([conf.env[x]or'?'for x in('DEST_OS','DEST_BINFMT','DEST_CPU')])) if icc: ver=k['__INTEL_COMPILER'] - conf.env['CC_VERSION']=(ver[:-2],ver[-2],ver[-1]) + conf.env.CC_VERSION=(ver[:-2],ver[-2],ver[-1]) else: if isD('__clang__')and isD('__clang_major__'): - 
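`get_config_header` above turns the `DEFINES` list into `config.h` lines, emitting a commented `#undef` for keys that were tested but never defined. A sketch under those assumptions (`defines` as `KEY=val` strings, `defkeys` as the ordered key list):

```python
def get_config_header(defines, defkeys):
    # hypothetical sketch of waflib's get_config_header body
    tbl = {}
    for k in defines:
        a, _, b = k.partition('=')
        tbl[a] = b
    lst = []
    for k in defkeys:
        if k in tbl:
            lst.append('#define %s %s' % (k, tbl[k]))
        else:
            lst.append('/* #undef %s */' % k)
    return '\n'.join(lst)

hdr = get_config_header(['HAVE_GLIB=1'], ['HAVE_GLIB', 'HAVE_FOO'])
```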
conf.env['CC_VERSION']=(k['__clang_major__'],k['__clang_minor__'],k['__clang_patchlevel__']) + conf.env.CC_VERSION=(k['__clang_major__'],k['__clang_minor__'],k['__clang_patchlevel__']) else: - conf.env['CC_VERSION']=(k['__GNUC__'],k['__GNUC_MINOR__'],k.get('__GNUC_PATCHLEVEL__','0')) + conf.env.CC_VERSION=(k['__GNUC__'],k['__GNUC_MINOR__'],k.get('__GNUC_PATCHLEVEL__','0')) return k @conf def get_xlc_version(conf,cc): @@ -675,7 +641,7 @@ def get_xlc_version(conf,cc): match=version_re(out or err) if match: k=match.groupdict() - conf.env['CC_VERSION']=(k['major'],k['minor']) + conf.env.CC_VERSION=(k['major'],k['minor']) break else: conf.fatal('Could not determine the XLC version.') @@ -684,7 +650,7 @@ def get_suncc_version(conf,cc): cmd=cc+['-V'] try: out,err=conf.cmd_and_log(cmd,output=0) - except Errors.WafError ,e: + except Errors.WafError as e: if not(hasattr(e,'returncode')and hasattr(e,'stdout')and hasattr(e,'stderr')): conf.fatal('Could not find suncc %r'%cmd) out=e.stdout @@ -695,20 +661,28 @@ def get_suncc_version(conf,cc): match=version_re(version) if match: k=match.groupdict() - conf.env['CC_VERSION']=(k['major'],k['minor']) + conf.env.CC_VERSION=(k['major'],k['minor']) else: conf.fatal('Could not determine the suncc version.') @conf def add_as_needed(self): if self.env.DEST_BINFMT=='elf'and'gcc'in(self.env.CXX_NAME,self.env.CC_NAME): self.env.append_unique('LINKFLAGS','-Wl,--as-needed') -class cfgtask(Task.TaskBase): +class cfgtask(Task.Task): + def __init__(self,*k,**kw): + Task.Task.__init__(self,*k,**kw) + self.run_after=set() def display(self): return'' def runnable_status(self): + for x in self.run_after: + if not x.hasrun: + return Task.ASK_LATER return Task.RUN_ME def uid(self): return Utils.SIG_NIL + def signature(self): + return Utils.SIG_NIL def run(self): conf=self.conf bld=Build.BuildContext(top_dir=conf.srcnode.abspath(),out_dir=conf.bldnode.abspath()) @@ -716,17 +690,40 @@ class cfgtask(Task.TaskBase): bld.init_dirs() bld.in_msg=1 
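`get_cc_version` above runs the compiler with `-dM -E` on empty input and parses the predefined macros to learn the version, target OS and CPU. A sketch of the parsing step on canned output (the sample `#define` lines are illustrative, not captured from a real compiler run):

```python
def parse_predefines(out):
    # hypothetical parser for `cc -dM -E -` output, as used by get_cc_version
    k = {}
    for line in out.splitlines():
        parts = line.split(None, 2)
        if len(parts) == 3 and parts[0] == '#define':
            k[parts[1]] = parts[2]
    return k

out = '#define __GNUC__ 9\n#define __GNUC_MINOR__ 3\n#define __linux__ 1'
k = parse_predefines(out)
cc_version = (k['__GNUC__'], k['__GNUC_MINOR__'], k.get('__GNUC_PATCHLEVEL__', '0'))
```

The macro dict is then matched against tables like `MACRO_TO_DESTOS` to set `DEST_OS`, `DEST_CPU` and `DEST_BINFMT`.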
bld.logger=self.logger + bld.multicheck_task=self + args=self.args try: - bld.check(**self.args) + if'func'in args: + bld.test(build_fun=args['func'],msg=args.get('msg',''),okmsg=args.get('okmsg',''),errmsg=args.get('errmsg',''),) + else: + args['multicheck_mandatory']=args.get('mandatory',True) + args['mandatory']=True + try: + bld.check(**args) + finally: + args['mandatory']=args['multicheck_mandatory'] except Exception: return 1 + def process(self): + Task.Task.process(self) + if'msg'in self.args: + with self.generator.bld.multicheck_lock: + self.conf.start_msg(self.args['msg']) + if self.hasrun==Task.NOT_RUN: + self.conf.end_msg('test cancelled','YELLOW') + elif self.hasrun!=Task.SUCCESS: + self.conf.end_msg(self.args.get('errmsg','no'),'YELLOW') + else: + self.conf.end_msg(self.args.get('okmsg','yes'),'GREEN') @conf def multicheck(self,*k,**kw): self.start_msg(kw.get('msg','Executing %d configuration tests'%len(k)),**kw) + for var in('DEFINES',DEFKEYS): + self.env.append_value(var,[]) + self.env.DEFINE_COMMENTS=self.env.DEFINE_COMMENTS or{} class par(object): def __init__(self): self.keep=False - self.returned_tasks=[] self.task_sigs={} self.progress_bar=0 def total(self): @@ -734,32 +731,74 @@ def multicheck(self,*k,**kw): def to_log(self,*k,**kw): return bld=par() + bld.keep=kw.get('run_all_tests',True) + bld.imp_sigs={} tasks=[] + id_to_task={} for dct in k: - x=cfgtask(bld=bld) + x=Task.classes['cfgtask'](bld=bld,env=None) tasks.append(x) x.args=dct x.bld=bld x.conf=self x.args=dct x.logger=Logs.make_mem_logger(str(id(x)),self.logger) + if'id'in dct: + id_to_task[dct['id']]=x + for x in tasks: + for key in Utils.to_list(x.args.get('before_tests',[])): + tsk=id_to_task[key] + if not tsk: + raise ValueError('No test named %r'%key) + tsk.run_after.add(x) + for key in Utils.to_list(x.args.get('after_tests',[])): + tsk=id_to_task[key] + if not tsk: + raise ValueError('No test named %r'%key) + x.run_after.add(tsk) def it(): yield tasks while 1: yield[] - 
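The `multicheck` changes above give each configuration test an optional `id`, wire `before_tests`/`after_tests` into `run_after` sets, and have `runnable_status` wait until every predecessor has run. A toy scheduler showing that ordering contract (the `Test` class is a hypothetical stand-in for `cfgtask`):

```python
class Test:
    # a task is runnable only once everything in run_after has run
    def __init__(self, name):
        self.name, self.run_after, self.hasrun = name, set(), False
    def runnable(self):
        return all(t.hasrun for t in self.run_after)

a, b = Test('a'), Test('b')
b.run_after.add(a)          # as wired by after_tests=['a']
order = []
pending = [b, a]
while pending:
    for t in list(pending):
        if t.runnable():
            t.hasrun = True
            order.append(t.name)
            pending.remove(t)
```

Even though `b` is queued first, it only runs after `a`, which is exactly what lets dependent checks share results safely under a parallel runner.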
p=Runner.Parallel(bld,Options.options.jobs) + bld.producer=p=Runner.Parallel(bld,Options.options.jobs) + bld.multicheck_lock=Utils.threading.Lock() p.biter=it() + self.end_msg('started') p.start() for x in tasks: x.logger.memhandler.flush() + self.start_msg('-> processing test results') if p.error: for x in p.error: if getattr(x,'err_msg',None): self.to_log(x.err_msg) self.end_msg('fail',color='RED') raise Errors.WafError('There is an error in the library, read config.log for more information') + failure_count=0 + for x in tasks: + if x.hasrun not in(Task.SUCCESS,Task.NOT_RUN): + failure_count+=1 + if failure_count: + self.end_msg(kw.get('errmsg','%s test failed'%failure_count),color='YELLOW',**kw) + else: + self.end_msg('all ok',**kw) for x in tasks: if x.hasrun!=Task.SUCCESS: - self.end_msg(kw.get('errmsg','no'),color='YELLOW',**kw) - self.fatal(kw.get('fatalmsg',None)or'One of the tests has failed, read config.log for more information') - self.end_msg('ok',**kw) + if x.args.get('mandatory',True): + self.fatal(kw.get('fatalmsg')or'One of the tests has failed, read config.log for more information') +@conf +def check_gcc_o_space(self,mode='c'): + if int(self.env.CC_VERSION[0])>4: + return + self.env.stash() + if mode=='c': + self.env.CCLNK_TGT_F=['-o',''] + elif mode=='cxx': + self.env.CXXLNK_TGT_F=['-o',''] + features='%s %sshlib'%(mode,mode) + try: + self.check(msg='Checking if the -o link must be split from arguments',fragment=SNIP_EMPTY_PROGRAM,features=features) + except self.errors.ConfigurationError: + self.env.revert() + else: + self.env.commit() diff --git a/waflib/Tools/c_osx.py b/waflib/Tools/c_osx.py index 8cb4bce..847b433 100644 --- a/waflib/Tools/c_osx.py +++ b/waflib/Tools/c_osx.py @@ -3,7 +3,7 @@ # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file import os,shutil,platform -from waflib import Task,Utils,Errors +from waflib import Task,Utils from waflib.TaskGen import taskgen_method,feature,after_method,before_method app_info=''' <?xml version="1.0" encoding="UTF-8"?> @@ -25,8 +25,8 @@ app_info=''' ''' @feature('c','cxx') def set_macosx_deployment_target(self): - if self.env['MACOSX_DEPLOYMENT_TARGET']: - os.environ['MACOSX_DEPLOYMENT_TARGET']=self.env['MACOSX_DEPLOYMENT_TARGET'] + if self.env.MACOSX_DEPLOYMENT_TARGET: + os.environ['MACOSX_DEPLOYMENT_TARGET']=self.env.MACOSX_DEPLOYMENT_TARGET elif'MACOSX_DEPLOYMENT_TARGET'not in os.environ: if Utils.unversioned_sys_platform()=='darwin': os.environ['MACOSX_DEPLOYMENT_TARGET']='.'.join(platform.mac_ver()[0].split('.')[:2]) @@ -48,14 +48,14 @@ def bundle_name_for_output(out): @feature('cprogram','cxxprogram') @after_method('apply_link') def create_task_macapp(self): - if self.env['MACAPP']or getattr(self,'mac_app',False): + if self.env.MACAPP or getattr(self,'mac_app',False): out=self.link_task.outputs[0] name=bundle_name_for_output(out) dir=self.create_bundle_dirs(name,out) n1=dir.find_or_declare(['Contents','MacOS',out.name]) self.apptask=self.create_task('macapp',self.link_task.outputs,n1) inst_to=getattr(self,'install_path','/Applications')+'/%s/Contents/MacOS/'%name - self.bld.install_files(inst_to,n1,chmod=Utils.O755) + self.add_install_files(install_to=inst_to,install_from=n1,chmod=Utils.O755) if getattr(self,'mac_files',None): mac_files_root=getattr(self,'mac_files_root',None) if isinstance(mac_files_root,str): @@ -67,29 +67,13 @@ def create_task_macapp(self): for node in self.to_nodes(self.mac_files): relpath=node.path_from(mac_files_root or node.parent) self.create_task('macapp',node,res_dir.make_node(relpath)) - self.bld.install_as(os.path.join(inst_to,relpath),node) - if getattr(self,'mac_resources',None): - res_dir=n1.parent.parent.make_node('Resources') - 
inst_to=getattr(self,'install_path','/Applications')+'/%s/Resources'%name - for x in self.to_list(self.mac_resources): - node=self.path.find_node(x) - if not node: - raise Errors.WafError('Missing mac_resource %r in %r'%(x,self)) - parent=node.parent - if os.path.isdir(node.abspath()): - nodes=node.ant_glob('**') - else: - nodes=[node] - for node in nodes: - rel=node.path_from(parent) - self.create_task('macapp',node,res_dir.make_node(rel)) - self.bld.install_as(inst_to+'/%s'%rel,node) + self.add_install_as(install_to=os.path.join(inst_to,relpath),install_from=node) if getattr(self.bld,'is_install',None): self.install_task.hasrun=Task.SKIP_ME @feature('cprogram','cxxprogram') @after_method('apply_link') def create_task_macplist(self): - if self.env['MACAPP']or getattr(self,'mac_app',False): + if self.env.MACAPP or getattr(self,'mac_app',False): out=self.link_task.outputs[0] name=bundle_name_for_output(out) dir=self.create_bundle_dirs(name,out) @@ -108,13 +92,13 @@ def create_task_macplist(self): else: plisttask.code=app_info inst_to=getattr(self,'install_path','/Applications')+'/%s/Contents/'%name - self.bld.install_files(inst_to,n1) + self.add_install_files(install_to=inst_to,install_from=n1) @feature('cshlib','cxxshlib') @before_method('apply_link','propagate_uselib_vars') def apply_bundle(self): - if self.env['MACBUNDLE']or getattr(self,'mac_bundle',False): - self.env['LINKFLAGS_cshlib']=self.env['LINKFLAGS_cxxshlib']=[] - self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['macbundle_PATTERN'] + if self.env.MACBUNDLE or getattr(self,'mac_bundle',False): + self.env.LINKFLAGS_cshlib=self.env.LINKFLAGS_cxxshlib=[] + self.env.cshlib_PATTERN=self.env.cxxshlib_PATTERN=self.env.macbundle_PATTERN use=self.use=self.to_list(getattr(self,'use',[])) if not'MACBUNDLE'in use: use.append('MACBUNDLE') diff --git a/waflib/Tools/c_preproc.py b/waflib/Tools/c_preproc.py index 846ca07..8781b73 100644 --- a/waflib/Tools/c_preproc.py +++ b/waflib/Tools/c_preproc.py @@ 
-4,19 +4,20 @@ import re,string,traceback from waflib import Logs,Utils,Errors -from waflib.Logs import debug,error class PreprocError(Errors.WafError): pass +FILE_CACHE_SIZE=100000 +LINE_CACHE_SIZE=100000 POPFILE='-' recursion_limit=150 go_absolute=False -standard_includes=['/usr/include'] +standard_includes=['/usr/local/include','/usr/include'] if Utils.is_win32: standard_includes=[] use_trigraphs=0 strict_quotes=0 g_optrans={'not':'!','not_eq':'!','and':'&&','and_eq':'&=','or':'||','or_eq':'|=','xor':'^','xor_eq':'^=','bitand':'&','bitor':'|','compl':'~',} -re_lines=re.compile('^[ \t]*(#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*)\r*$',re.IGNORECASE|re.MULTILINE) +re_lines=re.compile('^[ \t]*(?:#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*)\r*$',re.IGNORECASE|re.MULTILINE) re_mac=re.compile("^[a-zA-Z_]\w*") re_fun=re.compile('^[a-zA-Z_][a-zA-Z0-9_]*[(]') re_pragma_once=re.compile('^\s*once\s*',re.IGNORECASE) @@ -37,57 +38,69 @@ ignored='i' undefined='u' skipped='s' def repl(m): - s=m.group(0) - if s.startswith('/'): + s=m.group() + if s[0]=='/': return' ' return s -def filter_comments(filename): - code=Utils.readf(filename) - if use_trigraphs: - for(a,b)in trig_def:code=code.split(a).join(b) - code=re_nl.sub('',code) - code=re_cpp.sub(repl,code) - return[(m.group(2),m.group(3))for m in re.finditer(re_lines,code)] prec={} ops=['* / %','+ -','<< >>','< <= >= >','== !=','& | ^','&& ||',','] -for x in range(len(ops)): - syms=ops[x] +for x,syms in enumerate(ops): for u in syms.split(): prec[u]=x -def trimquotes(s): - if not s:return'' - s=s.rstrip() - if s[0]=="'"and s[-1]=="'":return s[1:-1] - return s def reduce_nums(val_1,val_2,val_op): - try:a=0+val_1 - except TypeError:a=int(val_1) - try:b=0+val_2 - except TypeError:b=int(val_2) + try: + a=0+val_1 + except TypeError: + a=int(val_1) + try: + b=0+val_2 + except TypeError: + b=int(val_2) d=val_op - if d=='%':c=a%b - elif 
d=='+':c=a+b - elif d=='-':c=a-b - elif d=='*':c=a*b - elif d=='/':c=a/b - elif d=='^':c=a^b - elif d=='==':c=int(a==b) - elif d=='|'or d=='bitor':c=a|b - elif d=='||'or d=='or':c=int(a or b) - elif d=='&'or d=='bitand':c=a&b - elif d=='&&'or d=='and':c=int(a and b) - elif d=='!='or d=='not_eq':c=int(a!=b) - elif d=='^'or d=='xor':c=int(a^b) - elif d=='<=':c=int(a<=b) - elif d=='<':c=int(a<b) - elif d=='>':c=int(a>b) - elif d=='>=':c=int(a>=b) - elif d=='<<':c=a<<b - elif d=='>>':c=a>>b - else:c=0 + if d=='%': + c=a%b + elif d=='+': + c=a+b + elif d=='-': + c=a-b + elif d=='*': + c=a*b + elif d=='/': + c=a/b + elif d=='^': + c=a^b + elif d=='==': + c=int(a==b) + elif d=='|'or d=='bitor': + c=a|b + elif d=='||'or d=='or': + c=int(a or b) + elif d=='&'or d=='bitand': + c=a&b + elif d=='&&'or d=='and': + c=int(a and b) + elif d=='!='or d=='not_eq': + c=int(a!=b) + elif d=='^'or d=='xor': + c=int(a^b) + elif d=='<=': + c=int(a<=b) + elif d=='<': + c=int(a<b) + elif d=='>': + c=int(a>b) + elif d=='>=': + c=int(a>=b) + elif d=='<<': + c=a<<b + elif d=='>>': + c=a>>b + else: + c=0 return c def get_num(lst): - if not lst:raise PreprocError("empty list for get_num") + if not lst: + raise PreprocError('empty list for get_num') (p,v)=lst[0] if p==OP: if v=='(': @@ -104,7 +117,7 @@ def get_num(lst): count_par+=1 i+=1 else: - raise PreprocError("rparen expected %r"%lst) + raise PreprocError('rparen expected %r'%lst) (num,_)=get_term(lst[1:i]) return(num,lst[i+1:]) elif v=='+': @@ -119,15 +132,16 @@ def get_num(lst): num,lst=get_num(lst[1:]) return(~int(num),lst) else: - raise PreprocError("Invalid op token %r for get_num"%lst) + raise PreprocError('Invalid op token %r for get_num'%lst) elif p==NUM: return v,lst[1:] elif p==IDENT: return 0,lst[1:] else: - raise PreprocError("Invalid token %r for get_num"%lst) + raise PreprocError('Invalid token %r for get_num'%lst) def get_term(lst): - if not lst:raise PreprocError("empty list for get_term") + if not lst: + raise 
PreprocError('empty list for get_term') num,lst=get_num(lst) if not lst: return(num,[]) @@ -150,7 +164,7 @@ def get_term(lst): break i+=1 else: - raise PreprocError("rparen expected %r"%lst) + raise PreprocError('rparen expected %r'%lst) if int(num): return get_term(lst[1:i]) else: @@ -162,7 +176,7 @@ def get_term(lst): return get_term([(NUM,num2)]+lst) p2,v2=lst[0] if p2!=OP: - raise PreprocError("op expected %r"%lst) + raise PreprocError('op expected %r'%lst) if prec[v2]>=prec[v]: num2=reduce_nums(num,num2,v) return get_term([(NUM,num2)]+lst) @@ -170,7 +184,7 @@ def get_term(lst): num3,lst=get_num(lst[1:]) num3=reduce_nums(num2,num3,v2) return get_term([(NUM,num),(p,v),(NUM,num3)]+lst) - raise PreprocError("cannot reduce %r"%lst) + raise PreprocError('cannot reduce %r'%lst) def reduce_eval(lst): num,lst=get_term(lst) return(NUM,num) @@ -210,7 +224,7 @@ def reduce_tokens(lst,defs,ban=[]): else: lst[i]=(NUM,0) else: - raise PreprocError("Invalid define expression %r"%lst) + raise PreprocError('Invalid define expression %r'%lst) elif p==IDENT and v in defs: if isinstance(defs[v],str): a,b=extract_macro(defs[v]) @@ -221,17 +235,17 @@ def reduce_tokens(lst,defs,ban=[]): del lst[i] accu=to_add[:] reduce_tokens(accu,defs,ban+[v]) - for x in range(len(accu)): - lst.insert(i,accu[x]) + for tmp in accu: + lst.insert(i,tmp) i+=1 else: args=[] del lst[i] if i>=len(lst): - raise PreprocError("expected '(' after %r (got nothing)"%v) + raise PreprocError('expected ( after %r (got nothing)'%v) (p2,v2)=lst[i] if p2!=OP or v2!='(': - raise PreprocError("expected '(' after %r"%v) + raise PreprocError('expected ( after %r'%v) del lst[i] one_param=[] count_paren=0 @@ -243,18 +257,22 @@ def reduce_tokens(lst,defs,ban=[]): one_param.append((p2,v2)) count_paren+=1 elif v2==')': - if one_param:args.append(one_param) + if one_param: + args.append(one_param) break elif v2==',': - if not one_param:raise PreprocError("empty param in funcall %s"%v) + if not one_param: + raise 
PreprocError('empty param in funcall %r'%v) args.append(one_param) one_param=[] else: one_param.append((p2,v2)) else: one_param.append((p2,v2)) - if v2=='(':count_paren+=1 - elif v2==')':count_paren-=1 + if v2=='(': + count_paren+=1 + elif v2==')': + count_paren-=1 else: raise PreprocError('malformed macro') accu=[] @@ -287,7 +305,8 @@ def reduce_tokens(lst,defs,ban=[]): for x in args[pt-st+1:]: va_toks.extend(x) va_toks.append((OP,',')) - if va_toks:va_toks.pop() + if va_toks: + va_toks.pop() if len(accu)>1: (p3,v3)=accu[-1] (p4,v4)=accu[-2] @@ -314,15 +333,21 @@ def reduce_tokens(lst,defs,ban=[]): i+=1 def eval_macro(lst,defs): reduce_tokens(lst,defs,[]) - if not lst:raise PreprocError("missing tokens to evaluate") - (p,v)=reduce_eval(lst) + if not lst: + raise PreprocError('missing tokens to evaluate') + if lst: + p,v=lst[0] + if p==IDENT and v not in defs: + raise PreprocError('missing macro %r'%lst) + p,v=reduce_eval(lst) return int(v)!=0 def extract_macro(txt): t=tokenize(txt) if re_fun.search(txt): p,name=t[0] p,v=t[1] - if p!=OP:raise PreprocError("expected open parenthesis") + if p!=OP: + raise PreprocError('expected (') i=1 pindex=0 params={} @@ -338,27 +363,27 @@ def extract_macro(txt): elif p==OP and v==')': break else: - raise PreprocError("unexpected token (3)") + raise PreprocError('unexpected token (3)') elif prev==IDENT: if p==OP and v==',': prev=v elif p==OP and v==')': break else: - raise PreprocError("comma or ... expected") + raise PreprocError('comma or ... expected') elif prev==',': if p==IDENT: params[v]=pindex pindex+=1 prev=p elif p==OP and v=='...': - raise PreprocError("not implemented (1)") + raise PreprocError('not implemented (1)') else: - raise PreprocError("comma or ... expected (2)") + raise PreprocError('comma or ... 
expected (2)') elif prev=='...': - raise PreprocError("not implemented (2)") + raise PreprocError('not implemented (2)') else: - raise PreprocError("unexpected else") + raise PreprocError('unexpected else') return(name,[params,t[i+1:]]) else: (p,v)=t[0] @@ -366,16 +391,16 @@ def extract_macro(txt): return(v,[[],t[1:]]) else: return(v,[[],[('T','')]]) -re_include=re.compile('^\s*(<(?P<a>.*)>|"(?P<b>.*)")') +re_include=re.compile('^\s*(<(?:.*)>|"(?:.*)")') def extract_include(txt,defs): m=re_include.search(txt) if m: - if m.group('a'):return'<',m.group('a') - if m.group('b'):return'"',m.group('b') + txt=m.group(1) + return txt[0],txt[1:-1] toks=tokenize(txt) reduce_tokens(toks,defs,['waf_include']) if not toks: - raise PreprocError("could not parse include %s"%txt) + raise PreprocError('could not parse include %r'%txt) if len(toks)==1: if toks[0][0]==STR: return'"',toks[0][1] @@ -383,26 +408,30 @@ def extract_include(txt,defs): if toks[0][1]=='<'and toks[-1][1]=='>': ret='<',stringize(toks).lstrip('<').rstrip('>') return ret - raise PreprocError("could not parse include %s."%txt) + raise PreprocError('could not parse include %r'%txt) def parse_char(txt): - if not txt:raise PreprocError("attempted to parse a null char") + if not txt: + raise PreprocError('attempted to parse a null char') if txt[0]!='\\': return ord(txt) c=txt[1] if c=='x': - if len(txt)==4 and txt[3]in string.hexdigits:return int(txt[2:],16) + if len(txt)==4 and txt[3]in string.hexdigits: + return int(txt[2:],16) return int(txt[2:],16) elif c.isdigit(): - if c=='0'and len(txt)==2:return 0 + if c=='0'and len(txt)==2: + return 0 for i in 3,2,1: if len(txt)>i and txt[1:1+i].isdigit(): return(1+i,int(txt[1:1+i],8)) else: - try:return chr_esc[c] - except KeyError:raise PreprocError("could not parse char literal '%s'"%txt) + try: + return chr_esc[c] + except KeyError: + raise PreprocError('could not parse char literal %r'%txt) def tokenize(s): return tokenize_private(s)[:] -@Utils.run_once def 
tokenize_private(s): ret=[] for match in re_clexer.finditer(s): @@ -411,35 +440,49 @@ def tokenize_private(s): v=m(name) if v: if name==IDENT: - try: - g_optrans[v] + if v in g_optrans: name=OP - except KeyError: - if v.lower()=="true": - v=1 - name=NUM - elif v.lower()=="false": - v=0 - name=NUM + elif v.lower()=="true": + v=1 + name=NUM + elif v.lower()=="false": + v=0 + name=NUM elif name==NUM: - if m('oct'):v=int(v,8) - elif m('hex'):v=int(m('hex'),16) - elif m('n0'):v=m('n0') + if m('oct'): + v=int(v,8) + elif m('hex'): + v=int(m('hex'),16) + elif m('n0'): + v=m('n0') else: v=m('char') - if v:v=parse_char(v) - else:v=m('n2')or m('n4') + if v: + v=parse_char(v) + else: + v=m('n2')or m('n4') elif name==OP: - if v=='%:':v='#' - elif v=='%:%:':v='##' + if v=='%:': + v='#' + elif v=='%:%:': + v='##' elif name==STR: v=v[1:-1] ret.append((name,v)) break return ret -@Utils.run_once -def define_name(line): - return re_mac.match(line).group(0) +def format_defines(lst): + ret=[] + for y in lst: + if y: + pos=y.find('=') + if pos==-1: + ret.append(y) + elif pos>0: + ret.append('%s %s'%(y[:pos],y[pos+1:])) + else: + raise ValueError('Invalid define expression %r'%y) + return ret class c_parser(object): def __init__(self,nodepaths=None,defines=None): self.lines=[] @@ -454,15 +497,16 @@ class c_parser(object): self.nodes=[] self.names=[] self.curfile='' - self.ban_includes=set([]) + self.ban_includes=set() + self.listed=set() def cached_find_resource(self,node,filename): try: - nd=node.ctx.cache_nd + cache=node.ctx.preproc_cache_node except AttributeError: - nd=node.ctx.cache_nd={} - tup=(node,filename) + cache=node.ctx.preproc_cache_node=Utils.lru_cache(FILE_CACHE_SIZE) + key=(node,filename) try: - return nd[tup] + return cache[key] except KeyError: ret=node.find_resource(filename) if ret: @@ -472,68 +516,82 @@ class c_parser(object): tmp=node.ctx.srcnode.search_node(ret.path_from(node.ctx.bldnode)) if tmp and getattr(tmp,'children',None): ret=None - nd[tup]=ret + 
cache[key]=ret return ret - def tryfind(self,filename): + def tryfind(self,filename,kind='"',env=None): if filename.endswith('.moc'): self.names.append(filename) return None self.curfile=filename - found=self.cached_find_resource(self.currentnode_stack[-1],filename) - for n in self.nodepaths: - if found: - break - found=self.cached_find_resource(n,filename) + found=None + if kind=='"': + if env.MSVC_VERSION: + for n in reversed(self.currentnode_stack): + found=self.cached_find_resource(n,filename) + if found: + break + else: + found=self.cached_find_resource(self.currentnode_stack[-1],filename) + if not found: + for n in self.nodepaths: + found=self.cached_find_resource(n,filename) + if found: + break + listed=self.listed if found and not found in self.ban_includes: - self.nodes.append(found) + if found not in listed: + listed.add(found) + self.nodes.append(found) self.addlines(found) else: - if not filename in self.names: + if filename not in listed: + listed.add(filename) self.names.append(filename) return found + def filter_comments(self,node): + code=node.read() + if use_trigraphs: + for(a,b)in trig_def: + code=code.split(a).join(b) + code=re_nl.sub('',code) + code=re_cpp.sub(repl,code) + return re_lines.findall(code) + def parse_lines(self,node): + try: + cache=node.ctx.preproc_cache_lines + except AttributeError: + cache=node.ctx.preproc_cache_lines=Utils.lru_cache(LINE_CACHE_SIZE) + try: + return cache[node] + except KeyError: + cache[node]=lines=self.filter_comments(node) + lines.append((POPFILE,'')) + lines.reverse() + return lines def addlines(self,node): self.currentnode_stack.append(node.parent) - filepath=node.abspath() self.count_files+=1 if self.count_files>recursion_limit: - raise PreprocError("recursion limit exceeded") - pc=self.parse_cache - debug('preproc: reading file %r',filepath) + raise PreprocError('recursion limit exceeded') + if Logs.verbose: + Logs.debug('preproc: reading file %r',node) try: - lns=pc[filepath] - except KeyError: - pass - 
else: - self.lines.extend(lns) - return - try: - lines=filter_comments(filepath) - lines.append((POPFILE,'')) - lines.reverse() - pc[filepath]=lines - self.lines.extend(lines) - except IOError: - raise PreprocError("could not read the file %s"%filepath) + lines=self.parse_lines(node) + except EnvironmentError: + raise PreprocError('could not read the file %r'%node) except Exception: if Logs.verbose>0: - error("parsing %s failed"%filepath) - traceback.print_exc() + Logs.error('parsing %r failed %s',node,traceback.format_exc()) + else: + self.lines.extend(lines) def start(self,node,env): - debug('preproc: scanning %s (in %s)',node.name,node.parent.name) - bld=node.ctx - try: - self.parse_cache=bld.parse_cache - except AttributeError: - self.parse_cache=bld.parse_cache={} + Logs.debug('preproc: scanning %s (in %s)',node.name,node.parent.name) self.current_file=node self.addlines(node) - if env['DEFINES']: - try: - lst=['%s %s'%(x[0],trimquotes('='.join(x[1:])))for x in[y.split('=')for y in env['DEFINES']]] - lst.reverse() - self.lines.extend([('define',x)for x in lst]) - except AttributeError: - pass + if env.DEFINES: + lst=format_defines(env.DEFINES) + lst.reverse() + self.lines.extend([('define',x)for x in lst]) while self.lines: (token,line)=self.lines.pop() if token==POPFILE: @@ -541,8 +599,6 @@ class c_parser(object): self.currentnode_stack.pop() continue try: - ve=Logs.verbose - if ve:debug('preproc: line is %s - %s state is %s',token,line,self.state) state=self.state if token[:2]=='if': state.append(undefined) @@ -553,23 +609,27 @@ class c_parser(object): continue if token=='if': ret=eval_macro(tokenize(line),self.defs) - if ret:state[-1]=accepted - else:state[-1]=ignored + if ret: + state[-1]=accepted + else: + state[-1]=ignored elif token=='ifdef': m=re_mac.match(line) - if m and m.group(0)in self.defs:state[-1]=accepted - else:state[-1]=ignored + if m and m.group()in self.defs: + state[-1]=accepted + else: + state[-1]=ignored elif token=='ifndef': 
m=re_mac.match(line) - if m and m.group(0)in self.defs:state[-1]=ignored - else:state[-1]=accepted + if m and m.group()in self.defs: + state[-1]=ignored + else: + state[-1]=accepted elif token=='include'or token=='import': (kind,inc)=extract_include(line,self.defs) - if ve:debug('preproc: include found %s (%s) ',inc,kind) - if kind=='"'or not strict_quotes: - self.current_file=self.tryfind(inc) - if token=='import': - self.ban_includes.add(self.current_file) + self.current_file=self.tryfind(inc,kind,env) + if token=='import': + self.ban_includes.add(self.current_file) elif token=='elif': if state[-1]==accepted: state[-1]=skipped @@ -577,25 +637,28 @@ class c_parser(object): if eval_macro(tokenize(line),self.defs): state[-1]=accepted elif token=='else': - if state[-1]==accepted:state[-1]=skipped - elif state[-1]==ignored:state[-1]=accepted + if state[-1]==accepted: + state[-1]=skipped + elif state[-1]==ignored: + state[-1]=accepted elif token=='define': try: - self.defs[define_name(line)]=line - except Exception: - raise PreprocError("Invalid define line %s"%line) + self.defs[self.define_name(line)]=line + except AttributeError: + raise PreprocError('Invalid define line %r'%line) elif token=='undef': m=re_mac.match(line) - if m and m.group(0)in self.defs: - self.defs.__delitem__(m.group(0)) + if m and m.group()in self.defs: + self.defs.__delitem__(m.group()) elif token=='pragma': if re_pragma_once.match(line.lower()): self.ban_includes.add(self.current_file) - except Exception ,e: + except Exception as e: if Logs.verbose: - debug('preproc: line parsing failed (%s): %s %s',e,line,Utils.ex_stack()) + Logs.debug('preproc: line parsing failed (%s): %s %s',e,line,traceback.format_exc()) + def define_name(self,line): + return re_mac.match(line).group() def scan(task): - global go_absolute try: incn=task.generator.includes_nodes except AttributeError: @@ -606,6 +669,4 @@ def scan(task): nodepaths=[x for x in incn if x.is_child_of(x.ctx.srcnode)or 
x.is_child_of(x.ctx.bldnode)] tmp=c_parser(nodepaths) tmp.start(task.inputs[0],task.env) - if Logs.verbose: - debug('deps: deps for %r: %r; unresolved %r'%(task.inputs,tmp.nodes,tmp.names)) return(tmp.nodes,tmp.names) diff --git a/waflib/Tools/c_tests.py b/waflib/Tools/c_tests.py index 7791f23..30b9f38 100644 --- a/waflib/Tools/c_tests.py +++ b/waflib/Tools/c_tests.py @@ -47,7 +47,7 @@ def check_library(self,mode=None,test_exec=True): mode='c' if self.env.CXX: mode='cxx' - self.check(compile_filename=[],features='link_lib_test',msg='Checking for libraries',mode=mode,test_exec=test_exec,) + self.check(compile_filename=[],features='link_lib_test',msg='Checking for libraries',mode=mode,test_exec=test_exec) INLINE_CODE=''' typedef int foo_t; static %s foo_t static_foo () {return 0; } @@ -132,7 +132,7 @@ extern int foo; class grep_for_endianness(Task.Task): color='PINK' def run(self): - txt=self.inputs[0].read(flags='rb').decode('iso8859-1') + txt=self.inputs[0].read(flags='rb').decode('latin-1') if txt.find('LiTTleEnDian')>-1: self.generator.tmp.append('little') elif txt.find('BIGenDianSyS')>-1: @@ -148,5 +148,5 @@ def check_endianness(self): tmp=[] def check_msg(self): return tmp[0] - self.check(fragment=ENDIAN_FRAGMENT,features='c grep_for_endianness',msg="Checking for endianness",define='ENDIANNESS',tmp=tmp,okmsg=check_msg) + self.check(fragment=ENDIAN_FRAGMENT,features='c grep_for_endianness',msg='Checking for endianness',define='ENDIANNESS',tmp=tmp,okmsg=check_msg) return tmp[0] diff --git a/waflib/Tools/ccroot.py b/waflib/Tools/ccroot.py index 498a0ab..9d865bf 100644 --- a/waflib/Tools/ccroot.py +++ b/waflib/Tools/ccroot.py @@ -3,7 +3,7 @@ # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file import os,re -from waflib import Task,Utils,Node,Errors +from waflib import Task,Utils,Node,Errors,Logs from waflib.TaskGen import after_method,before_method,feature,taskgen_method,extension from waflib.Tools import c_aliases,c_preproc,c_config,c_osx,c_tests from waflib.Configure import conf @@ -32,7 +32,7 @@ def create_compiled_task(self,name,node): @taskgen_method def to_incnodes(self,inlst): lst=[] - seen=set([]) + seen=set() for x in self.to_list(inlst): if x in seen or not x: continue @@ -57,15 +57,21 @@ def to_incnodes(self,inlst): @feature('c','cxx','d','asm','fc','includes') @after_method('propagate_uselib_vars','process_source') def apply_incpaths(self): - lst=self.to_incnodes(self.to_list(getattr(self,'includes',[]))+self.env['INCLUDES']) + lst=self.to_incnodes(self.to_list(getattr(self,'includes',[]))+self.env.INCLUDES) self.includes_nodes=lst - self.env['INCPATHS']=[x.abspath()for x in lst] + cwd=self.get_cwd() + self.env.INCPATHS=[x.path_from(cwd)for x in lst] class link_task(Task.Task): color='YELLOW' + weight=3 inst_to=None chmod=Utils.O755 def add_target(self,target): if isinstance(target,str): + base=self.generator.path + if target.startswith('#'): + target=target[1:] + base=self.generator.bld.bldnode pattern=self.env[self.__class__.__name__+'_PATTERN'] if not pattern: pattern='%s' @@ -82,19 +88,52 @@ class link_task(Task.Task): tmp=folder+os.sep+pattern%name else: tmp=pattern%name - target=self.generator.path.find_or_declare(tmp) + target=base.find_or_declare(tmp) self.set_outputs(target) + def exec_command(self,*k,**kw): + ret=super(link_task,self).exec_command(*k,**kw) + if not ret and self.env.DO_MANIFEST: + ret=self.exec_mf() + return ret + def exec_mf(self): + if not self.env.MT: + return 0 + manifest=None + for out_node in self.outputs: + if out_node.name.endswith('.manifest'): + manifest=out_node.abspath() + break + else: + return 0 + mode='' + for x in 
Utils.to_list(self.generator.features): + if x in('cprogram','cxxprogram','fcprogram','fcprogram_test'): + mode=1 + elif x in('cshlib','cxxshlib','fcshlib'): + mode=2 + Logs.debug('msvc: embedding manifest in mode %r',mode) + lst=[]+self.env.MT + lst.extend(Utils.to_list(self.env.MTFLAGS)) + lst.extend(['-manifest',manifest]) + lst.append('-outputresource:%s;%s'%(self.outputs[0].abspath(),mode)) + return super(link_task,self).exec_command(lst) class stlink_task(link_task): run_str='${AR} ${ARFLAGS} ${AR_TGT_F}${TGT} ${AR_SRC_F}${SRC}' chmod=Utils.O644 def rm_tgt(cls): old=cls.run def wrap(self): - try:os.remove(self.outputs[0].abspath()) - except OSError:pass + try: + os.remove(self.outputs[0].abspath()) + except OSError: + pass return old(self) setattr(cls,'run',wrap) rm_tgt(stlink_task) +@feature('skip_stlib_link_deps') +@before_method('process_use') +def apply_skip_stlib_link_deps(self): + self.env.SKIP_STLIB_LINK_DEPS=True @feature('c','cxx','d','fc','asm') @after_method('process_source') def apply_link(self): @@ -115,9 +154,9 @@ def apply_link(self): try: inst_to=self.install_path except AttributeError: - inst_to=self.link_task.__class__.inst_to + inst_to=self.link_task.inst_to if inst_to: - self.install_task=self.bld.install_files(inst_to,self.link_task.outputs[:],env=self.env,chmod=self.link_task.chmod,task=self.link_task) + self.install_task=self.add_install_files(install_to=inst_to,install_from=self.link_task.outputs[:],chmod=self.link_task.chmod,task=self.link_task) @taskgen_method def use_rec(self,name,**kw): if name in self.tmp_use_not or name in self.tmp_use_seen: @@ -156,7 +195,7 @@ def use_rec(self,name,**kw): @before_method('apply_incpaths','propagate_uselib_vars') @after_method('apply_link','process_source') def process_use(self): - use_not=self.tmp_use_not=set([]) + use_not=self.tmp_use_not=set() self.tmp_use_seen=[] use_prec=self.tmp_use_prec={} self.uselib=self.to_list(getattr(self,'uselib',[])) @@ -167,7 +206,7 @@ def process_use(self): for x 
in use_not: if x in use_prec: del use_prec[x] - out=[] + out=self.tmp_use_sorted=[] tmp=[] for x in self.tmp_use_seen: for k in use_prec.values(): @@ -198,16 +237,18 @@ def process_use(self): y=self.bld.get_tgen_by_name(x) var=y.tmp_use_var if var and link_task: - if var=='LIB'or y.tmp_use_stlib or x in names: + if self.env.SKIP_STLIB_LINK_DEPS and isinstance(link_task,stlink_task): + pass + elif var=='LIB'or y.tmp_use_stlib or x in names: self.env.append_value(var,[y.target[y.target.rfind(os.sep)+1:]]) self.link_task.dep_nodes.extend(y.link_task.outputs) - tmp_path=y.link_task.outputs[0].parent.path_from(self.bld.bldnode) + tmp_path=y.link_task.outputs[0].parent.path_from(self.get_cwd()) self.env.append_unique(var+'PATH',[tmp_path]) else: if y.tmp_use_objects: self.add_objects_from_tgen(y) if getattr(y,'export_includes',None): - self.includes.extend(y.to_incnodes(y.export_includes)) + self.includes=self.includes+y.to_incnodes(y.export_includes) if getattr(y,'export_defines',None): self.env.append_value('DEFINES',self.to_list(y.export_defines)) for x in names: @@ -236,7 +277,7 @@ def add_objects_from_tgen(self,tg): link_task.inputs.append(x) @taskgen_method def get_uselib_vars(self): - _vars=set([]) + _vars=set() for x in self.features: if x in USELIB_VARS: _vars|=USELIB_VARS[x] @@ -267,16 +308,16 @@ def apply_implib(self): name=self.target.name else: name=os.path.split(self.target)[1] - implib=self.env['implib_PATTERN']%name + implib=self.env.implib_PATTERN%name implib=dll.parent.find_or_declare(implib) - self.env.append_value('LINKFLAGS',self.env['IMPLIB_ST']%implib.bldpath()) + self.env.append_value('LINKFLAGS',self.env.IMPLIB_ST%implib.bldpath()) self.link_task.outputs.append(implib) if getattr(self,'defs',None)and self.env.DEST_BINFMT=='pe': node=self.path.find_resource(self.defs) if not node: raise Errors.WafError('invalid def file %r'%self.defs) - if'msvc'in(self.env.CC_NAME,self.env.CXX_NAME): - 
self.env.append_value('LINKFLAGS','/def:%s'%node.path_from(self.bld.bldnode)) + if self.env.def_PATTERN: + self.env.append_value('LINKFLAGS',self.env.def_PATTERN%node.path_from(self.get_cwd())) self.link_task.dep_nodes.append(node) else: self.link_task.inputs.append(node) @@ -288,10 +329,10 @@ def apply_implib(self): inst_to=self.install_path except AttributeError: inst_to='${IMPLIBDIR}' - self.install_task.dest='${BINDIR}' + self.install_task.install_to='${BINDIR}' if not self.env.IMPLIBDIR: self.env.IMPLIBDIR=self.env.LIBDIR - self.implib_install_task=self.bld.install_files(inst_to,implib,env=self.env,chmod=self.link_task.chmod,task=self.link_task) + self.implib_install_task=self.add_install_files(install_to=inst_to,install_from=implib,chmod=self.link_task.chmod,task=self.link_task) re_vnum=re.compile('^([1-9]\\d*|0)([.]([1-9]\\d*|0)){0,2}?$') @feature('cshlib','cxxshlib','dshlib','fcshlib','vnum') @after_method('apply_link','propagate_uselib_vars') @@ -318,31 +359,31 @@ def apply_vnum(self): v=self.env.SONAME_ST%name2 self.env.append_value('LINKFLAGS',v.split()) if self.env.DEST_OS!='openbsd': - outs=[node.parent.find_or_declare(name3)] + outs=[node.parent.make_node(name3)] if name2!=name3: - outs.append(node.parent.find_or_declare(name2)) + outs.append(node.parent.make_node(name2)) self.create_task('vnum',node,outs) if getattr(self,'install_task',None): - self.install_task.hasrun=Task.SKIP_ME - bld=self.bld - path=self.install_task.dest + self.install_task.hasrun=Task.SKIPPED + self.install_task.no_errcheck_out=True + path=self.install_task.install_to if self.env.DEST_OS=='openbsd': libname=self.link_task.outputs[0].name - t1=bld.install_as('%s%s%s'%(path,os.sep,libname),node,env=self.env,chmod=self.link_task.chmod) + t1=self.add_install_as(install_to='%s/%s'%(path,libname),install_from=node,chmod=self.link_task.chmod) self.vnum_install_task=(t1,) else: - t1=bld.install_as(path+os.sep+name3,node,env=self.env,chmod=self.link_task.chmod) - 
t3=bld.symlink_as(path+os.sep+libname,name3) + t1=self.add_install_as(install_to=path+os.sep+name3,install_from=node,chmod=self.link_task.chmod) + t3=self.add_symlink_as(install_to=path+os.sep+libname,install_from=name3) if name2!=name3: - t2=bld.symlink_as(path+os.sep+name2,name3) + t2=self.add_symlink_as(install_to=path+os.sep+name2,install_from=name3) self.vnum_install_task=(t1,t2,t3) else: self.vnum_install_task=(t1,t3) - if'-dynamiclib'in self.env['LINKFLAGS']: + if'-dynamiclib'in self.env.LINKFLAGS: try: inst_to=self.install_path except AttributeError: - inst_to=self.link_task.__class__.inst_to + inst_to=self.link_task.inst_to if inst_to: p=Utils.subst_vars(inst_to,self.env) path=os.path.join(p,name2) @@ -351,7 +392,6 @@ def apply_vnum(self): self.env.append_value('LINKFLAGS','-Wl,-current_version,%s'%self.vnum) class vnum(Task.Task): color='CYAN' - quient=True ext_in=['.bin'] def keyword(self): return'Symlinking' @@ -371,16 +411,12 @@ class fake_shlib(link_task): for t in self.run_after: if not t.hasrun: return Task.ASK_LATER - for x in self.outputs: - x.sig=Utils.h_file(x.abspath()) return Task.SKIP_ME class fake_stlib(stlink_task): def runnable_status(self): for t in self.run_after: if not t.hasrun: return Task.ASK_LATER - for x in self.outputs: - x.sig=Utils.h_file(x.abspath()) return Task.SKIP_ME @conf def read_shlib(self,name,paths=[],export_includes=[],export_defines=[]): @@ -401,7 +437,10 @@ def process_lib(self): for y in names: node=x.find_node(y) if node: - node.sig=Utils.h_file(node.abspath()) + try: + Utils.h_file(node.abspath()) + except EnvironmentError: + raise ValueError('Could not read %r'%y) break else: continue diff --git a/waflib/Tools/compiler_c.py b/waflib/Tools/compiler_c.py index 3d7f34e..ee607be 100644 --- a/waflib/Tools/compiler_c.py +++ b/waflib/Tools/compiler_c.py @@ -6,28 +6,32 @@ import re from waflib.Tools import ccroot from waflib import Utils from waflib.Logs import debug 
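The compiler_c/compiler_cxx/compiler_d/compiler_fc hunks that follow all adopt the same pattern: each candidate compiler is probed inside an environment "transaction" (`conf.env.stash()` before the probe, `revert()` on failure, `commit()` on success), and dict-style access (`self.env['CC']`) is replaced by attribute access (`self.env.CC`). A minimal sketch of that transaction pattern, using a simplified stand-in class rather than the real `waflib.ConfigSet.ConfigSet`:

```python
class MiniEnv(object):
    """Attribute-style config storage with stash/revert/commit, loosely
    modelled on waf's ConfigSet (simplified stand-in, not the real class)."""
    def __init__(self):
        self.__dict__['table'] = {}
        self.__dict__['undo_stack'] = []

    def __getattr__(self, name):
        # Missing keys read as an empty string, mirroring ConfigSet's
        # "empty by default" behaviour.
        return self.table.get(name, '')

    def __setattr__(self, name, value):
        self.table[name] = value

    def stash(self):
        # Snapshot the current state so a failed probe can be rolled back.
        self.undo_stack.append(dict(self.table))

    def revert(self):
        # Discard everything set since the matching stash().
        self.table.clear()
        self.table.update(self.undo_stack.pop())

    def commit(self):
        # Keep the changes; just drop the snapshot.
        self.undo_stack.pop()

env = MiniEnv()
for compiler in ('bogus-cc', 'gcc'):
    env.stash()
    env.CC = compiler
    if compiler == 'bogus-cc':   # pretend this probe failed
        env.revert()
    else:
        env.commit()
        break
```

The point of the diff's change is the `commit()`/trailing `revert()` pair: earlier versions stashed but only reverted on exceptions, so a probe that loaded but left `conf.env.CC` unset could leak partial settings into the next candidate's probe.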
-c_compiler={'win32':['msvc','gcc','clang'],'cygwin':['gcc'],'darwin':['clang','gcc'],'aix':['xlc','gcc','clang'],'linux':['gcc','clang','icc'],'sunos':['suncc','gcc'],'irix':['gcc','irixcc'],'hpux':['gcc'],'osf1V':['gcc'],'gnu':['gcc','clang'],'java':['gcc','msvc','clang','icc'],'default':['gcc','clang'],} +c_compiler={'win32':['msvc','gcc','clang'],'cygwin':['gcc'],'darwin':['clang','gcc'],'aix':['xlc','gcc','clang'],'linux':['gcc','clang','icc'],'sunos':['suncc','gcc'],'irix':['gcc','irixcc'],'hpux':['gcc'],'osf1V':['gcc'],'gnu':['gcc','clang'],'java':['gcc','msvc','clang','icc'],'default':['clang','gcc'],} def default_compilers(): build_platform=Utils.unversioned_sys_platform() possible_compiler_list=c_compiler.get(build_platform,c_compiler['default']) return' '.join(possible_compiler_list) def configure(conf): - try:test_for_compiler=conf.options.check_c_compiler or default_compilers() - except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_c')") + try: + test_for_compiler=conf.options.check_c_compiler or default_compilers() + except AttributeError: + conf.fatal("Add options(opt): opt.load('compiler_c')") for compiler in re.split('[ ,]+',test_for_compiler): conf.env.stash() conf.start_msg('Checking for %r (C compiler)'%compiler) try: conf.load(compiler) - except conf.errors.ConfigurationError ,e: + except conf.errors.ConfigurationError as e: conf.env.revert() conf.end_msg(False) - debug('compiler_c: %r'%e) + debug('compiler_c: %r',e) else: - if conf.env['CC']: + if conf.env.CC: conf.end_msg(conf.env.get_flat('CC')) - conf.env['COMPILER_CC']=compiler + conf.env.COMPILER_CC=compiler + conf.env.commit() break + conf.env.revert() conf.end_msg(False) else: conf.fatal('could not configure a C compiler!') diff --git a/waflib/Tools/compiler_cxx.py b/waflib/Tools/compiler_cxx.py index eb5d8da..cbd267f 100644 --- a/waflib/Tools/compiler_cxx.py +++ b/waflib/Tools/compiler_cxx.py @@ -6,28 +6,32 @@ import re from waflib.Tools import ccroot from waflib 
import Utils from waflib.Logs import debug -cxx_compiler={'win32':['msvc','g++','clang++'],'cygwin':['g++'],'darwin':['clang++','g++'],'aix':['xlc++','g++','clang++'],'linux':['g++','clang++','icpc'],'sunos':['sunc++','g++'],'irix':['g++'],'hpux':['g++'],'osf1V':['g++'],'gnu':['g++','clang++'],'java':['g++','msvc','clang++','icpc'],'default':['g++','clang++']} +cxx_compiler={'win32':['msvc','g++','clang++'],'cygwin':['g++'],'darwin':['clang++','g++'],'aix':['xlc++','g++','clang++'],'linux':['g++','clang++','icpc'],'sunos':['sunc++','g++'],'irix':['g++'],'hpux':['g++'],'osf1V':['g++'],'gnu':['g++','clang++'],'java':['g++','msvc','clang++','icpc'],'default':['clang++','g++']} def default_compilers(): build_platform=Utils.unversioned_sys_platform() possible_compiler_list=cxx_compiler.get(build_platform,cxx_compiler['default']) return' '.join(possible_compiler_list) def configure(conf): - try:test_for_compiler=conf.options.check_cxx_compiler or default_compilers() - except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_cxx')") + try: + test_for_compiler=conf.options.check_cxx_compiler or default_compilers() + except AttributeError: + conf.fatal("Add options(opt): opt.load('compiler_cxx')") for compiler in re.split('[ ,]+',test_for_compiler): conf.env.stash() conf.start_msg('Checking for %r (C++ compiler)'%compiler) try: conf.load(compiler) - except conf.errors.ConfigurationError ,e: + except conf.errors.ConfigurationError as e: conf.env.revert() conf.end_msg(False) - debug('compiler_cxx: %r'%e) + debug('compiler_cxx: %r',e) else: - if conf.env['CXX']: + if conf.env.CXX: conf.end_msg(conf.env.get_flat('CXX')) - conf.env['COMPILER_CXX']=compiler + conf.env.COMPILER_CXX=compiler + conf.env.commit() break + conf.env.revert() conf.end_msg(False) else: conf.fatal('could not configure a C++ compiler!') diff --git a/waflib/Tools/compiler_d.py b/waflib/Tools/compiler_d.py index 0d81675..2ca7e26 100644 --- a/waflib/Tools/compiler_d.py +++ 
b/waflib/Tools/compiler_d.py @@ -10,22 +10,26 @@ def default_compilers(): possible_compiler_list=d_compiler.get(build_platform,d_compiler['default']) return' '.join(possible_compiler_list) def configure(conf): - try:test_for_compiler=conf.options.check_d_compiler or default_compilers() - except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_d')") + try: + test_for_compiler=conf.options.check_d_compiler or default_compilers() + except AttributeError: + conf.fatal("Add options(opt): opt.load('compiler_d')") for compiler in re.split('[ ,]+',test_for_compiler): conf.env.stash() conf.start_msg('Checking for %r (D compiler)'%compiler) try: conf.load(compiler) - except conf.errors.ConfigurationError ,e: + except conf.errors.ConfigurationError as e: conf.env.revert() conf.end_msg(False) - Logs.debug('compiler_d: %r'%e) + Logs.debug('compiler_d: %r',e) else: if conf.env.D: conf.end_msg(conf.env.get_flat('D')) - conf.env['COMPILER_D']=compiler + conf.env.COMPILER_D=compiler + conf.env.commit() break + conf.env.revert() conf.end_msg(False) else: conf.fatal('could not configure a D compiler!') diff --git a/waflib/Tools/compiler_fc.py b/waflib/Tools/compiler_fc.py index 2cdeeb4..8b23a2b 100644 --- a/waflib/Tools/compiler_fc.py +++ b/waflib/Tools/compiler_fc.py @@ -11,22 +11,26 @@ def default_compilers(): possible_compiler_list=fc_compiler.get(build_platform,fc_compiler['default']) return' '.join(possible_compiler_list) def configure(conf): - try:test_for_compiler=conf.options.check_fortran_compiler or default_compilers() - except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_fc')") + try: + test_for_compiler=conf.options.check_fortran_compiler or default_compilers() + except AttributeError: + conf.fatal("Add options(opt): opt.load('compiler_fc')") for compiler in re.split('[ ,]+',test_for_compiler): conf.env.stash() conf.start_msg('Checking for %r (Fortran compiler)'%compiler) try: conf.load(compiler) - except conf.errors.ConfigurationError 
,e: + except conf.errors.ConfigurationError as e: conf.env.revert() conf.end_msg(False) - Logs.debug('compiler_fortran: %r'%e) + Logs.debug('compiler_fortran: %r',e) else: - if conf.env['FC']: + if conf.env.FC: conf.end_msg(conf.env.get_flat('FC')) conf.env.COMPILER_FORTRAN=compiler + conf.env.commit() break + conf.env.revert() conf.end_msg(False) else: conf.fatal('could not configure a Fortran compiler!') diff --git a/waflib/Tools/cs.py b/waflib/Tools/cs.py index 8f1589a..df73c94 100644 --- a/waflib/Tools/cs.py +++ b/waflib/Tools/cs.py @@ -6,7 +6,6 @@ from waflib import Utils,Task,Options,Errors from waflib.TaskGen import before_method,after_method,feature from waflib.Tools import ccroot from waflib.Configure import conf -import os,tempfile ccroot.USELIB_VARS['cs']=set(['CSFLAGS','ASSEMBLIES','RESOURCES']) ccroot.lib_patterns['csshlib']=['%s'] @feature('cs') @@ -28,7 +27,7 @@ def apply_cs(self): inst_to=getattr(self,'install_path',bintype=='exe'and'${BINDIR}'or'${LIBDIR}') if inst_to: mod=getattr(self,'chmod',bintype=='exe'and Utils.O755 or Utils.O644) - self.install_task=self.bld.install_files(inst_to,self.cs_task.outputs[:],env=self.env,chmod=mod) + self.install_task=self.add_install_files(install_to=inst_to,install_from=self.cs_task.outputs[:],chmod=mod) @feature('cs') @after_method('apply_cs') def use_cs(self): @@ -59,10 +58,8 @@ def debug_cs(self): else: out=node.change_ext('.pdb') self.cs_task.outputs.append(out) - try: - self.install_task.source.append(out) - except AttributeError: - pass + if getattr(self,'install_task',None): + self.pdb_install_task=self.add_install_files(install_to=self.install_task.install_to,install_from=out) if csdebug=='pdbonly': val=['/debug+','/debug:pdbonly'] elif csdebug=='full': @@ -70,44 +67,30 @@ def debug_cs(self): else: val=['/debug-'] self.env.append_value('CSFLAGS',val) +@feature('cs') +@after_method('debug_cs') +def doc_cs(self): + csdoc=getattr(self,'csdoc',self.env.CSDOC) + if not csdoc: + return + 
node=self.cs_task.outputs[0] + out=node.change_ext('.xml') + self.cs_task.outputs.append(out) + if getattr(self,'install_task',None): + self.doc_install_task=self.add_install_files(install_to=self.install_task.install_to,install_from=out) + self.env.append_value('CSFLAGS','/doc:%s'%out.abspath()) class mcs(Task.Task): color='YELLOW' run_str='${MCS} ${CSTYPE} ${CSFLAGS} ${ASS_ST:ASSEMBLIES} ${RES_ST:RESOURCES} ${OUT} ${SRC}' - def exec_command(self,cmd,**kw): - bld=self.generator.bld - try: - if not kw.get('cwd',None): - kw['cwd']=bld.cwd - except AttributeError: - bld.cwd=kw['cwd']=bld.variant_dir - try: - tmp=None - if isinstance(cmd,list)and len(' '.join(cmd))>=8192: - program=cmd[0] - cmd=[self.quote_response_command(x)for x in cmd] - (fd,tmp)=tempfile.mkstemp() - os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:])) - os.close(fd) - cmd=[program,'@'+tmp] - ret=self.generator.bld.exec_command(cmd,**kw) - finally: - if tmp: - try: - os.remove(tmp) - except OSError: - pass - return ret - def quote_response_command(self,flag): - if flag.lower()=='/noconfig': - return'' - if flag.find(' ')>-1: - for x in('/r:','/reference:','/resource:','/lib:','/out:'): - if flag.startswith(x): - flag='%s"%s"'%(x,'","'.join(flag[len(x):].split(','))) - break + def split_argfile(self,cmd): + inline=[cmd[0]] + infile=[] + for x in cmd[1:]: + if x.lower()=='/noconfig': + inline.append(x) else: - flag='"%s"'%flag - return flag + infile.append(self.quote_flag(x)) + return(inline,infile) def configure(conf): csc=getattr(Options.options,'cscbinary',None) if csc: @@ -124,8 +107,6 @@ class fake_csshlib(Task.Task): color='YELLOW' inst_to=None def runnable_status(self): - for x in self.outputs: - x.sig=Utils.h_file(x.abspath()) return Task.SKIP_ME @conf def read_csshlib(self,name,paths=[]): diff --git a/waflib/Tools/cxx.py b/waflib/Tools/cxx.py index 6f039e9..e63ad8b 100644 --- a/waflib/Tools/cxx.py +++ b/waflib/Tools/cxx.py @@ -11,7 +11,7 @@ def cxx_hook(self,node): if not'.c'in 
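The cs.py hunk above drops the bespoke `exec_command`/response-file code in `mcs` and replaces it with a `split_argfile` method: the program name and `/noconfig` stay on the command line, everything else is quoted and diverted to an argument file. A standalone sketch of that split (the `quote` callable stands in for waf's `quote_flag`):

```python
def split_argfile(cmd, quote=lambda s: s):
    """Split a long command into (inline, infile) parts, mirroring the
    mcs.split_argfile logic in the hunk above: argv[0] and /noconfig
    remain inline, all other flags go into a response file."""
    inline = [cmd[0]]
    infile = []
    for arg in cmd[1:]:
        if arg.lower() == '/noconfig':
            inline.append(arg)       # mcs rejects /noconfig inside @files
        else:
            infile.append(quote(arg))
    return inline, infile
```

The framework would then invoke `inline + ['@respfile']` with the `infile` entries written one per line, which keeps very long assembly lists under the platform command-length limit.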
TaskGen.task_gen.mappings: TaskGen.task_gen.mappings['.c']=TaskGen.task_gen.mappings['.cpp'] class cxx(Task.Task): - run_str='${CXX} ${ARCH_ST:ARCH} ${CXXFLAGS} ${CPPFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CXX_SRC_F}${SRC} ${CXX_TGT_F}${TGT[0].abspath()}' + run_str='${CXX} ${ARCH_ST:ARCH} ${CXXFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CXX_SRC_F}${SRC} ${CXX_TGT_F}${TGT[0].abspath()} ${CPPFLAGS}' vars=['CXXDEPS'] ext_in=['.h'] scan=c_preproc.scan diff --git a/waflib/Tools/d.py b/waflib/Tools/d.py index e8c98f0..6d1c3c6 100644 --- a/waflib/Tools/d.py +++ b/waflib/Tools/d.py @@ -35,7 +35,7 @@ def d_hook(self,node): return task if getattr(self,'generate_headers',None): tsk=create_compiled_task(self,'d_with_header',node) - tsk.outputs.append(node.change_ext(self.env['DHEADER_ext'])) + tsk.outputs.append(node.change_ext(self.env.DHEADER_ext)) else: tsk=create_compiled_task(self,'d',node) return tsk diff --git a/waflib/Tools/d_config.py b/waflib/Tools/d_config.py index 71b7b6e..3b4bdf0 100644 --- a/waflib/Tools/d_config.py +++ b/waflib/Tools/d_config.py @@ -11,17 +11,17 @@ def d_platform_flags(self): v.DEST_OS=Utils.unversioned_sys_platform() binfmt=Utils.destos_to_binfmt(self.env.DEST_OS) if binfmt=='pe': - v['dprogram_PATTERN']='%s.exe' - v['dshlib_PATTERN']='lib%s.dll' - v['dstlib_PATTERN']='lib%s.a' + v.dprogram_PATTERN='%s.exe' + v.dshlib_PATTERN='lib%s.dll' + v.dstlib_PATTERN='lib%s.a' elif binfmt=='mac-o': - v['dprogram_PATTERN']='%s' - v['dshlib_PATTERN']='lib%s.dylib' - v['dstlib_PATTERN']='lib%s.a' + v.dprogram_PATTERN='%s' + v.dshlib_PATTERN='lib%s.dylib' + v.dstlib_PATTERN='lib%s.a' else: - v['dprogram_PATTERN']='%s' - v['dshlib_PATTERN']='lib%s.so' - v['dstlib_PATTERN']='lib%s.a' + v.dprogram_PATTERN='%s' + v.dshlib_PATTERN='lib%s.so' + v.dstlib_PATTERN='lib%s.a' DLIB=''' version(D_Version2) { import std.stdio; diff --git a/waflib/Tools/d_scan.py 
b/waflib/Tools/d_scan.py index 47f9196..09ccfa9 100644 --- a/waflib/Tools/d_scan.py +++ b/waflib/Tools/d_scan.py @@ -3,7 +3,7 @@ # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file import re -from waflib import Utils,Logs +from waflib import Utils def filter_comments(filename): txt=Utils.readf(filename) i=0 @@ -18,7 +18,8 @@ def filter_comments(filename): i+=1 while i<max: c=txt[i] - if c==delim:break + if c==delim: + break elif c=='\\': i+=1 i+=1 @@ -27,7 +28,8 @@ def filter_comments(filename): elif c=='/': buf.append(txt[begin:i]) i+=1 - if i==max:break + if i==max: + break c=txt[i] if c=='+': i+=1 @@ -41,7 +43,8 @@ def filter_comments(filename): c=None elif prev=='+'and c=='/': nesting-=1 - if nesting==0:break + if nesting==0: + break c=None i+=1 elif c=='*': @@ -50,7 +53,8 @@ def filter_comments(filename): while i<max: prev=c c=txt[i] - if prev=='*'and c=='/':break + if prev=='*'and c=='/': + break i+=1 elif c=='/': i+=1 @@ -118,7 +122,8 @@ class d_parser(object): code="".join(filter_comments(path)) names=self.get_strings(code) for x in names: - if x in self.allnames:continue + if x in self.allnames: + continue self.allnames.append(x) self.tryfind(x) def scan(self): @@ -128,6 +133,4 @@ def scan(self): gruik.start(node) nodes=gruik.nodes names=gruik.names - if Logs.verbose: - Logs.debug('deps: deps for %s: %r; unresolved %r'%(str(node),nodes,names)) return(nodes,names) diff --git a/waflib/Tools/dbus.py b/waflib/Tools/dbus.py index 9ce00d1..c54ab7a 100644 --- a/waflib/Tools/dbus.py +++ b/waflib/Tools/dbus.py @@ -11,7 +11,7 @@ def add_dbus_file(self,filename,prefix,mode): if not'process_dbus'in self.meths: self.meths.append('process_dbus') self.dbus_lst.append([filename,prefix,mode]) -@before_method('apply_core') +@before_method('process_source') def process_dbus(self): for filename,prefix,mode in getattr(self,'dbus_lst',[]): node=self.path.find_resource(filename) diff --git a/waflib/Tools/dmd.py b/waflib/Tools/dmd.py index 
59aab6d..2711628 100644 --- a/waflib/Tools/dmd.py +++ b/waflib/Tools/dmd.py @@ -16,32 +16,32 @@ def find_dmd(conf): @conf def common_flags_ldc(conf): v=conf.env - v['DFLAGS']=['-d-version=Posix'] - v['LINKFLAGS']=[] - v['DFLAGS_dshlib']=['-relocation-model=pic'] + v.DFLAGS=['-d-version=Posix'] + v.LINKFLAGS=[] + v.DFLAGS_dshlib=['-relocation-model=pic'] @conf def common_flags_dmd(conf): v=conf.env - v['D_SRC_F']=['-c'] - v['D_TGT_F']='-of%s' - v['D_LINKER']=v['D'] - v['DLNK_SRC_F']='' - v['DLNK_TGT_F']='-of%s' - v['DINC_ST']='-I%s' - v['DSHLIB_MARKER']=v['DSTLIB_MARKER']='' - v['DSTLIB_ST']=v['DSHLIB_ST']='-L-l%s' - v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L-L%s' - v['LINKFLAGS_dprogram']=['-quiet'] - v['DFLAGS_dshlib']=['-fPIC'] - v['LINKFLAGS_dshlib']=['-L-shared'] - v['DHEADER_ext']='.di' + v.D_SRC_F=['-c'] + v.D_TGT_F='-of%s' + v.D_LINKER=v.D + v.DLNK_SRC_F='' + v.DLNK_TGT_F='-of%s' + v.DINC_ST='-I%s' + v.DSHLIB_MARKER=v.DSTLIB_MARKER='' + v.DSTLIB_ST=v.DSHLIB_ST='-L-l%s' + v.DSTLIBPATH_ST=v.DLIBPATH_ST='-L-L%s' + v.LINKFLAGS_dprogram=['-quiet'] + v.DFLAGS_dshlib=['-fPIC'] + v.LINKFLAGS_dshlib=['-L-shared'] + v.DHEADER_ext='.di' v.DFLAGS_d_with_header=['-H','-Hf'] - v['D_HDR_F']='%s' + v.D_HDR_F='%s' def configure(conf): conf.find_dmd() if sys.platform=='win32': out=conf.cmd_and_log(conf.env.D+['--help']) - if out.find("D Compiler v2.")>-1: + if out.find('D Compiler v2.')>-1: conf.fatal('dmd2 on Windows is not supported, use gdc or ldc2 instead') conf.load('ar') conf.load('d') diff --git a/waflib/Tools/errcheck.py b/waflib/Tools/errcheck.py index 421dfa6..f993e58 100644 --- a/waflib/Tools/errcheck.py +++ b/waflib/Tools/errcheck.py @@ -2,17 +2,19 @@ # encoding: utf-8 # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file -typos={'feature':'features','sources':'source','targets':'target','include':'includes','export_include':'export_includes','define':'defines','importpath':'includes','installpath':'install_path','iscopy':'is_copy',} +typos={'feature':'features','sources':'source','targets':'target','include':'includes','export_include':'export_includes','define':'defines','importpath':'includes','installpath':'install_path','iscopy':'is_copy','uses':'use',} meths_typos=['__call__','program','shlib','stlib','objects'] import sys from waflib import Logs,Build,Node,Task,TaskGen,ConfigSet,Errors,Utils -import waflib.Tools.ccroot +from waflib.Tools import ccroot def check_same_targets(self): mp=Utils.defaultdict(list) uids={} def check_task(tsk): if not isinstance(tsk,Task.Task): return + if hasattr(tsk,'no_errcheck_out'): + return for node in tsk.outputs: mp[node].append(tsk) try: @@ -34,37 +36,41 @@ def check_same_targets(self): Logs.error(msg) for x in v: if Logs.verbose>1: - Logs.error(' %d. %r'%(1+v.index(x),x.generator)) + Logs.error(' %d. %r',1+v.index(x),x.generator) else: - Logs.error(' %d. %r in %r'%(1+v.index(x),x.generator.name,getattr(x.generator,'path',None))) + Logs.error(' %d. %r in %r',1+v.index(x),x.generator.name,getattr(x.generator,'path',None)) + Logs.error('If you think that this is an error, set no_errcheck_out on the task instance') if not dupe: for(k,v)in uids.items(): if len(v)>1: Logs.error('* Several tasks use the same identifier. 
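The errcheck hunk above extends the `typos` mapping (adding `'uses' -> 'use'`): each key is a commonly misspelled task-generator keyword and the value is the accepted spelling. A trimmed, hypothetical version of that lookup:

```python
# Illustrative subset of the typos table from the hunk above; the
# find_typos helper is hypothetical, not part of waf.
TYPOS = {'feature': 'features', 'sources': 'source', 'targets': 'target',
         'uses': 'use', 'installpath': 'install_path'}

def find_typos(kwargs):
    """Return (wrong, right) pairs for any misspelled keyword argument."""
    return [(k, TYPOS[k]) for k in kwargs if k in TYPOS]
```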
Please check the information on\n https://waf.io/apidocs/Task.html?highlight=uid#waflib.Task.Task.uid') + tg_details=tsk.generator.name + if Logs.verbose>2: + tg_details=tsk.generator for tsk in v: - Logs.error(' - object %r (%r) defined in %r'%(tsk.__class__.__name__,tsk,tsk.generator)) + Logs.error(' - object %r (%r) defined in %r',tsk.__class__.__name__,tsk,tg_details) def check_invalid_constraints(self): - feat=set([]) + feat=set() for x in list(TaskGen.feats.values()): feat.union(set(x)) for(x,y)in TaskGen.task_gen.prec.items(): feat.add(x) feat.union(set(y)) - ext=set([]) + ext=set() for x in TaskGen.task_gen.mappings.values(): ext.add(x.__name__) invalid=ext&feat if invalid: - Logs.error('The methods %r have invalid annotations: @extension <-> @feature/@before_method/@after_method'%list(invalid)) + Logs.error('The methods %r have invalid annotations: @extension <-> @feature/@before_method/@after_method',list(invalid)) for cls in list(Task.classes.values()): if sys.hexversion>0x3000000 and issubclass(cls,Task.Task)and isinstance(cls.hcode,str): raise Errors.WafError('Class %r has hcode value %r of type <str>, expecting <bytes> (use Utils.h_cmd() ?)'%(cls,cls.hcode)) for x in('before','after'): for y in Utils.to_list(getattr(cls,x,[])): - if not Task.classes.get(y,None): - Logs.error('Erroneous order constraint %r=%r on task class %r'%(x,y,cls.__name__)) + if not Task.classes.get(y): + Logs.error('Erroneous order constraint %r=%r on task class %r',x,y,cls.__name__) if getattr(cls,'rule',None): - Logs.error('Erroneous attribute "rule" on task class %r (rename to "run_str")'%cls.__name__) + Logs.error('Erroneous attribute "rule" on task class %r (rename to "run_str")',cls.__name__) def replace(m): oldcall=getattr(Build.BuildContext,m) def call(self,*k,**kw): @@ -73,7 +79,7 @@ def replace(m): if x in kw: if x=='iscopy'and'subst'in getattr(self,'features',''): continue - Logs.error('Fix the typo %r -> %r on %r'%(x,typos[x],ret)) + Logs.error('Fix the typo %r -> %r 
on %r',x,typos[x],ret) return ret setattr(Build.BuildContext,m,call) def enhance_lib(): @@ -83,22 +89,30 @@ def enhance_lib(): if k: lst=Utils.to_list(k[0]) for pat in lst: - if'..'in pat.split('/'): - Logs.error("In ant_glob pattern %r: '..' means 'two dots', not 'parent directory'"%k[0]) - if kw.get('remove',True): - try: - if self.is_child_of(self.ctx.bldnode)and not kw.get('quiet',False): - Logs.error('Using ant_glob on the build folder (%r) is dangerous (quiet=True to disable this warning)'%self) - except AttributeError: - pass + sp=pat.split('/') + if'..'in sp: + Logs.error("In ant_glob pattern %r: '..' means 'two dots', not 'parent directory'",k[0]) + if'.'in sp: + Logs.error("In ant_glob pattern %r: '.' means 'one dot', not 'current directory'",k[0]) return self.old_ant_glob(*k,**kw) Node.Node.old_ant_glob=Node.Node.ant_glob Node.Node.ant_glob=ant_glob + def ant_iter(self,accept=None,maxdepth=25,pats=[],dir=False,src=True,remove=True,quiet=False): + if remove: + try: + if self.is_child_of(self.ctx.bldnode)and not quiet: + quiet=True + Logs.error('Calling ant_glob on build folders (%r) is dangerous: add quiet=True / remove=False',self) + except AttributeError: + pass + return self.old_ant_iter(accept,maxdepth,pats,dir,src,remove,quiet) + Node.Node.old_ant_iter=Node.Node.ant_iter + Node.Node.ant_iter=ant_iter old=Task.is_before def is_before(t1,t2): ret=old(t1,t2) if ret and old(t2,t1): - Logs.error('Contradictory order constraints in classes %r %r'%(t1,t2)) + Logs.error('Contradictory order constraints in classes %r %r',t1,t2) return ret Task.is_before=is_before def check_err_features(self): @@ -107,18 +121,18 @@ def enhance_lib(): Logs.error('feature shlib -> cshlib, dshlib or cxxshlib') for x in('c','cxx','d','fc'): if not x in lst and lst and lst[0]in[x+y for y in('program','shlib','stlib')]: - Logs.error('%r features is probably missing %r'%(self,x)) + Logs.error('%r features is probably missing %r',self,x) TaskGen.feature('*')(check_err_features) def 
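The `ant_glob` wrapper above now warns that `'..'` and `'.'` are literal file names in glob patterns, not directory operators. The check itself is a simple split on path components; a standalone sketch (the function name is illustrative):

```python
def check_ant_glob_pattern(pat):
    """Return the warnings the enhanced ant_glob above would emit:
    '..' and '.' are matched literally, not as parent/current dir."""
    sp = pat.split('/')
    errors = []
    if '..' in sp:
        errors.append("'..' means 'two dots', not 'parent directory'")
    if '.' in sp:
        errors.append("'.' means 'one dot', not 'current directory'")
    return errors
```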
check_err_order(self): if not hasattr(self,'rule')and not'subst'in Utils.to_list(self.features): for x in('before','after','ext_in','ext_out'): if hasattr(self,x): - Logs.warn('Erroneous order constraint %r on non-rule based task generator %r'%(x,self)) + Logs.warn('Erroneous order constraint %r on non-rule based task generator %r',x,self) else: for x in('before','after'): for y in self.to_list(getattr(self,x,[])): - if not Task.classes.get(y,None): - Logs.error('Erroneous order constraint %s=%r on %r (no such class)'%(x,y,self)) + if not Task.classes.get(y): + Logs.error('Erroneous order constraint %s=%r on %r (no such class)',x,y,self) TaskGen.feature('*')(check_err_order) def check_compile(self): check_invalid_constraints(self) @@ -147,17 +161,15 @@ def enhance_lib(): self.orig_use_rec(name,**kw) TaskGen.task_gen.orig_use_rec=TaskGen.task_gen.use_rec TaskGen.task_gen.use_rec=use_rec - def getattri(self,name,default=None): + def _getattr(self,name,default=None): if name=='append'or name=='add': raise Errors.WafError('env.append and env.add do not exist: use env.append_value/env.append_unique') elif name=='prepend': raise Errors.WafError('env.prepend does not exist: use env.prepend_value') if name in self.__slots__: - return object.__getattr__(self,name,default) + return super(ConfigSet.ConfigSet,self).__getattr__(name,default) else: return self[name] - ConfigSet.ConfigSet.__getattr__=getattri + ConfigSet.ConfigSet.__getattr__=_getattr def options(opt): enhance_lib() -def configure(conf): - pass diff --git a/waflib/Tools/fc.py b/waflib/Tools/fc.py index bc9f0b0..64c479d 100644 --- a/waflib/Tools/fc.py +++ b/waflib/Tools/fc.py @@ -2,37 +2,40 @@ # encoding: utf-8 # WARNING! Do not edit! 
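The `_getattr` patch above (and the pervasive `v['X']` to `v.X` conversions throughout this diff) rely on ConfigSet routing unknown attribute reads through item access. A toy dict-backed class showing the idea, not waf's actual ConfigSet:

```python
class Env(dict):
    """Toy dict with attribute access, similar in spirit to the
    ConfigSet behavior the hunk above patches (illustrative only)."""
    def __getattr__(self, name):
        # only called when normal attribute lookup fails
        try:
            return self[name]
        except KeyError:
            return []            # unset variables read as an empty value

    def __setattr__(self, name, value):
        self[name] = value
```

This is why `conf.env.CC` and `conf.env['CC']` are interchangeable in the diffs above, and why reading an unset variable like `conf.env.LINK_CC` is falsy rather than an AttributeError.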
https://waf.io/book/index.html#_obtaining_the_waf_file -from waflib import Utils,Task,Logs +from waflib import Utils,Task,Errors from waflib.Tools import ccroot,fc_config,fc_scan -from waflib.TaskGen import feature,extension +from waflib.TaskGen import extension from waflib.Configure import conf -ccroot.USELIB_VARS['fc']=set(['FCFLAGS','DEFINES','INCLUDES']) +ccroot.USELIB_VARS['fc']=set(['FCFLAGS','DEFINES','INCLUDES','FCPPFLAGS']) ccroot.USELIB_VARS['fcprogram_test']=ccroot.USELIB_VARS['fcprogram']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS']) ccroot.USELIB_VARS['fcshlib']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS']) ccroot.USELIB_VARS['fcstlib']=set(['ARFLAGS','LINKDEPS']) -@feature('fcprogram','fcshlib','fcstlib','fcprogram_test') -def dummy(self): - pass -@extension('.f','.f90','.F','.F90','.for','.FOR') +@extension('.f','.F','.f90','.F90','.for','.FOR','.f95','.F95','.f03','.F03','.f08','.F08') def fc_hook(self,node): return self.create_compiled_task('fc',node) @conf def modfile(conf,name): - return{'lower':name.lower()+'.mod','lower.MOD':name.upper()+'.MOD','UPPER.mod':name.upper()+'.mod','UPPER':name.upper()+'.MOD'}[conf.env.FC_MOD_CAPITALIZATION or'lower'] + if name.find(':')>=0: + separator=conf.env.FC_SUBMOD_SEPARATOR or'@' + modpath=name.split(':') + modname=modpath[0]+separator+modpath[-1] + suffix=conf.env.FC_SUBMOD_SUFFIX or'.smod' + else: + modname=name + suffix='.mod' + return{'lower':modname.lower()+suffix.lower(),'lower.MOD':modname.lower()+suffix.upper(),'UPPER.mod':modname.upper()+suffix.lower(),'UPPER':modname.upper()+suffix.upper()}[conf.env.FC_MOD_CAPITALIZATION or'lower'] def get_fortran_tasks(tsk): bld=tsk.generator.bld tasks=bld.get_tasks_group(bld.get_group_idx(tsk.generator)) return[x for x in tasks if isinstance(x,fc)and not getattr(x,'nomod',None)and not getattr(x,'mod_fortran_done',None)] class fc(Task.Task): color='GREEN' - run_str='${FC} ${FCFLAGS} ${FCINCPATH_ST:INCPATHS} 
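The new `modfile` above adds Fortran submodule support: a name of the form `parent:child` maps to `parent@child.smod` (separator and suffix configurable via `FC_SUBMOD_SEPARATOR`/`FC_SUBMOD_SUFFIX`), while plain module names keep the `.mod` suffix, with four capitalization schemes. A hedged standalone re-implementation of that naming, with the env lookups replaced by plain parameters:

```python
def modfile(name, capitalization='lower', separator='@'):
    """Standalone sketch of the modfile() naming in the hunk above:
    'parent:child' submodules become 'parent@child.smod', plain module
    names become 'name.mod', under the requested capitalization."""
    if ':' in name:
        parts = name.split(':')
        modname = parts[0] + separator + parts[-1]
        suffix = '.smod'
    else:
        modname = name
        suffix = '.mod'
    return {
        'lower': modname.lower() + suffix.lower(),
        'lower.MOD': modname.lower() + suffix.upper(),
        'UPPER.mod': modname.upper() + suffix.lower(),
        'UPPER': modname.upper() + suffix.upper(),
    }[capitalization]
```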
${FCDEFINES_ST:DEFINES} ${_FCMODOUTFLAGS} ${FC_TGT_F}${TGT[0].abspath()} ${FC_SRC_F}${SRC[0].abspath()}' + run_str='${FC} ${FCFLAGS} ${FCINCPATH_ST:INCPATHS} ${FCDEFINES_ST:DEFINES} ${_FCMODOUTFLAGS} ${FC_TGT_F}${TGT[0].abspath()} ${FC_SRC_F}${SRC[0].abspath()} ${FCPPFLAGS}' vars=["FORTRANMODPATHFLAG"] def scan(self): tmp=fc_scan.fortran_parser(self.generator.includes_nodes) tmp.task=self tmp.start(self.inputs[0]) - if Logs.verbose: - Logs.debug('deps: deps for %r: %r; unresolved %r'%(self.inputs,tmp.nodes,tmp.names)) return(tmp.nodes,tmp.names) def runnable_status(self): if getattr(self,'mod_fortran_done',None): @@ -55,10 +58,8 @@ class fc(Task.Task): if x.startswith('MOD@'): name=bld.modfile(x.replace('MOD@','')) node=bld.srcnode.find_or_declare(name) - if not getattr(node,'sig',None): - node.sig=Utils.SIG_NIL tsk.set_outputs(node) - outs[id(node)].add(tsk) + outs[node].add(tsk) for tsk in lst: key=tsk.uid() for x in bld.raw_deps[key]: @@ -68,10 +69,12 @@ class fc(Task.Task): if node and node not in tsk.outputs: if not node in bld.node_deps[key]: bld.node_deps[key].append(node) - ins[id(node)].add(tsk) + ins[node].add(tsk) for k in ins.keys(): for a in ins[k]: a.run_after.update(outs[k]) + for x in outs[k]: + self.generator.bld.producer.revdeps[x].add(a) tmp=[] for t in outs[k]: tmp.extend(t.outputs) @@ -89,6 +92,8 @@ class fcprogram(ccroot.link_task): inst_to='${BINDIR}' class fcshlib(fcprogram): inst_to='${LIBDIR}' +class fcstlib(ccroot.stlink_task): + pass class fcprogram_test(fcprogram): def runnable_status(self): ret=super(fcprogram_test,self).runnable_status() @@ -99,17 +104,15 @@ class fcprogram_test(fcprogram): bld=self.generator.bld kw['shell']=isinstance(cmd,str) kw['stdout']=kw['stderr']=Utils.subprocess.PIPE - kw['cwd']=bld.variant_dir + kw['cwd']=self.get_cwd() bld.out=bld.err='' bld.to_log('command: %s\n'%cmd) kw['output']=0 try: (bld.out,bld.err)=bld.cmd_and_log(cmd,**kw) - except Exception: + except Errors.WafError: return-1 if bld.out: - 
bld.to_log("out: %s\n"%bld.out) + bld.to_log('out: %s\n'%bld.out) if bld.err: - bld.to_log("err: %s\n"%bld.err) -class fcstlib(ccroot.stlink_task): - pass + bld.to_log('err: %s\n'%bld.err) diff --git a/waflib/Tools/fc_config.py b/waflib/Tools/fc_config.py index f57ae90..d0d4c45 100644 --- a/waflib/Tools/fc_config.py +++ b/waflib/Tools/fc_config.py @@ -10,26 +10,28 @@ FC_FRAGMENT2=' PROGRAM MAIN\n END\n' @conf def fc_flags(conf): v=conf.env - v['FC_SRC_F']=[] - v['FC_TGT_F']=['-c','-o'] - v['FCINCPATH_ST']='-I%s' - v['FCDEFINES_ST']='-D%s' - if not v['LINK_FC']:v['LINK_FC']=v['FC'] - v['FCLNK_SRC_F']=[] - v['FCLNK_TGT_F']=['-o'] - v['FCFLAGS_fcshlib']=['-fpic'] - v['LINKFLAGS_fcshlib']=['-shared'] - v['fcshlib_PATTERN']='lib%s.so' - v['fcstlib_PATTERN']='lib%s.a' - v['FCLIB_ST']='-l%s' - v['FCLIBPATH_ST']='-L%s' - v['FCSTLIB_ST']='-l%s' - v['FCSTLIBPATH_ST']='-L%s' - v['FCSTLIB_MARKER']='-Wl,-Bstatic' - v['FCSHLIB_MARKER']='-Wl,-Bdynamic' - v['SONAME_ST']='-Wl,-h,%s' + v.FC_SRC_F=[] + v.FC_TGT_F=['-c','-o'] + v.FCINCPATH_ST='-I%s' + v.FCDEFINES_ST='-D%s' + if not v.LINK_FC: + v.LINK_FC=v.FC + v.FCLNK_SRC_F=[] + v.FCLNK_TGT_F=['-o'] + v.FCFLAGS_fcshlib=['-fpic'] + v.LINKFLAGS_fcshlib=['-shared'] + v.fcshlib_PATTERN='lib%s.so' + v.fcstlib_PATTERN='lib%s.a' + v.FCLIB_ST='-l%s' + v.FCLIBPATH_ST='-L%s' + v.FCSTLIB_ST='-l%s' + v.FCSTLIBPATH_ST='-L%s' + v.FCSTLIB_MARKER='-Wl,-Bstatic' + v.FCSHLIB_MARKER='-Wl,-Bdynamic' + v.SONAME_ST='-Wl,-h,%s' @conf def fc_add_flags(conf): + conf.add_os_flags('FCPPFLAGS',dup=False) conf.add_os_flags('FCFLAGS',dup=False) conf.add_os_flags('LINKFLAGS',dup=False) conf.add_os_flags('LDFLAGS',dup=False) @@ -51,32 +53,31 @@ def check_fc(self,*k,**kw): @conf def fortran_modifier_darwin(conf): v=conf.env - v['FCFLAGS_fcshlib']=['-fPIC'] - v['LINKFLAGS_fcshlib']=['-dynamiclib'] - v['fcshlib_PATTERN']='lib%s.dylib' - v['FRAMEWORKPATH_ST']='-F%s' - v['FRAMEWORK_ST']='-framework %s' - v['LINKFLAGS_fcstlib']=[] - v['FCSHLIB_MARKER']='' - 
v['FCSTLIB_MARKER']='' - v['SONAME_ST']='' + v.FCFLAGS_fcshlib=['-fPIC'] + v.LINKFLAGS_fcshlib=['-dynamiclib'] + v.fcshlib_PATTERN='lib%s.dylib' + v.FRAMEWORKPATH_ST='-F%s' + v.FRAMEWORK_ST=['-framework'] + v.LINKFLAGS_fcstlib=[] + v.FCSHLIB_MARKER='' + v.FCSTLIB_MARKER='' + v.SONAME_ST='' @conf def fortran_modifier_win32(conf): v=conf.env - v['fcprogram_PATTERN']=v['fcprogram_test_PATTERN']='%s.exe' - v['fcshlib_PATTERN']='%s.dll' - v['implib_PATTERN']='lib%s.dll.a' - v['IMPLIB_ST']='-Wl,--out-implib,%s' - v['FCFLAGS_fcshlib']=[] - v.append_value('FCFLAGS_fcshlib',['-DDLL_EXPORT']) + v.fcprogram_PATTERN=v.fcprogram_test_PATTERN='%s.exe' + v.fcshlib_PATTERN='%s.dll' + v.implib_PATTERN='%s.dll.a' + v.IMPLIB_ST='-Wl,--out-implib,%s' + v.FCFLAGS_fcshlib=[] v.append_value('LINKFLAGS',['-Wl,--enable-auto-import']) @conf def fortran_modifier_cygwin(conf): fortran_modifier_win32(conf) v=conf.env - v['fcshlib_PATTERN']='cyg%s.dll' + v.fcshlib_PATTERN='cyg%s.dll' v.append_value('LINKFLAGS_fcshlib',['-Wl,--enable-auto-image-base']) - v['FCFLAGS_fcshlib']=[] + v.FCFLAGS_fcshlib=[] @conf def check_fortran_dummy_main(self,*k,**kw): if not self.env.CC: @@ -201,10 +202,10 @@ def getoutput(conf,cmd,stdin=False): else: env=dict(os.environ) env['LANG']='C' - input=stdin and'\n'or None + input=stdin and'\n'.encode()or None try: out,err=conf.cmd_and_log(cmd,env=env,output=0,input=input) - except Errors.WafError ,e: + except Errors.WafError as e: if not(hasattr(e,'stderr')and hasattr(e,'stdout')): raise e else: @@ -258,7 +259,7 @@ def check_fortran_mangling(self,*k,**kw): self.start_msg('Getting fortran mangling scheme') for(u,du,c)in mangling_schemes(): try: - self.check_cc(compile_filename=[],features='link_main_routines_func',msg='nomsg',errmsg='nomsg',mandatory=True,dummy_func_nounder=mangle_name(u,du,c,"foobar"),dummy_func_under=mangle_name(u,du,c,"foo_bar"),main_func_name=self.env.FC_MAIN) + 
self.check_cc(compile_filename=[],features='link_main_routines_func',msg='nomsg',errmsg='nomsg',dummy_func_nounder=mangle_name(u,du,c,'foobar'),dummy_func_under=mangle_name(u,du,c,'foo_bar'),main_func_name=self.env.FC_MAIN) except self.errors.ConfigurationError: pass else: @@ -272,7 +273,7 @@ def check_fortran_mangling(self,*k,**kw): @feature('pyext') @before_method('propagate_uselib_vars','apply_link') def set_lib_pat(self): - self.env['fcshlib_PATTERN']=self.env['pyext_PATTERN'] + self.env.fcshlib_PATTERN=self.env.pyext_PATTERN @conf def detect_openmp(self): for x in('-fopenmp','-openmp','-mp','-xopenmp','-omp','-qsmp=omp'): @@ -284,3 +285,15 @@ def detect_openmp(self): break else: self.fatal('Could not find OpenMP') +@conf +def check_gfortran_o_space(self): + if self.env.FC_NAME!='GFORTRAN'or int(self.env.FC_VERSION[0])>4: + return + self.env.stash() + self.env.FCLNK_TGT_F=['-o',''] + try: + self.check_fc(msg='Checking if the -o link must be split from arguments',fragment=FC_FRAGMENT,features='fc fcshlib') + except self.errors.ConfigurationError: + self.env.revert() + else: + self.env.commit() diff --git a/waflib/Tools/fc_scan.py b/waflib/Tools/fc_scan.py index c07a22d..a688250 100644 --- a/waflib/Tools/fc_scan.py +++ b/waflib/Tools/fc_scan.py @@ -5,10 +5,12 @@ import re INC_REGEX="""(?:^|['">]\s*;)\s*(?:|#\s*)INCLUDE\s+(?:\w+_)?[<"'](.+?)(?=["'>])""" USE_REGEX="""(?:^|;)\s*USE(?:\s+|(?:(?:\s*,\s*(?:NON_)?INTRINSIC)?\s*::))\s*(\w+)""" -MOD_REGEX="""(?:^|;)\s*MODULE(?!\s*PROCEDURE)(?:\s+|(?:(?:\s*,\s*(?:NON_)?INTRINSIC)?\s*::))\s*(\w+)""" +MOD_REGEX="""(?:^|;)\s*MODULE(?!\s+(?:PROCEDURE|SUBROUTINE|FUNCTION))\s+(\w+)""" +SMD_REGEX="""(?:^|;)\s*SUBMODULE\s*\(([\w:]+)\)\s*(\w+)""" re_inc=re.compile(INC_REGEX,re.I) re_use=re.compile(USE_REGEX,re.I) re_mod=re.compile(MOD_REGEX,re.I) +re_smd=re.compile(SMD_REGEX,re.I) class fortran_parser(object): def __init__(self,incpaths): self.seen=[] @@ -30,6 +32,10 @@ class fortran_parser(object): m=re_mod.search(line) if m: 
mods.append(m.group(1)) + m=re_smd.search(line) + if m: + uses.append(m.group(1)) + mods.append('{0}:{1}'.format(m.group(1),m.group(2))) return(incs,uses,mods) def start(self,node): self.waiting=[node] diff --git a/waflib/Tools/flex.py b/waflib/Tools/flex.py index 7a04074..1f1620e 100644 --- a/waflib/Tools/flex.py +++ b/waflib/Tools/flex.py @@ -2,7 +2,9 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file -import waflib.TaskGen,os,re +import os,re +from waflib import Task,TaskGen +from waflib.Tools import ccroot def decide_ext(self,node): if'cxx'in self.features: return['.lex.cc'] @@ -12,19 +14,23 @@ def flexfun(tsk): bld=tsk.generator.bld wd=bld.variant_dir def to_list(xx): - if isinstance(xx,str):return[xx] + if isinstance(xx,str): + return[xx] return xx tsk.last_cmd=lst=[] - lst.extend(to_list(env['FLEX'])) - lst.extend(to_list(env['FLEXFLAGS'])) - inputs=[a.path_from(bld.bldnode)for a in tsk.inputs] + lst.extend(to_list(env.FLEX)) + lst.extend(to_list(env.FLEXFLAGS)) + inputs=[a.path_from(tsk.get_cwd())for a in tsk.inputs] if env.FLEX_MSYS: inputs=[x.replace(os.sep,'/')for x in inputs] lst.extend(inputs) lst=[x for x in lst if x] txt=bld.cmd_and_log(lst,cwd=wd,env=env.env or None,quiet=0) tsk.outputs[0].write(txt.replace('\r\n','\n').replace('\r','\n')) -waflib.TaskGen.declare_chain(name='flex',rule=flexfun,ext_in='.l',decider=decide_ext,) +TaskGen.declare_chain(name='flex',rule=flexfun,ext_in='.l',decider=decide_ext,) +Task.classes['flex'].vars=['FLEXFLAGS','FLEX'] +ccroot.USELIB_VARS['c'].add('FLEXFLAGS') +ccroot.USELIB_VARS['cxx'].add('FLEXFLAGS') def configure(conf): conf.find_program('flex',var='FLEX') conf.env.FLEXFLAGS=['-t'] diff --git a/waflib/Tools/g95.py b/waflib/Tools/g95.py index 6524e1c..b62adcd 100644 --- a/waflib/Tools/g95.py +++ b/waflib/Tools/g95.py @@ -14,9 +14,9 @@ def find_g95(conf): @conf def g95_flags(conf): v=conf.env - v['FCFLAGS_fcshlib']=['-fPIC'] - v['FORTRANMODFLAG']=['-fmod=',''] - 
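The fc_scan hunk above introduces `SMD_REGEX` so that a Fortran `SUBMODULE (parent) child` line is recorded both as a use of `parent` and as a module named `parent:child` (which the new `modfile` then turns into `parent@child.smod`). A small illustration of the regex, with `parse_submodule` as a hypothetical helper name:

```python
import re

# The SUBMODULE regex added in the hunk above, reproduced verbatim.
SMD_REGEX = r"""(?:^|;)\s*SUBMODULE\s*\(([\w:]+)\)\s*(\w+)"""
re_smd = re.compile(SMD_REGEX, re.I)

def parse_submodule(line):
    """Return (parent, 'parent:child') for a SUBMODULE line, else None."""
    m = re_smd.search(line)
    if not m:
        return None
    # as in the scanner above: the parent becomes a 'use' dependency and
    # the pair is recorded under the composite name 'parent:child'
    return m.group(1), '{0}:{1}'.format(m.group(1), m.group(2))
```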
v['FCFLAGS_DEBUG']=['-Werror'] + v.FCFLAGS_fcshlib=['-fPIC'] + v.FORTRANMODFLAG=['-fmod=',''] + v.FCFLAGS_DEBUG=['-Werror'] @conf def g95_modifier_win32(conf): fc_config.fortran_modifier_win32(conf) @@ -28,7 +28,7 @@ def g95_modifier_darwin(conf): fc_config.fortran_modifier_darwin(conf) @conf def g95_modifier_platform(conf): - dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform() + dest_os=conf.env.DEST_OS or Utils.unversioned_sys_platform() g95_modifier_func=getattr(conf,'g95_modifier_'+dest_os,None) if g95_modifier_func: g95_modifier_func() @@ -44,7 +44,7 @@ def get_g95_version(conf,fc): if not match: conf.fatal('cannot determine g95 version') k=match.groupdict() - conf.env['FC_VERSION']=(k['major'],k['minor']) + conf.env.FC_VERSION=(k['major'],k['minor']) def configure(conf): conf.find_g95() conf.find_ar() diff --git a/waflib/Tools/gcc.py b/waflib/Tools/gcc.py index a3c7720..12afcc6 100644 --- a/waflib/Tools/gcc.py +++ b/waflib/Tools/gcc.py @@ -12,81 +12,82 @@ def find_gcc(conf): @conf def gcc_common_flags(conf): v=conf.env - v['CC_SRC_F']=[] - v['CC_TGT_F']=['-c','-o'] - if not v['LINK_CC']:v['LINK_CC']=v['CC'] - v['CCLNK_SRC_F']=[] - v['CCLNK_TGT_F']=['-o'] - v['CPPPATH_ST']='-I%s' - v['DEFINES_ST']='-D%s' - v['LIB_ST']='-l%s' - v['LIBPATH_ST']='-L%s' - v['STLIB_ST']='-l%s' - v['STLIBPATH_ST']='-L%s' - v['RPATH_ST']='-Wl,-rpath,%s' - v['SONAME_ST']='-Wl,-h,%s' - v['SHLIB_MARKER']='-Wl,-Bdynamic' - v['STLIB_MARKER']='-Wl,-Bstatic' - v['cprogram_PATTERN']='%s' - v['CFLAGS_cshlib']=['-fPIC'] - v['LINKFLAGS_cshlib']=['-shared'] - v['cshlib_PATTERN']='lib%s.so' - v['LINKFLAGS_cstlib']=['-Wl,-Bstatic'] - v['cstlib_PATTERN']='lib%s.a' - v['LINKFLAGS_MACBUNDLE']=['-bundle','-undefined','dynamic_lookup'] - v['CFLAGS_MACBUNDLE']=['-fPIC'] - v['macbundle_PATTERN']='%s.bundle' + v.CC_SRC_F=[] + v.CC_TGT_F=['-c','-o'] + if not v.LINK_CC: + v.LINK_CC=v.CC + v.CCLNK_SRC_F=[] + v.CCLNK_TGT_F=['-o'] + v.CPPPATH_ST='-I%s' + v.DEFINES_ST='-D%s' + v.LIB_ST='-l%s' + 
v.LIBPATH_ST='-L%s' + v.STLIB_ST='-l%s' + v.STLIBPATH_ST='-L%s' + v.RPATH_ST='-Wl,-rpath,%s' + v.SONAME_ST='-Wl,-h,%s' + v.SHLIB_MARKER='-Wl,-Bdynamic' + v.STLIB_MARKER='-Wl,-Bstatic' + v.cprogram_PATTERN='%s' + v.CFLAGS_cshlib=['-fPIC'] + v.LINKFLAGS_cshlib=['-shared'] + v.cshlib_PATTERN='lib%s.so' + v.LINKFLAGS_cstlib=['-Wl,-Bstatic'] + v.cstlib_PATTERN='lib%s.a' + v.LINKFLAGS_MACBUNDLE=['-bundle','-undefined','dynamic_lookup'] + v.CFLAGS_MACBUNDLE=['-fPIC'] + v.macbundle_PATTERN='%s.bundle' @conf def gcc_modifier_win32(conf): v=conf.env - v['cprogram_PATTERN']='%s.exe' - v['cshlib_PATTERN']='%s.dll' - v['implib_PATTERN']='lib%s.dll.a' - v['IMPLIB_ST']='-Wl,--out-implib,%s' - v['CFLAGS_cshlib']=[] + v.cprogram_PATTERN='%s.exe' + v.cshlib_PATTERN='%s.dll' + v.implib_PATTERN='%s.dll.a' + v.IMPLIB_ST='-Wl,--out-implib,%s' + v.CFLAGS_cshlib=[] v.append_value('LINKFLAGS',['-Wl,--enable-auto-import']) @conf def gcc_modifier_cygwin(conf): gcc_modifier_win32(conf) v=conf.env - v['cshlib_PATTERN']='cyg%s.dll' + v.cshlib_PATTERN='cyg%s.dll' v.append_value('LINKFLAGS_cshlib',['-Wl,--enable-auto-image-base']) - v['CFLAGS_cshlib']=[] + v.CFLAGS_cshlib=[] @conf def gcc_modifier_darwin(conf): v=conf.env - v['CFLAGS_cshlib']=['-fPIC'] - v['LINKFLAGS_cshlib']=['-dynamiclib'] - v['cshlib_PATTERN']='lib%s.dylib' - v['FRAMEWORKPATH_ST']='-F%s' - v['FRAMEWORK_ST']=['-framework'] - v['ARCH_ST']=['-arch'] - v['LINKFLAGS_cstlib']=[] - v['SHLIB_MARKER']=[] - v['STLIB_MARKER']=[] - v['SONAME_ST']=[] + v.CFLAGS_cshlib=['-fPIC'] + v.LINKFLAGS_cshlib=['-dynamiclib'] + v.cshlib_PATTERN='lib%s.dylib' + v.FRAMEWORKPATH_ST='-F%s' + v.FRAMEWORK_ST=['-framework'] + v.ARCH_ST=['-arch'] + v.LINKFLAGS_cstlib=[] + v.SHLIB_MARKER=[] + v.STLIB_MARKER=[] + v.SONAME_ST=[] @conf def gcc_modifier_aix(conf): v=conf.env - v['LINKFLAGS_cprogram']=['-Wl,-brtl'] - v['LINKFLAGS_cshlib']=['-shared','-Wl,-brtl,-bexpfull'] - v['SHLIB_MARKER']=[] + v.LINKFLAGS_cprogram=['-Wl,-brtl'] + 
v.LINKFLAGS_cshlib=['-shared','-Wl,-brtl,-bexpfull'] + v.SHLIB_MARKER=[] @conf def gcc_modifier_hpux(conf): v=conf.env - v['SHLIB_MARKER']=[] - v['STLIB_MARKER']=[] - v['CFLAGS_cshlib']=['-fPIC','-DPIC'] - v['cshlib_PATTERN']='lib%s.sl' + v.SHLIB_MARKER=[] + v.STLIB_MARKER=[] + v.CFLAGS_cshlib=['-fPIC','-DPIC'] + v.cshlib_PATTERN='lib%s.sl' @conf def gcc_modifier_openbsd(conf): conf.env.SONAME_ST=[] @conf def gcc_modifier_osf1V(conf): v=conf.env - v['SHLIB_MARKER']=[] - v['STLIB_MARKER']=[] - v['SONAME_ST']=[] + v.SHLIB_MARKER=[] + v.STLIB_MARKER=[] + v.SONAME_ST=[] @conf def gcc_modifier_platform(conf): gcc_modifier_func=getattr(conf,'gcc_modifier_'+conf.env.DEST_OS,None) @@ -100,3 +101,4 @@ def configure(conf): conf.cc_load_tools() conf.cc_add_flags() conf.link_add_flags() + conf.check_gcc_o_space() diff --git a/waflib/Tools/gdc.py b/waflib/Tools/gdc.py index acfea4a..c809930 100644 --- a/waflib/Tools/gdc.py +++ b/waflib/Tools/gdc.py @@ -13,20 +13,20 @@ def find_gdc(conf): @conf def common_flags_gdc(conf): v=conf.env - v['DFLAGS']=[] - v['D_SRC_F']=['-c'] - v['D_TGT_F']='-o%s' - v['D_LINKER']=v['D'] - v['DLNK_SRC_F']='' - v['DLNK_TGT_F']='-o%s' - v['DINC_ST']='-I%s' - v['DSHLIB_MARKER']=v['DSTLIB_MARKER']='' - v['DSTLIB_ST']=v['DSHLIB_ST']='-l%s' - v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L%s' - v['LINKFLAGS_dshlib']=['-shared'] - v['DHEADER_ext']='.di' + v.DFLAGS=[] + v.D_SRC_F=['-c'] + v.D_TGT_F='-o%s' + v.D_LINKER=v.D + v.DLNK_SRC_F='' + v.DLNK_TGT_F='-o%s' + v.DINC_ST='-I%s' + v.DSHLIB_MARKER=v.DSTLIB_MARKER='' + v.DSTLIB_ST=v.DSHLIB_ST='-l%s' + v.DSTLIBPATH_ST=v.DLIBPATH_ST='-L%s' + v.LINKFLAGS_dshlib=['-shared'] + v.DHEADER_ext='.di' v.DFLAGS_d_with_header='-fintfc' - v['D_HDR_F']='-fintfc-file=%s' + v.D_HDR_F='-fintfc-file=%s' def configure(conf): conf.find_gdc() conf.load('ar') diff --git a/waflib/Tools/gfortran.py b/waflib/Tools/gfortran.py index a0ea00b..47d005a 100644 --- a/waflib/Tools/gfortran.py +++ b/waflib/Tools/gfortran.py @@ -14,9 +14,9 @@ def 
find_gfortran(conf): @conf def gfortran_flags(conf): v=conf.env - v['FCFLAGS_fcshlib']=['-fPIC'] - v['FORTRANMODFLAG']=['-J',''] - v['FCFLAGS_DEBUG']=['-Werror'] + v.FCFLAGS_fcshlib=['-fPIC'] + v.FORTRANMODFLAG=['-J',''] + v.FCFLAGS_DEBUG=['-Werror'] @conf def gfortran_modifier_win32(conf): fc_config.fortran_modifier_win32(conf) @@ -28,7 +28,7 @@ def gfortran_modifier_darwin(conf): fc_config.fortran_modifier_darwin(conf) @conf def gfortran_modifier_platform(conf): - dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform() + dest_os=conf.env.DEST_OS or Utils.unversioned_sys_platform() gfortran_modifier_func=getattr(conf,'gfortran_modifier_'+dest_os,None) if gfortran_modifier_func: gfortran_modifier_func() @@ -37,8 +37,10 @@ def get_gfortran_version(conf,fc): version_re=re.compile(r"GNU\s*Fortran",re.I).search cmd=fc+['--version'] out,err=fc_config.getoutput(conf,cmd,stdin=False) - if out:match=version_re(out) - else:match=version_re(err) + if out: + match=version_re(out) + else: + match=version_re(err) if not match: conf.fatal('Could not determine the compiler type') cmd=fc+['-dM','-E','-'] @@ -58,7 +60,7 @@ def get_gfortran_version(conf,fc): return var in k def isT(var): return var in k and k[var]!='0' - conf.env['FC_VERSION']=(k['__GNUC__'],k['__GNUC_MINOR__'],k['__GNUC_PATCHLEVEL__']) + conf.env.FC_VERSION=(k['__GNUC__'],k['__GNUC_MINOR__'],k['__GNUC_PATCHLEVEL__']) def configure(conf): conf.find_gfortran() conf.find_ar() @@ -66,3 +68,4 @@ def configure(conf): conf.fc_add_flags() conf.gfortran_flags() conf.gfortran_modifier_platform() + conf.check_gfortran_o_space() diff --git a/waflib/Tools/glib2.py b/waflib/Tools/glib2.py index 47ee823..ba5a71e 100644 --- a/waflib/Tools/glib2.py +++ b/waflib/Tools/glib2.py @@ -3,6 +3,7 @@ # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file import os +import functools from waflib import Context,Task,Utils,Options,Errors,Logs from waflib.TaskGen import taskgen_method,before_method,feature,extension from waflib.Configure import conf @@ -25,19 +26,20 @@ def process_marshal(self): self.source=self.to_nodes(getattr(self,'source',[])) self.source.append(c_node) class glib_genmarshal(Task.Task): + vars=['GLIB_GENMARSHAL_PREFIX','GLIB_GENMARSHAL'] + color='BLUE' + ext_out=['.h'] def run(self): - bld=self.inputs[0].__class__.ctx + bld=self.generator.bld get=self.env.get_flat cmd1="%s %s --prefix=%s --header > %s"%(get('GLIB_GENMARSHAL'),self.inputs[0].srcpath(),get('GLIB_GENMARSHAL_PREFIX'),self.outputs[0].abspath()) ret=bld.exec_command(cmd1) - if ret:return ret + if ret: + return ret c='''#include "%s"\n'''%self.outputs[0].name self.outputs[1].write(c) cmd2="%s %s --prefix=%s --body >> %s"%(get('GLIB_GENMARSHAL'),self.inputs[0].srcpath(),get('GLIB_GENMARSHAL_PREFIX'),self.outputs[1].abspath()) return bld.exec_command(cmd2) - vars=['GLIB_GENMARSHAL_PREFIX','GLIB_GENMARSHAL'] - color='BLUE' - ext_out=['.h'] @taskgen_method def add_enums_from_template(self,source='',target='',template='',comments=''): if not hasattr(self,'enums_list'): @@ -61,13 +63,13 @@ def process_enums(self): raise Errors.WafError('missing source '+str(enum)) source_list=[self.path.find_resource(k)for k in source_list] inputs+=source_list - env['GLIB_MKENUMS_SOURCE']=[k.abspath()for k in source_list] + env.GLIB_MKENUMS_SOURCE=[k.abspath()for k in source_list] if not enum['target']: raise Errors.WafError('missing target '+str(enum)) tgt_node=self.path.find_or_declare(enum['target']) if tgt_node.name.endswith('.c'): self.source.append(tgt_node) - env['GLIB_MKENUMS_TARGET']=tgt_node.abspath() + env.GLIB_MKENUMS_TARGET=tgt_node.abspath() options=[] if enum['template']: template_node=self.path.find_resource(enum['template']) @@ -77,7 +79,7 @@ def process_enums(self): for param,option in 
params.items(): if enum[param]: options.append('%s %r'%(option,enum[param])) - env['GLIB_MKENUMS_OPTIONS']=' '.join(options) + env.GLIB_MKENUMS_OPTIONS=' '.join(options) task.set_inputs(inputs) task.set_outputs(tgt_node) class glib_mkenums(Task.Task): @@ -94,9 +96,9 @@ def add_settings_schemas(self,filename_list): @taskgen_method def add_settings_enums(self,namespace,filename_list): if hasattr(self,'settings_enum_namespace'): - raise Errors.WafError("Tried to add gsettings enums to '%s' more than once"%self.name) + raise Errors.WafError("Tried to add gsettings enums to %r more than once"%self.name) self.settings_enum_namespace=namespace - if type(filename_list)!='list': + if not isinstance(filename_list,list): filename_list=[filename_list] self.settings_enum_files=filename_list @feature('glib2') @@ -104,53 +106,63 @@ def process_settings(self): enums_tgt_node=[] install_files=[] settings_schema_files=getattr(self,'settings_schema_files',[]) - if settings_schema_files and not self.env['GLIB_COMPILE_SCHEMAS']: + if settings_schema_files and not self.env.GLIB_COMPILE_SCHEMAS: raise Errors.WafError("Unable to process GSettings schemas - glib-compile-schemas was not found during configure") if hasattr(self,'settings_enum_files'): enums_task=self.create_task('glib_mkenums') source_list=self.settings_enum_files source_list=[self.path.find_resource(k)for k in source_list] enums_task.set_inputs(source_list) - enums_task.env['GLIB_MKENUMS_SOURCE']=[k.abspath()for k in source_list] + enums_task.env.GLIB_MKENUMS_SOURCE=[k.abspath()for k in source_list] target=self.settings_enum_namespace+'.enums.xml' tgt_node=self.path.find_or_declare(target) enums_task.set_outputs(tgt_node) - enums_task.env['GLIB_MKENUMS_TARGET']=tgt_node.abspath() + enums_task.env.GLIB_MKENUMS_TARGET=tgt_node.abspath() enums_tgt_node=[tgt_node] install_files.append(tgt_node) options='--comments "<!-- @comment@ -->" --fhead "<schemalist>" --vhead " <@type@ id=\\"%s.@EnumName@\\">" --vprod " <value 
nick=\\"@valuenick@\\" value=\\"@valuenum@\\"/>" --vtail " </@type@>" --ftail "</schemalist>" '%(self.settings_enum_namespace) - enums_task.env['GLIB_MKENUMS_OPTIONS']=options + enums_task.env.GLIB_MKENUMS_OPTIONS=options for schema in settings_schema_files: schema_task=self.create_task('glib_validate_schema') schema_node=self.path.find_resource(schema) if not schema_node: - raise Errors.WafError("Cannot find the schema file '%s'"%schema) + raise Errors.WafError("Cannot find the schema file %r"%schema) install_files.append(schema_node) source_list=enums_tgt_node+[schema_node] schema_task.set_inputs(source_list) - schema_task.env['GLIB_COMPILE_SCHEMAS_OPTIONS']=[("--schema-file="+k.abspath())for k in source_list] + schema_task.env.GLIB_COMPILE_SCHEMAS_OPTIONS=[("--schema-file="+k.abspath())for k in source_list] target_node=schema_node.change_ext('.xml.valid') schema_task.set_outputs(target_node) - schema_task.env['GLIB_VALIDATE_SCHEMA_OUTPUT']=target_node.abspath() + schema_task.env.GLIB_VALIDATE_SCHEMA_OUTPUT=target_node.abspath() def compile_schemas_callback(bld): - if not bld.is_install:return - Logs.pprint('YELLOW','Updating GSettings schema cache') - command=Utils.subst_vars("${GLIB_COMPILE_SCHEMAS} ${GSETTINGSSCHEMADIR}",bld.env) - self.bld.exec_command(command) + if not bld.is_install: + return + compile_schemas=Utils.to_list(bld.env.GLIB_COMPILE_SCHEMAS) + destdir=Options.options.destdir + paths=bld._compile_schemas_registered + if destdir: + paths=(os.path.join(destdir,path.lstrip(os.sep))for path in paths) + for path in paths: + Logs.pprint('YELLOW','Updating GSettings schema cache %r'%path) + if self.bld.exec_command(compile_schemas+[path]): + Logs.warn('Could not update GSettings schema cache %r'%path) if self.bld.is_install: - if not self.env['GSETTINGSSCHEMADIR']: + schemadir=self.env.GSETTINGSSCHEMADIR + if not schemadir: raise Errors.WafError('GSETTINGSSCHEMADIR not defined (should have been set up automatically during configure)') if install_files: 
- self.bld.install_files(self.env['GSETTINGSSCHEMADIR'],install_files) - if not hasattr(self.bld,'_compile_schemas_registered'): + self.add_install_files(install_to=schemadir,install_from=install_files) + registered_schemas=getattr(self.bld,'_compile_schemas_registered',None) + if not registered_schemas: + registered_schemas=self.bld._compile_schemas_registered=set() self.bld.add_post_fun(compile_schemas_callback) - self.bld._compile_schemas_registered=True + registered_schemas.add(schemadir) class glib_validate_schema(Task.Task): run_str='rm -f ${GLIB_VALIDATE_SCHEMA_OUTPUT} && ${GLIB_COMPILE_SCHEMAS} --dry-run ${GLIB_COMPILE_SCHEMAS_OPTIONS} && touch ${GLIB_VALIDATE_SCHEMA_OUTPUT}' color='PINK' @extension('.gresource.xml') def process_gresource_source(self,node): - if not self.env['GLIB_COMPILE_RESOURCES']: + if not self.env.GLIB_COMPILE_RESOURCES: raise Errors.WafError("Unable to process GResource file - glib-compile-resources was not found during configure") if'gresource'in self.features: return @@ -165,18 +177,14 @@ def process_gresource_bundle(self): task=self.create_task('glib_gresource_bundle',node,node.change_ext('')) inst_to=getattr(self,'install_path',None) if inst_to: - self.bld.install_files(inst_to,task.outputs) + self.add_install_files(install_to=inst_to,install_from=task.outputs) class glib_gresource_base(Task.Task): color='BLUE' base_cmd='${GLIB_COMPILE_RESOURCES} --sourcedir=${SRC[0].parent.srcpath()} --sourcedir=${SRC[0].bld_dir()}' def scan(self): bld=self.generator.bld kw={} - try: - if not kw.get('cwd',None): - kw['cwd']=bld.cwd - except AttributeError: - bld.cwd=kw['cwd']=bld.variant_dir + kw['cwd']=self.get_cwd() kw['quiet']=Context.BOTH cmd=Utils.subst_vars('${GLIB_COMPILE_RESOURCES} --sourcedir=%s --sourcedir=%s --generate-dependencies %s'%(self.inputs[0].parent.srcpath(),self.inputs[0].bld_dir(),self.inputs[0].bldpath()),self.env) output=bld.cmd_and_log(cmd,**kw) @@ -217,10 +225,10 @@ def find_glib_compile_schemas(conf): if not 
gsettingsschemadir: datadir=getstr('DATADIR') if not datadir: - prefix=conf.env['PREFIX'] + prefix=conf.env.PREFIX datadir=os.path.join(prefix,'share') gsettingsschemadir=os.path.join(datadir,'glib-2.0','schemas') - conf.env['GSETTINGSSCHEMADIR']=gsettingsschemadir + conf.env.GSETTINGSSCHEMADIR=gsettingsschemadir @conf def find_glib_compile_resources(conf): conf.find_program('glib-compile-resources',var='GLIB_COMPILE_RESOURCES') diff --git a/waflib/Tools/gxx.py b/waflib/Tools/gxx.py index 332c953..1ba1393 100644 --- a/waflib/Tools/gxx.py +++ b/waflib/Tools/gxx.py @@ -12,81 +12,82 @@ def find_gxx(conf): @conf def gxx_common_flags(conf): v=conf.env - v['CXX_SRC_F']=[] - v['CXX_TGT_F']=['-c','-o'] - if not v['LINK_CXX']:v['LINK_CXX']=v['CXX'] - v['CXXLNK_SRC_F']=[] - v['CXXLNK_TGT_F']=['-o'] - v['CPPPATH_ST']='-I%s' - v['DEFINES_ST']='-D%s' - v['LIB_ST']='-l%s' - v['LIBPATH_ST']='-L%s' - v['STLIB_ST']='-l%s' - v['STLIBPATH_ST']='-L%s' - v['RPATH_ST']='-Wl,-rpath,%s' - v['SONAME_ST']='-Wl,-h,%s' - v['SHLIB_MARKER']='-Wl,-Bdynamic' - v['STLIB_MARKER']='-Wl,-Bstatic' - v['cxxprogram_PATTERN']='%s' - v['CXXFLAGS_cxxshlib']=['-fPIC'] - v['LINKFLAGS_cxxshlib']=['-shared'] - v['cxxshlib_PATTERN']='lib%s.so' - v['LINKFLAGS_cxxstlib']=['-Wl,-Bstatic'] - v['cxxstlib_PATTERN']='lib%s.a' - v['LINKFLAGS_MACBUNDLE']=['-bundle','-undefined','dynamic_lookup'] - v['CXXFLAGS_MACBUNDLE']=['-fPIC'] - v['macbundle_PATTERN']='%s.bundle' + v.CXX_SRC_F=[] + v.CXX_TGT_F=['-c','-o'] + if not v.LINK_CXX: + v.LINK_CXX=v.CXX + v.CXXLNK_SRC_F=[] + v.CXXLNK_TGT_F=['-o'] + v.CPPPATH_ST='-I%s' + v.DEFINES_ST='-D%s' + v.LIB_ST='-l%s' + v.LIBPATH_ST='-L%s' + v.STLIB_ST='-l%s' + v.STLIBPATH_ST='-L%s' + v.RPATH_ST='-Wl,-rpath,%s' + v.SONAME_ST='-Wl,-h,%s' + v.SHLIB_MARKER='-Wl,-Bdynamic' + v.STLIB_MARKER='-Wl,-Bstatic' + v.cxxprogram_PATTERN='%s' + v.CXXFLAGS_cxxshlib=['-fPIC'] + v.LINKFLAGS_cxxshlib=['-shared'] + v.cxxshlib_PATTERN='lib%s.so' + v.LINKFLAGS_cxxstlib=['-Wl,-Bstatic'] + 
v.cxxstlib_PATTERN='lib%s.a' + v.LINKFLAGS_MACBUNDLE=['-bundle','-undefined','dynamic_lookup'] + v.CXXFLAGS_MACBUNDLE=['-fPIC'] + v.macbundle_PATTERN='%s.bundle' @conf def gxx_modifier_win32(conf): v=conf.env - v['cxxprogram_PATTERN']='%s.exe' - v['cxxshlib_PATTERN']='%s.dll' - v['implib_PATTERN']='lib%s.dll.a' - v['IMPLIB_ST']='-Wl,--out-implib,%s' - v['CXXFLAGS_cxxshlib']=[] + v.cxxprogram_PATTERN='%s.exe' + v.cxxshlib_PATTERN='%s.dll' + v.implib_PATTERN='%s.dll.a' + v.IMPLIB_ST='-Wl,--out-implib,%s' + v.CXXFLAGS_cxxshlib=[] v.append_value('LINKFLAGS',['-Wl,--enable-auto-import']) @conf def gxx_modifier_cygwin(conf): gxx_modifier_win32(conf) v=conf.env - v['cxxshlib_PATTERN']='cyg%s.dll' + v.cxxshlib_PATTERN='cyg%s.dll' v.append_value('LINKFLAGS_cxxshlib',['-Wl,--enable-auto-image-base']) - v['CXXFLAGS_cxxshlib']=[] + v.CXXFLAGS_cxxshlib=[] @conf def gxx_modifier_darwin(conf): v=conf.env - v['CXXFLAGS_cxxshlib']=['-fPIC'] - v['LINKFLAGS_cxxshlib']=['-dynamiclib'] - v['cxxshlib_PATTERN']='lib%s.dylib' - v['FRAMEWORKPATH_ST']='-F%s' - v['FRAMEWORK_ST']=['-framework'] - v['ARCH_ST']=['-arch'] - v['LINKFLAGS_cxxstlib']=[] - v['SHLIB_MARKER']=[] - v['STLIB_MARKER']=[] - v['SONAME_ST']=[] + v.CXXFLAGS_cxxshlib=['-fPIC'] + v.LINKFLAGS_cxxshlib=['-dynamiclib'] + v.cxxshlib_PATTERN='lib%s.dylib' + v.FRAMEWORKPATH_ST='-F%s' + v.FRAMEWORK_ST=['-framework'] + v.ARCH_ST=['-arch'] + v.LINKFLAGS_cxxstlib=[] + v.SHLIB_MARKER=[] + v.STLIB_MARKER=[] + v.SONAME_ST=[] @conf def gxx_modifier_aix(conf): v=conf.env - v['LINKFLAGS_cxxprogram']=['-Wl,-brtl'] - v['LINKFLAGS_cxxshlib']=['-shared','-Wl,-brtl,-bexpfull'] - v['SHLIB_MARKER']=[] + v.LINKFLAGS_cxxprogram=['-Wl,-brtl'] + v.LINKFLAGS_cxxshlib=['-shared','-Wl,-brtl,-bexpfull'] + v.SHLIB_MARKER=[] @conf def gxx_modifier_hpux(conf): v=conf.env - v['SHLIB_MARKER']=[] - v['STLIB_MARKER']=[] - v['CFLAGS_cxxshlib']=['-fPIC','-DPIC'] - v['cxxshlib_PATTERN']='lib%s.sl' + v.SHLIB_MARKER=[] + v.STLIB_MARKER=[] + 
v.CFLAGS_cxxshlib=['-fPIC','-DPIC'] + v.cxxshlib_PATTERN='lib%s.sl' @conf def gxx_modifier_openbsd(conf): conf.env.SONAME_ST=[] @conf def gcc_modifier_osf1V(conf): v=conf.env - v['SHLIB_MARKER']=[] - v['STLIB_MARKER']=[] - v['SONAME_ST']=[] + v.SHLIB_MARKER=[] + v.STLIB_MARKER=[] + v.SONAME_ST=[] @conf def gxx_modifier_platform(conf): gxx_modifier_func=getattr(conf,'gxx_modifier_'+conf.env.DEST_OS,None) @@ -100,3 +101,4 @@ def configure(conf): conf.cxx_load_tools() conf.cxx_add_flags() conf.link_add_flags() + conf.check_gcc_o_space('cxx') diff --git a/waflib/Tools/icc.py b/waflib/Tools/icc.py index 9b7aa03..bffd7d1 100644 --- a/waflib/Tools/icc.py +++ b/waflib/Tools/icc.py @@ -7,8 +7,6 @@ from waflib.Tools import ccroot,ar,gcc from waflib.Configure import conf @conf def find_icc(conf): - if sys.platform=='cygwin': - conf.fatal('The Intel compiler does not work on Cygwin') cc=conf.find_program(['icc','ICL'],var='CC') conf.get_cc_version(cc,icc=True) conf.env.CC_NAME='icc' diff --git a/waflib/Tools/icpc.py b/waflib/Tools/icpc.py index 1e9b6cc..f7bdb0f 100644 --- a/waflib/Tools/icpc.py +++ b/waflib/Tools/icpc.py @@ -7,8 +7,6 @@ from waflib.Tools import ccroot,ar,gxx from waflib.Configure import conf @conf def find_icpc(conf): - if sys.platform=='cygwin': - conf.fatal('The Intel compiler does not work on Cygwin') cxx=conf.find_program('icpc',var='CXX') conf.get_cc_version(cxx,icc=True) conf.env.CXX_NAME='icc' diff --git a/waflib/Tools/ifort.py b/waflib/Tools/ifort.py index 41fa604..2cbae10 100644 --- a/waflib/Tools/ifort.py +++ b/waflib/Tools/ifort.py @@ -2,10 +2,11 @@ # encoding: utf-8 # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file -import re -from waflib import Utils -from waflib.Tools import fc,fc_config,fc_scan,ar +import os,re,traceback +from waflib import Utils,Logs,Errors +from waflib.Tools import fc,fc_config,fc_scan,ar,ccroot from waflib.Configure import conf +from waflib.TaskGen import after_method,feature @conf def find_ifort(conf): fc=conf.find_program('ifort',var='FC') @@ -54,10 +55,10 @@ def get_ifort_version(conf,fc): if not match: conf.fatal('cannot determine ifort version.') k=match.groupdict() - conf.env['FC_VERSION']=(k['major'],k['minor']) + conf.env.FC_VERSION=(k['major'],k['minor']) def configure(conf): if Utils.is_win32: - compiler,version,path,includes,libdirs,arch=conf.detect_ifort(True) + compiler,version,path,includes,libdirs,arch=conf.detect_ifort() v=conf.env v.DEST_CPU=arch v.PATH=path @@ -66,8 +67,7 @@ def configure(conf): v.MSVC_COMPILER=compiler try: v.MSVC_VERSION=float(version) - except Exception: - raise + except ValueError: v.MSVC_VERSION=float(version[:-3]) conf.find_ifort_win32() conf.ifort_modifier_win32() @@ -78,86 +78,73 @@ def configure(conf): conf.fc_flags() conf.fc_add_flags() conf.ifort_modifier_platform() -import os,sys,re,tempfile -from waflib import Task,Logs,Options,Errors -from waflib.Logs import debug,warn -from waflib.TaskGen import after_method,feature -from waflib.Configure import conf -from waflib.Tools import ccroot,ar,winres all_ifort_platforms=[('intel64','amd64'),('em64t','amd64'),('ia32','x86'),('Itanium','ia64')] @conf def gather_ifort_versions(conf,versions): version_pattern=re.compile('^...?.?\....?.?') try: all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Intel\\Compilers\\Fortran') - except WindowsError: + except OSError: try: all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Intel\\Compilers\\Fortran') - except WindowsError: + except OSError: return index=0 while 1: try: 
version=Utils.winreg.EnumKey(all_versions,index) - except WindowsError: + except OSError: break - index=index+1 + index+=1 if not version_pattern.match(version): continue - targets=[] + targets={} for target,arch in all_ifort_platforms: + if target=='intel64': + targetDir='EM64T_NATIVE' + else: + targetDir=target try: - if target=='intel64':targetDir='EM64T_NATIVE' - else:targetDir=target Utils.winreg.OpenKey(all_versions,version+'\\'+targetDir) icl_version=Utils.winreg.OpenKey(all_versions,version) path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') - batch_file=os.path.join(path,'bin','iclvars.bat') - if os.path.isfile(batch_file): - try: - targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) - except conf.errors.ConfigurationError: - pass - except WindowsError: + except OSError: pass + else: + batch_file=os.path.join(path,'bin','ifortvars.bat') + if os.path.isfile(batch_file): + targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) for target,arch in all_ifort_platforms: try: icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+target) path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') - batch_file=os.path.join(path,'bin','iclvars.bat') - if os.path.isfile(batch_file): - try: - targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) - except conf.errors.ConfigurationError: - pass - except WindowsError: + except OSError: continue + else: + batch_file=os.path.join(path,'bin','ifortvars.bat') + if os.path.isfile(batch_file): + targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) major=version[0:2] - versions.append(('intel '+major,targets)) -def setup_ifort(conf,versions,arch=False): - platforms=Utils.to_list(conf.env['MSVC_TARGETS'])or[i for i,j in all_ifort_platforms] - desired_versions=conf.env['MSVC_VERSIONS']or[v for v,_ in versions][::-1] - versiondict=dict(versions) + versions['intel '+major]=targets +@conf +def 
setup_ifort(conf,versiondict): + platforms=Utils.to_list(conf.env.MSVC_TARGETS)or[i for i,j in all_ifort_platforms] + desired_versions=conf.env.MSVC_VERSIONS or list(reversed(list(versiondict.keys()))) for version in desired_versions: try: - targets=dict(versiondict[version]) - for target in platforms: - try: - try: - realtarget,(p1,p2,p3)=targets[target] - except conf.errors.ConfigurationError: - del(targets[target]) - else: - compiler,revision=version.rsplit(' ',1) - if arch: - return compiler,revision,p1,p2,p3,realtarget - else: - return compiler,revision,p1,p2,p3 - except KeyError: - continue + targets=versiondict[version] except KeyError: continue - conf.fatal('msvc: Impossible to find a valid architecture for building (in setup_ifort)') + for arch in platforms: + try: + cfg=targets[arch] + except KeyError: + continue + cfg.evaluate() + if cfg.is_valid: + compiler,revision=version.rsplit(' ',1) + return compiler,revision,cfg.bindirs,cfg.incdirs,cfg.libdirs,cfg.cpu + conf.fatal('ifort: Impossible to find a valid architecture for building %r - %r'%(desired_versions,list(versiondict.keys()))) @conf def get_ifort_version_win32(conf,compiler,version,target,vcvars): try: @@ -188,7 +175,7 @@ echo LIB=%%LIB%%;%%LIBPATH%% elif line.startswith('LIB='): MSVC_LIBDIR=[i for i in line[4:].split(';')if i] if None in(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR): - conf.fatal('msvc: Could not find a valid architecture for building (get_ifort_version_win32)') + conf.fatal('ifort: Could not find a valid architecture for building (get_ifort_version_win32)') env=dict(os.environ) env.update(PATH=path) compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler) @@ -196,83 +183,58 @@ echo LIB=%%LIB%%;%%LIBPATH%% if'CL'in env: del(env['CL']) try: - try: - conf.cmd_and_log(fc+['/help'],env=env) - except UnicodeError: - st=Utils.ex_stack() - if conf.logger: - conf.logger.error(st) - conf.fatal('msvc: Unicode error - check the code page?') - except Exception ,e: - debug('msvc: 
get_ifort_version: %r %r %r -> failure %s'%(compiler,version,target,str(e))) - conf.fatal('msvc: cannot run the compiler in get_ifort_version (run with -v to display errors)') - else: - debug('msvc: get_ifort_version: %r %r %r -> OK',compiler,version,target) + conf.cmd_and_log(fc+['/help'],env=env) + except UnicodeError: + st=traceback.format_exc() + if conf.logger: + conf.logger.error(st) + conf.fatal('ifort: Unicode error - check the code page?') + except Exception as e: + Logs.debug('ifort: get_ifort_version: %r %r %r -> failure %s',compiler,version,target,str(e)) + conf.fatal('ifort: cannot run the compiler in get_ifort_version (run with -v to display errors)') + else: + Logs.debug('ifort: get_ifort_version: %r %r %r -> OK',compiler,version,target) finally: conf.env[compiler_name]='' return(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR) -def get_compiler_env(conf,compiler,version,bat_target,bat,select=None): - lazy=getattr(Options.options,'msvc_lazy',True) - if conf.env.MSVC_LAZY_AUTODETECT is False: - lazy=False - def msvc_thunk(): - vs=conf.get_ifort_version_win32(compiler,version,bat_target,bat) - if select: - return select(vs) - else: - return vs - return lazytup(msvc_thunk,lazy,([],[],[])) -class lazytup(object): - def __init__(self,fn,lazy=True,default=None): - self.fn=fn - self.default=default - if not lazy: - self.evaluate() - def __len__(self): - self.evaluate() - return len(self.value) - def __iter__(self): - self.evaluate() - for i,v in enumerate(self.value): - yield v - def __getitem__(self,i): - self.evaluate() - return self.value[i] - def __repr__(self): - if hasattr(self,'value'): - return repr(self.value) - elif self.default: - return repr(self.default) - else: - self.evaluate() - return repr(self.value) +class target_compiler(object): + def __init__(self,ctx,compiler,cpu,version,bat_target,bat,callback=None): + self.conf=ctx + self.name=None + self.is_valid=False + self.is_done=False + self.compiler=compiler + self.cpu=cpu + self.version=version + 
self.bat_target=bat_target + self.bat=bat + self.callback=callback def evaluate(self): - if hasattr(self,'value'): + if self.is_done: + return + self.is_done=True + try: + vs=self.conf.get_ifort_version_win32(self.compiler,self.version,self.bat_target,self.bat) + except Errors.ConfigurationError: + self.is_valid=False return - self.value=self.fn() + if self.callback: + vs=self.callback(self,vs) + self.is_valid=True + (self.bindirs,self.incdirs,self.libdirs)=vs + def __str__(self): + return str((self.bindirs,self.incdirs,self.libdirs)) + def __repr__(self): + return repr((self.bindirs,self.incdirs,self.libdirs)) @conf -def get_ifort_versions(conf,eval_and_save=True): - if conf.env['IFORT_INSTALLED_VERSIONS']: - return conf.env['IFORT_INSTALLED_VERSIONS'] - lst=[] - conf.gather_ifort_versions(lst) - if eval_and_save: - def checked_target(t): - target,(arch,paths)=t - try: - paths.evaluate() - except conf.errors.ConfigurationError: - return None - else: - return t - lst=[(version,list(filter(checked_target,targets)))for version,targets in lst] - conf.env['IFORT_INSTALLED_VERSIONS']=lst - return lst +def detect_ifort(self): + return self.setup_ifort(self.get_ifort_versions(False)) @conf -def detect_ifort(conf,arch=False): - versions=get_ifort_versions(conf,False) - return setup_ifort(conf,versions,arch) -def _get_prog_names(conf,compiler): +def get_ifort_versions(self,eval_and_save=True): + dct={} + self.gather_ifort_versions(dct) + return dct +def _get_prog_names(self,compiler): if compiler=='intel': compiler_name='ifort' linker_name='XILINK' @@ -285,29 +247,30 @@ def _get_prog_names(conf,compiler): @conf def find_ifort_win32(conf): v=conf.env - path=v['PATH'] - compiler=v['MSVC_COMPILER'] - version=v['MSVC_VERSION'] + path=v.PATH + compiler=v.MSVC_COMPILER + version=v.MSVC_VERSION compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler) v.IFORT_MANIFEST=(compiler=='intel'and version>=11) fc=conf.find_program(compiler_name,var='FC',path_list=path) 
env=dict(conf.environ) - if path:env.update(PATH=';'.join(path)) + if path: + env.update(PATH=';'.join(path)) if not conf.cmd_and_log(fc+['/nologo','/help'],env=env): conf.fatal('not intel fortran compiler could not be identified') - v['FC_NAME']='IFORT' - if not v['LINK_FC']: + v.FC_NAME='IFORT' + if not v.LINK_FC: conf.find_program(linker_name,var='LINK_FC',path_list=path,mandatory=True) - if not v['AR']: + if not v.AR: conf.find_program(lib_name,path_list=path,var='AR',mandatory=True) - v['ARFLAGS']=['/NOLOGO'] + v.ARFLAGS=['/nologo'] if v.IFORT_MANIFEST: conf.find_program('MT',path_list=path,var='MT') - v['MTFLAGS']=['/NOLOGO'] + v.MTFLAGS=['/nologo'] try: conf.load('winres') except Errors.WafError: - warn('Resource compiler not found. Compiling resource file is disabled') + Logs.warn('Resource compiler not found. Compiling resource file is disabled') @after_method('apply_link') @feature('fc') def apply_flags_ifort(self): @@ -326,7 +289,7 @@ def apply_flags_ifort(self): pdbnode=self.link_task.outputs[0].change_ext('.pdb') self.link_task.outputs.append(pdbnode) if getattr(self,'install_task',None): - self.pdb_install_task=self.bld.install_files(self.install_task.dest,pdbnode,env=self.env) + self.pdb_install_task=self.add_install_files(install_to=self.install_task.install_to,install_from=pdbnode) break @feature('fcprogram','fcshlib','fcprogram_test') @after_method('apply_link') @@ -337,97 +300,4 @@ def apply_manifest_ifort(self): out_node=self.link_task.outputs[0] man_node=out_node.parent.find_or_declare(out_node.name+'.manifest') self.link_task.outputs.append(man_node) - self.link_task.do_manifest=True -def exec_mf(self): - env=self.env - mtool=env['MT'] - if not mtool: - return 0 - self.do_manifest=False - outfile=self.outputs[0].abspath() - manifest=None - for out_node in self.outputs: - if out_node.name.endswith('.manifest'): - manifest=out_node.abspath() - break - if manifest is None: - return 0 - mode='' - if'fcprogram'in self.generator.features 
or'fcprogram_test'in self.generator.features: - mode='1' - elif'fcshlib'in self.generator.features: - mode='2' - debug('msvc: embedding manifest in mode %r'%mode) - lst=[]+mtool - lst.extend(Utils.to_list(env['MTFLAGS'])) - lst.extend(['-manifest',manifest]) - lst.append('-outputresource:%s;%s'%(outfile,mode)) - return self.exec_command(lst) -def quote_response_command(self,flag): - if flag.find(' ')>-1: - for x in('/LIBPATH:','/IMPLIB:','/OUT:','/I'): - if flag.startswith(x): - flag='%s"%s"'%(x,flag[len(x):]) - break - else: - flag='"%s"'%flag - return flag -def exec_response_command(self,cmd,**kw): - try: - tmp=None - if sys.platform.startswith('win')and isinstance(cmd,list)and len(' '.join(cmd))>=8192: - program=cmd[0] - cmd=[self.quote_response_command(x)for x in cmd] - (fd,tmp)=tempfile.mkstemp() - os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:])) - os.close(fd) - cmd=[program,'@'+tmp] - ret=super(self.__class__,self).exec_command(cmd,**kw) - finally: - if tmp: - try: - os.remove(tmp) - except OSError: - pass - return ret -def exec_command_ifort(self,*k,**kw): - if isinstance(k[0],list): - lst=[] - carry='' - for a in k[0]: - if a=='/Fo'or a=='/doc'or a[-1]==':': - carry=a - else: - lst.append(carry+a) - carry='' - k=[lst] - if self.env['PATH']: - env=dict(self.env.env or os.environ) - env.update(PATH=';'.join(self.env['PATH'])) - kw['env']=env - if not'cwd'in kw: - kw['cwd']=self.generator.bld.variant_dir - ret=self.exec_response_command(k[0],**kw) - if not ret and getattr(self,'do_manifest',None): - ret=self.exec_mf() - return ret -def wrap_class(class_name): - cls=Task.classes.get(class_name,None) - if not cls: - return None - derived_class=type(class_name,(cls,),{}) - def exec_command(self,*k,**kw): - if self.env.IFORT_WIN32: - return self.exec_command_ifort(*k,**kw) - else: - return super(derived_class,self).exec_command(*k,**kw) - derived_class.exec_command=exec_command - derived_class.exec_response_command=exec_response_command - 
derived_class.quote_response_command=quote_response_command - derived_class.exec_command_ifort=exec_command_ifort - derived_class.exec_mf=exec_mf - if hasattr(cls,'hcode'): - derived_class.hcode=cls.hcode - return derived_class -for k in'fc fcprogram fcprogram_test fcshlib fcstlib'.split(): - wrap_class(k) + self.env.DO_MANIFEST=True diff --git a/waflib/Tools/intltool.py b/waflib/Tools/intltool.py index c751e26..d799402 100644 --- a/waflib/Tools/intltool.py +++ b/waflib/Tools/intltool.py @@ -2,6 +2,7 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file +from __future__ import with_statement import os,re from waflib import Context,Task,Utils,Logs import waflib.Tools.ccroot @@ -19,8 +20,10 @@ def ensure_localedir(self): @before_method('process_source') @feature('intltool_in') def apply_intltool_in_f(self): - try:self.meths.remove('process_source') - except ValueError:pass + try: + self.meths.remove('process_source') + except ValueError: + pass self.ensure_localedir() podir=getattr(self,'podir','.') podirnode=self.path.find_dir(podir) @@ -45,23 +48,24 @@ def apply_intltool_in_f(self): task=self.create_task('intltool',node,node.change_ext('')) inst=getattr(self,'install_path',None) if inst: - self.bld.install_files(inst,task.outputs) + self.add_install_files(install_to=inst,install_from=task.outputs) @feature('intltool_po') def apply_intltool_po(self): - try:self.meths.remove('process_source') - except ValueError:pass + try: + self.meths.remove('process_source') + except ValueError: + pass self.ensure_localedir() appname=getattr(self,'appname',getattr(Context.g_module,Context.APPNAME,'set_your_app_name')) podir=getattr(self,'podir','.') inst=getattr(self,'install_path','${LOCALEDIR}') linguas=self.path.find_node(os.path.join(podir,'LINGUAS')) if linguas: - file=open(linguas.abspath()) - langs=[] - for line in file.readlines(): - if not line.startswith('#'): - langs+=line.split() - file.close() + with 
open(linguas.abspath())as f: + langs=[] + for line in f.readlines(): + if not line.startswith('#'): + langs+=line.split() re_linguas=re.compile('[-a-zA-Z_@.]+') for lang in langs: if re_linguas.match(lang): @@ -71,7 +75,7 @@ def apply_intltool_po(self): filename=task.outputs[0].name (langname,ext)=os.path.splitext(filename) inst_file=inst+os.sep+langname+os.sep+'LC_MESSAGES'+os.sep+appname+'.mo' - self.bld.install_as(inst_file,task.outputs[0],chmod=getattr(self,'chmod',Utils.O644),env=task.env) + self.add_install_as(install_to=inst_file,install_from=task.outputs[0],chmod=getattr(self,'chmod',Utils.O644)) else: Logs.pprint('RED',"Error no LINGUAS file found in po directory") class po(Task.Task): diff --git a/waflib/Tools/irixcc.py b/waflib/Tools/irixcc.py index 74a36cf..06099ff 100644 --- a/waflib/Tools/irixcc.py +++ b/waflib/Tools/irixcc.py @@ -2,39 +2,45 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file +from waflib import Errors from waflib.Tools import ccroot,ar from waflib.Configure import conf @conf def find_irixcc(conf): v=conf.env cc=None - if v['CC']:cc=v['CC'] - elif'CC'in conf.environ:cc=conf.environ['CC'] - if not cc:cc=conf.find_program('cc',var='CC') - if not cc:conf.fatal('irixcc was not found') + if v.CC: + cc=v.CC + elif'CC'in conf.environ: + cc=conf.environ['CC'] + if not cc: + cc=conf.find_program('cc',var='CC') + if not cc: + conf.fatal('irixcc was not found') try: conf.cmd_and_log(cc+['-version']) - except Exception: + except Errors.WafError: conf.fatal('%r -version could not be executed'%cc) - v['CC']=cc - v['CC_NAME']='irix' + v.CC=cc + v.CC_NAME='irix' @conf def irixcc_common_flags(conf): v=conf.env - v['CC_SRC_F']='' - v['CC_TGT_F']=['-c','-o'] - v['CPPPATH_ST']='-I%s' - v['DEFINES_ST']='-D%s' - if not v['LINK_CC']:v['LINK_CC']=v['CC'] - v['CCLNK_SRC_F']='' - v['CCLNK_TGT_F']=['-o'] - v['LIB_ST']='-l%s' - v['LIBPATH_ST']='-L%s' - v['STLIB_ST']='-l%s' - v['STLIBPATH_ST']='-L%s' - 
v['cprogram_PATTERN']='%s' - v['cshlib_PATTERN']='lib%s.so' - v['cstlib_PATTERN']='lib%s.a' + v.CC_SRC_F='' + v.CC_TGT_F=['-c','-o'] + v.CPPPATH_ST='-I%s' + v.DEFINES_ST='-D%s' + if not v.LINK_CC: + v.LINK_CC=v.CC + v.CCLNK_SRC_F='' + v.CCLNK_TGT_F=['-o'] + v.LIB_ST='-l%s' + v.LIBPATH_ST='-L%s' + v.STLIB_ST='-l%s' + v.STLIBPATH_ST='-L%s' + v.cprogram_PATTERN='%s' + v.cshlib_PATTERN='lib%s.so' + v.cstlib_PATTERN='lib%s.a' def configure(conf): conf.find_irixcc() conf.find_cpp() diff --git a/waflib/Tools/javaw.py b/waflib/Tools/javaw.py index 585cce7..adb4e04 100644 --- a/waflib/Tools/javaw.py +++ b/waflib/Tools/javaw.py @@ -2,10 +2,10 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file -import os,tempfile,shutil -from waflib import Task,Utils,Errors,Node,Logs +import os,shutil +from waflib import Task,Utils,Errors,Node from waflib.Configure import conf -from waflib.TaskGen import feature,before_method,after_method +from waflib.TaskGen import feature,before_method,after_method,taskgen_method from waflib.Tools import ccroot ccroot.USELIB_VARS['javac']=set(['CLASSPATH','JAVACFLAGS']) SOURCE_RE='**/*.java' @@ -41,7 +41,7 @@ def apply_java(self): outdir=self.path.get_bld() outdir.mkdir() self.outdir=outdir - self.env['OUTDIR']=outdir.abspath() + self.env.OUTDIR=outdir.abspath() self.javac_task=tsk=self.create_task('javac') tmp=[] srcdir=getattr(self,'srcdir','') @@ -57,7 +57,7 @@ def apply_java(self): tmp.append(y) tsk.srcdir=tmp if getattr(self,'compat',None): - tsk.env.append_value('JAVACFLAGS',['-source',self.compat]) + tsk.env.append_value('JAVACFLAGS',['-source',str(self.compat)]) if hasattr(self,'sourcepath'): fold=[isinstance(x,Node.Node)and x or self.path.find_dir(x)for x in self.to_list(self.sourcepath)] names=os.pathsep.join([x.srcpath()for x in fold]) @@ -65,32 +65,52 @@ def apply_java(self): names=[x.srcpath()for x in tsk.srcdir] if names: tsk.env.append_value('JAVACFLAGS',['-sourcepath',names]) +@taskgen_method 
+def java_use_rec(self,name,**kw): + if name in self.tmp_use_seen: + return + self.tmp_use_seen.append(name) + try: + y=self.bld.get_tgen_by_name(name) + except Errors.WafError: + self.uselib.append(name) + return + else: + y.post() + if hasattr(y,'jar_task'): + self.use_lst.append(y.jar_task.outputs[0].abspath()) + for x in self.to_list(getattr(y,'use',[])): + self.java_use_rec(x) @feature('javac') +@before_method('propagate_uselib_vars') @after_method('apply_java') def use_javac_files(self): - lst=[] + self.use_lst=[] + self.tmp_use_seen=[] self.uselib=self.to_list(getattr(self,'uselib',[])) names=self.to_list(getattr(self,'use',[])) get=self.bld.get_tgen_by_name for x in names: try: y=get(x) - except Exception: + except Errors.WafError: self.uselib.append(x) else: y.post() if hasattr(y,'jar_task'): - lst.append(y.jar_task.outputs[0].abspath()) + self.use_lst.append(y.jar_task.outputs[0].abspath()) self.javac_task.set_run_after(y.jar_task) else: for tsk in y.tasks: self.javac_task.set_run_after(tsk) - if lst: - self.env.append_value('CLASSPATH',lst) + if getattr(self,'recurse_use',False)or self.bld.env.RECURSE_JAVA: + self.java_use_rec(x) + self.env.append_value('CLASSPATH',self.use_lst) @feature('javac') @after_method('apply_java','propagate_uselib_vars','use_javac_files') def set_classpath(self): - self.env.append_value('CLASSPATH',getattr(self,'classpath',[])) + if getattr(self,'classpath',None): + self.env.append_unique('CLASSPATH',getattr(self,'classpath',[])) for x in self.tasks: x.env.CLASSPATH=os.pathsep.join(self.env.CLASSPATH)+os.pathsep @feature('jar') @@ -112,9 +132,11 @@ def jar_files(self): if manifest: jarcreate=getattr(self,'jarcreate','cfm') if not isinstance(manifest,Node.Node): - node=self.path.find_or_declare(manifest) + node=self.path.find_resource(manifest) else: node=manifest + if not node: + self.bld.fatal('invalid manifest file %r for %r'%(manifest,self)) tsk.dep_nodes.append(node) jaropts.insert(0,node.abspath()) else: @@ -128,8 +150,8 
@@ def jar_files(self): jaropts.append('-C') jaropts.append(basedir.bldpath()) jaropts.append('.') - tsk.env['JAROPTS']=jaropts - tsk.env['JARCREATE']=jarcreate + tsk.env.JAROPTS=jaropts + tsk.env.JARCREATE=jarcreate if getattr(self,'javac_task',None): tsk.set_run_after(self.javac_task) @feature('jar') @@ -141,12 +163,22 @@ def use_jar_files(self): for x in names: try: y=get(x) - except Exception: + except Errors.WafError: self.uselib.append(x) else: y.post() self.jar_task.run_after.update(y.tasks) -class jar_create(Task.Task): +class JTask(Task.Task): + def split_argfile(self,cmd): + inline=[cmd[0]] + infile=[] + for x in cmd[1:]: + if x.startswith('-J'): + inline.append(x) + else: + infile.append(self.quote_flag(x)) + return(inline,infile) +class jar_create(JTask): color='GREEN' run_str='${JAR} ${JARCREATE} ${TGT} ${JAROPTS}' def runnable_status(self): @@ -154,14 +186,14 @@ class jar_create(Task.Task): if not t.hasrun: return Task.ASK_LATER if not self.inputs: - global JAR_RE try: self.inputs=[x for x in self.basedir.ant_glob(JAR_RE,remove=False)if id(x)!=id(self.outputs[0])] except Exception: raise Errors.WafError('Could not find the basedir %r for %r'%(self.basedir,self)) return super(jar_create,self).runnable_status() -class javac(Task.Task): +class javac(JTask): color='BLUE' + run_str='${JAVAC} -classpath ${CLASSPATH} -d ${OUTDIR} ${JAVACFLAGS} ${SRC}' vars=['CLASSPATH','JAVACFLAGS','JAVAC','OUTDIR'] def uid(self): lst=[self.__class__.__name__,self.generator.outdir.abspath()] @@ -173,49 +205,14 @@ class javac(Task.Task): if not t.hasrun: return Task.ASK_LATER if not self.inputs: - global SOURCE_RE self.inputs=[] for x in self.srcdir: - self.inputs.extend(x.ant_glob(SOURCE_RE,remove=False)) + if x.exists(): + self.inputs.extend(x.ant_glob(SOURCE_RE,remove=False)) return super(javac,self).runnable_status() - def run(self): - env=self.env - gen=self.generator - bld=gen.bld - wd=bld.bldnode.abspath() - def to_list(xx): - if isinstance(xx,str):return[xx] - return 
xx - cmd=[] - cmd.extend(to_list(env['JAVAC'])) - cmd.extend(['-classpath']) - cmd.extend(to_list(env['CLASSPATH'])) - cmd.extend(['-d']) - cmd.extend(to_list(env['OUTDIR'])) - cmd.extend(to_list(env['JAVACFLAGS'])) - files=[a.path_from(bld.bldnode)for a in self.inputs] - tmp=None - try: - if len(str(files))+len(str(cmd))>8192: - (fd,tmp)=tempfile.mkstemp(dir=bld.bldnode.abspath()) - try: - os.write(fd,'\n'.join(files)) - finally: - if tmp: - os.close(fd) - if Logs.verbose: - Logs.debug('runner: %r'%(cmd+files)) - cmd.append('@'+tmp) - else: - cmd+=files - ret=self.exec_command(cmd,cwd=wd,env=env.env or None) - finally: - if tmp: - os.remove(tmp) - return ret def post_run(self): - for n in self.generator.outdir.ant_glob('**/*.class'): - n.sig=Utils.h_file(n.abspath()) + for node in self.generator.outdir.ant_glob('**/*.class'): + self.generator.bld.node_sigs[node]=self.uid() self.generator.bld.task_sigs[self.uid()]=self.cache_sig @feature('javadoc') @after_method('process_rule') @@ -232,7 +229,7 @@ class javadoc(Task.Task): def run(self): env=self.env bld=self.generator.bld - wd=bld.bldnode.abspath() + wd=bld.bldnode srcpath=self.generator.path.abspath()+os.sep+self.generator.srcdir srcpath+=os.pathsep srcpath+=self.generator.path.get_bld().abspath()+os.sep+self.generator.srcdir @@ -241,7 +238,7 @@ class javadoc(Task.Task): classpath+=os.pathsep.join(self.classpath) classpath="".join(classpath) self.last_cmd=lst=[] - lst.extend(Utils.to_list(env['JAVADOC'])) + lst.extend(Utils.to_list(env.JAVADOC)) lst.extend(['-d',self.generator.javadoc_output.abspath()]) lst.extend(['-sourcepath',srcpath]) lst.extend(['-classpath',classpath]) @@ -251,36 +248,38 @@ class javadoc(Task.Task): self.generator.bld.cmd_and_log(lst,cwd=wd,env=env.env or None,quiet=0) def post_run(self): nodes=self.generator.javadoc_output.ant_glob('**') - for x in nodes: - x.sig=Utils.h_file(x.abspath()) + for node in nodes: + self.generator.bld.node_sigs[node]=self.uid() 
self.generator.bld.task_sigs[self.uid()]=self.cache_sig def configure(self): java_path=self.environ['PATH'].split(os.pathsep) v=self.env if'JAVA_HOME'in self.environ: java_path=[os.path.join(self.environ['JAVA_HOME'],'bin')]+java_path - self.env['JAVA_HOME']=[self.environ['JAVA_HOME']] + self.env.JAVA_HOME=[self.environ['JAVA_HOME']] for x in'javac java jar javadoc'.split(): self.find_program(x,var=x.upper(),path_list=java_path) if'CLASSPATH'in self.environ: - v['CLASSPATH']=self.environ['CLASSPATH'] - if not v['JAR']:self.fatal('jar is required for making java packages') - if not v['JAVAC']:self.fatal('javac is required for compiling java classes') - v['JARCREATE']='cf' - v['JAVACFLAGS']=[] + v.CLASSPATH=self.environ['CLASSPATH'] + if not v.JAR: + self.fatal('jar is required for making java packages') + if not v.JAVAC: + self.fatal('javac is required for compiling java classes') + v.JARCREATE='cf' + v.JAVACFLAGS=[] @conf def check_java_class(self,classname,with_classpath=None): javatestdir='.waf-javatest' classpath=javatestdir - if self.env['CLASSPATH']: - classpath+=os.pathsep+self.env['CLASSPATH'] + if self.env.CLASSPATH: + classpath+=os.pathsep+self.env.CLASSPATH if isinstance(with_classpath,str): classpath+=os.pathsep+with_classpath shutil.rmtree(javatestdir,True) os.mkdir(javatestdir) Utils.writef(os.path.join(javatestdir,'Test.java'),class_check_source) - self.exec_command(self.env['JAVAC']+[os.path.join(javatestdir,'Test.java')],shell=False) - cmd=self.env['JAVA']+['-cp',classpath,'Test',classname] + self.exec_command(self.env.JAVAC+[os.path.join(javatestdir,'Test.java')],shell=False) + cmd=self.env.JAVA+['-cp',classpath,'Test',classname] self.to_log("%s\n"%str(cmd)) found=self.exec_command(cmd,shell=False) self.msg('Checking for java class %s'%classname,not found) @@ -292,7 +291,7 @@ def check_jni_headers(conf): conf.fatal('load a compiler first (gcc, g++, ..)') if not conf.env.JAVA_HOME: conf.fatal('set JAVA_HOME in the system environment') - 
javaHome=conf.env['JAVA_HOME'][0] + javaHome=conf.env.JAVA_HOME[0] dir=conf.root.find_dir(conf.env.JAVA_HOME[0]+'/include') if dir is None: dir=conf.root.find_dir(conf.env.JAVA_HOME[0]+'/../Headers') diff --git a/waflib/Tools/kde4.py b/waflib/Tools/kde4.py deleted file mode 100644 index 3e90377..0000000 --- a/waflib/Tools/kde4.py +++ /dev/null @@ -1,48 +0,0 @@ -#! /usr/bin/env python -# encoding: utf-8 -# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file - -import os,re -from waflib import Task,Utils -from waflib.TaskGen import feature -@feature('msgfmt') -def apply_msgfmt(self): - for lang in self.to_list(self.langs): - node=self.path.find_resource(lang+'.po') - task=self.create_task('msgfmt',node,node.change_ext('.mo')) - langname=lang.split('/') - langname=langname[-1] - inst=getattr(self,'install_path','${KDE4_LOCALE_INSTALL_DIR}') - self.bld.install_as(inst+os.sep+langname+os.sep+'LC_MESSAGES'+os.sep+getattr(self,'appname','set_your_appname')+'.mo',task.outputs[0],chmod=getattr(self,'chmod',Utils.O644)) -class msgfmt(Task.Task): - color='BLUE' - run_str='${MSGFMT} ${SRC} -o ${TGT}' -def configure(self): - kdeconfig=self.find_program('kde4-config') - prefix=self.cmd_and_log(kdeconfig+['--prefix']).strip() - fname='%s/share/apps/cmake/modules/KDELibsDependencies.cmake'%prefix - try:os.stat(fname) - except OSError: - fname='%s/share/kde4/apps/cmake/modules/KDELibsDependencies.cmake'%prefix - try:os.stat(fname) - except OSError:self.fatal('could not open %s'%fname) - try: - txt=Utils.readf(fname) - except EnvironmentError: - self.fatal('could not read %s'%fname) - txt=txt.replace('\\\n','\n') - fu=re.compile('#(.*)\n') - txt=fu.sub('',txt) - setregexp=re.compile('([sS][eE][tT]\s*\()\s*([^\s]+)\s+\"([^"]+)\"\)') - found=setregexp.findall(txt) - for(_,key,val)in found: - self.env[key]=val - self.env['LIB_KDECORE']=['kdecore'] - self.env['LIB_KDEUI']=['kdeui'] - self.env['LIB_KIO']=['kio'] - self.env['LIB_KHTML']=['khtml'] - 
self.env['LIB_KPARTS']=['kparts'] - self.env['LIBPATH_KDECORE']=[os.path.join(self.env.KDE4_LIB_INSTALL_DIR,'kde4','devel'),self.env.KDE4_LIB_INSTALL_DIR] - self.env['INCLUDES_KDECORE']=[self.env['KDE4_INCLUDE_INSTALL_DIR']] - self.env.append_value('INCLUDES_KDECORE',[self.env['KDE4_INCLUDE_INSTALL_DIR']+os.sep+'KDE']) - self.find_program('msgfmt',var='MSGFMT') diff --git a/waflib/Tools/ldc2.py b/waflib/Tools/ldc2.py index 75162e4..40d435e 100644 --- a/waflib/Tools/ldc2.py +++ b/waflib/Tools/ldc2.py @@ -13,21 +13,21 @@ def find_ldc2(conf): @conf def common_flags_ldc2(conf): v=conf.env - v['D_SRC_F']=['-c'] - v['D_TGT_F']='-of%s' - v['D_LINKER']=v['D'] - v['DLNK_SRC_F']='' - v['DLNK_TGT_F']='-of%s' - v['DINC_ST']='-I%s' - v['DSHLIB_MARKER']=v['DSTLIB_MARKER']='' - v['DSTLIB_ST']=v['DSHLIB_ST']='-L-l%s' - v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L-L%s' - v['LINKFLAGS_dshlib']=['-L-shared'] - v['DHEADER_ext']='.di' - v['DFLAGS_d_with_header']=['-H','-Hf'] - v['D_HDR_F']='%s' - v['LINKFLAGS']=[] - v['DFLAGS_dshlib']=['-relocation-model=pic'] + v.D_SRC_F=['-c'] + v.D_TGT_F='-of%s' + v.D_LINKER=v.D + v.DLNK_SRC_F='' + v.DLNK_TGT_F='-of%s' + v.DINC_ST='-I%s' + v.DSHLIB_MARKER=v.DSTLIB_MARKER='' + v.DSTLIB_ST=v.DSHLIB_ST='-L-l%s' + v.DSTLIBPATH_ST=v.DLIBPATH_ST='-L-L%s' + v.LINKFLAGS_dshlib=['-L-shared'] + v.DHEADER_ext='.di' + v.DFLAGS_d_with_header=['-H','-Hf'] + v.D_HDR_F='%s' + v.LINKFLAGS=[] + v.DFLAGS_dshlib=['-relocation-model=pic'] def configure(conf): conf.find_ldc2() conf.load('ar') diff --git a/waflib/Tools/lua.py b/waflib/Tools/lua.py index b801d5f..7c6a682 100644 --- a/waflib/Tools/lua.py +++ b/waflib/Tools/lua.py @@ -9,7 +9,7 @@ def add_lua(self,node): tsk=self.create_task('luac',node,node.change_ext('.luac')) inst_to=getattr(self,'install_path',self.env.LUADIR and'${LUADIR}'or None) if inst_to: - self.bld.install_files(inst_to,tsk.outputs) + self.add_install_files(install_to=inst_to,install_from=tsk.outputs) return tsk class luac(Task.Task): run_str='${LUAC} -s 
-o ${TGT} ${SRC}' diff --git a/waflib/Tools/md5_tstamp.py b/waflib/Tools/md5_tstamp.py new file mode 100644 index 0000000..0d0faa0 --- /dev/null +++ b/waflib/Tools/md5_tstamp.py @@ -0,0 +1,24 @@ +#! /usr/bin/env python +# encoding: utf-8 +# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file + +import os,stat +from waflib import Utils,Build,Node +STRONGEST=True +Build.SAVED_ATTRS.append('hashes_md5_tstamp') +def h_file(self): + filename=self.abspath() + st=os.stat(filename) + cache=self.ctx.hashes_md5_tstamp + if filename in cache and cache[filename][0]==st.st_mtime: + return cache[filename][1] + if STRONGEST: + ret=Utils.h_file(filename) + else: + if stat.S_ISDIR(st[stat.ST_MODE]): + raise IOError('Not a file') + ret=Utils.md5(str((st.st_mtime,st.st_size)).encode()).digest() + cache[filename]=(st.st_mtime,ret) + return ret +h_file.__doc__=Node.Node.h_file.__doc__ +Node.Node.h_file=h_file diff --git a/waflib/Tools/msvc.py b/waflib/Tools/msvc.py index 919c825..662fa61 100644 --- a/waflib/Tools/msvc.py +++ b/waflib/Tools/msvc.py @@ -2,12 +2,11 @@ # encoding: utf-8 # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file -import os,sys,re,tempfile -from waflib import Utils,Task,Logs,Options,Errors -from waflib.Logs import debug,warn +import os,sys,re,traceback +from waflib import Utils,Logs,Options,Errors from waflib.TaskGen import after_method,feature from waflib.Configure import conf -from waflib.Tools import ccroot,c,cxx,ar,winres +from waflib.Tools import ccroot,c,cxx,ar g_msvc_systemlibs=''' aclui activeds ad1 adptif adsiid advapi32 asycfilt authz bhsupp bits bufferoverflowu cabinet cap certadm certidl ciuuid clusapi comctl32 comdlg32 comsupp comsuppd comsuppw comsuppwd comsvcs @@ -28,42 +27,56 @@ traffic unicows url urlmon user32 userenv usp10 uuid uxtheme vcomp vcompd vdmdbg version vfw32 wbemuuid webpost wiaguid wininet winmm winscard winspool winstrm wintrust wldap32 wmiutils wow32 ws2_32 wsnmp32 wsock32 wst wtsapi32 xaswitch xolehlp '''.split() -all_msvc_platforms=[('x64','amd64'),('x86','x86'),('ia64','ia64'),('x86_amd64','amd64'),('x86_ia64','ia64'),('x86_arm','arm'),('amd64_x86','x86'),('amd64_arm','arm')] +all_msvc_platforms=[('x64','amd64'),('x86','x86'),('ia64','ia64'),('x86_amd64','amd64'),('x86_ia64','ia64'),('x86_arm','arm'),('x86_arm64','arm64'),('amd64_x86','x86'),('amd64_arm','arm'),('amd64_arm64','arm64')] all_wince_platforms=[('armv4','arm'),('armv4i','arm'),('mipsii','mips'),('mipsii_fp','mips'),('mipsiv','mips'),('mipsiv_fp','mips'),('sh4','sh'),('x86','cex86')] all_icl_platforms=[('intel64','amd64'),('em64t','amd64'),('ia32','x86'),('Itanium','ia64')] def options(opt): opt.add_option('--msvc_version',type='string',help='msvc version, eg: "msvc 10.0,msvc 9.0"',default='') opt.add_option('--msvc_targets',type='string',help='msvc targets, eg: "x64,arm"',default='') - opt.add_option('--msvc_lazy_autodetect',action='store_true',help='lazily check msvc target environments') -def setup_msvc(conf,versions,arch=False): + opt.add_option('--no-msvc-lazy',action='store_false',help='lazily check msvc target 
environments',default=True,dest='msvc_lazy') +@conf +def setup_msvc(conf,versiondict): platforms=getattr(Options.options,'msvc_targets','').split(',') if platforms==['']: - platforms=Utils.to_list(conf.env['MSVC_TARGETS'])or[i for i,j in all_msvc_platforms+all_icl_platforms+all_wince_platforms] + platforms=Utils.to_list(conf.env.MSVC_TARGETS)or[i for i,j in all_msvc_platforms+all_icl_platforms+all_wince_platforms] desired_versions=getattr(Options.options,'msvc_version','').split(',') if desired_versions==['']: - desired_versions=conf.env['MSVC_VERSIONS']or[v for v,_ in versions][::-1] - versiondict=dict(versions) + desired_versions=conf.env.MSVC_VERSIONS or list(reversed(sorted(versiondict.keys()))) + lazy_detect=getattr(Options.options,'msvc_lazy',True) + if conf.env.MSVC_LAZY_AUTODETECT is False: + lazy_detect=False + if not lazy_detect: + for val in versiondict.values(): + for arch in list(val.keys()): + cfg=val[arch] + cfg.evaluate() + if not cfg.is_valid: + del val[arch] + conf.env.MSVC_INSTALLED_VERSIONS=versiondict for version in desired_versions: + Logs.debug('msvc: detecting %r - %r',version,desired_versions) try: - targets=dict(versiondict[version]) - for target in platforms: - try: - try: - realtarget,(p1,p2,p3)=targets[target] - except conf.errors.ConfigurationError: - del(targets[target]) - else: - compiler,revision=version.rsplit(' ',1) - if arch: - return compiler,revision,p1,p2,p3,realtarget - else: - return compiler,revision,p1,p2,p3 - except KeyError:continue - except KeyError:continue - conf.fatal('msvc: Impossible to find a valid architecture for building (in setup_msvc)') + targets=versiondict[version] + except KeyError: + continue + seen=set() + for arch in platforms: + if arch in seen: + continue + else: + seen.add(arch) + try: + cfg=targets[arch] + except KeyError: + continue + cfg.evaluate() + if cfg.is_valid: + compiler,revision=version.rsplit(' ',1) + return compiler,revision,cfg.bindirs,cfg.incdirs,cfg.libdirs,cfg.cpu + conf.fatal('msvc: 
Impossible to find a valid architecture for building %r - %r'%(desired_versions,list(versiondict.keys()))) @conf def get_msvc_version(conf,compiler,version,target,vcvars): - debug('msvc: get_msvc_version: %r %r %r',compiler,version,target) + Logs.debug('msvc: get_msvc_version: %r %r %r',compiler,version,target) try: conf.msvc_cnt+=1 except AttributeError: @@ -99,80 +112,47 @@ echo LIB=%%LIB%%;%%LIBPATH%% if'CL'in env: del(env['CL']) try: - try: - conf.cmd_and_log(cxx+['/help'],env=env) - except UnicodeError: - st=Utils.ex_stack() - if conf.logger: - conf.logger.error(st) - conf.fatal('msvc: Unicode error - check the code page?') - except Exception ,e: - debug('msvc: get_msvc_version: %r %r %r -> failure %s'%(compiler,version,target,str(e))) - conf.fatal('msvc: cannot run the compiler in get_msvc_version (run with -v to display errors)') - else: - debug('msvc: get_msvc_version: %r %r %r -> OK',compiler,version,target) + conf.cmd_and_log(cxx+['/help'],env=env) + except UnicodeError: + st=traceback.format_exc() + if conf.logger: + conf.logger.error(st) + conf.fatal('msvc: Unicode error - check the code page?') + except Exception as e: + Logs.debug('msvc: get_msvc_version: %r %r %r -> failure %s',compiler,version,target,str(e)) + conf.fatal('msvc: cannot run the compiler in get_msvc_version (run with -v to display errors)') + else: + Logs.debug('msvc: get_msvc_version: %r %r %r -> OK',compiler,version,target) finally: conf.env[compiler_name]='' return(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR) -@conf -def gather_wsdk_versions(conf,versions): - version_pattern=re.compile('^v..?.?\...?.?') - try: - all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Microsoft\\Microsoft SDKs\\Windows') - except WindowsError: - try: - all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Microsoft\\Microsoft SDKs\\Windows') - except WindowsError: - return - index=0 - while 1: - try: - version=Utils.winreg.EnumKey(all_versions,index) - 
except WindowsError: - break - index=index+1 - if not version_pattern.match(version): - continue - try: - msvc_version=Utils.winreg.OpenKey(all_versions,version) - path,type=Utils.winreg.QueryValueEx(msvc_version,'InstallationFolder') - except WindowsError: - continue - if path and os.path.isfile(os.path.join(path,'bin','SetEnv.cmd')): - targets=[] - for target,arch in all_msvc_platforms: - try: - targets.append((target,(arch,get_compiler_env(conf,'wsdk',version,'/'+target,os.path.join(path,'bin','SetEnv.cmd'))))) - except conf.errors.ConfigurationError: - pass - versions.append(('wsdk '+version[1:],targets)) def gather_wince_supported_platforms(): supported_wince_platforms=[] try: ce_sdk=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Microsoft\\Windows CE Tools\\SDKs') - except WindowsError: + except OSError: try: ce_sdk=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Microsoft\\Windows CE Tools\\SDKs') - except WindowsError: + except OSError: ce_sdk='' if not ce_sdk: return supported_wince_platforms - ce_index=0 + index=0 while 1: try: - sdk_device=Utils.winreg.EnumKey(ce_sdk,ce_index) - except WindowsError: + sdk_device=Utils.winreg.EnumKey(ce_sdk,index) + sdk=Utils.winreg.OpenKey(ce_sdk,sdk_device) + except OSError: break - ce_index=ce_index+1 - sdk=Utils.winreg.OpenKey(ce_sdk,sdk_device) + index+=1 try: path,type=Utils.winreg.QueryValueEx(sdk,'SDKRootDir') - except WindowsError: + except OSError: try: path,type=Utils.winreg.QueryValueEx(sdk,'SDKInformation') - path,xml=os.path.split(path) - except WindowsError: + except OSError: continue + path,xml=os.path.split(path) path=str(path) path,device=os.path.split(path) if not device: @@ -188,94 +168,109 @@ def gather_msvc_detected_versions(): version_pattern=re.compile('^(\d\d?\.\d\d?)(Exp)?$') detected_versions=[] for vcver,vcvar in(('VCExpress','Exp'),('VisualStudio','')): + prefix='SOFTWARE\\Wow6432node\\Microsoft\\'+vcver try: - 
 			prefix='SOFTWARE\\Wow6432node\\Microsoft\\'+vcver
 			all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,prefix)
-		except WindowsError:
+		except OSError:
+			prefix='SOFTWARE\\Microsoft\\'+vcver
 			try:
-				prefix='SOFTWARE\\Microsoft\\'+vcver
 				all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,prefix)
-			except WindowsError:
+			except OSError:
 				continue
 		index=0
 		while 1:
 			try:
 				version=Utils.winreg.EnumKey(all_versions,index)
-			except WindowsError:
+			except OSError:
 				break
-			index=index+1
+			index+=1
 			match=version_pattern.match(version)
-			if not match:
-				continue
-			else:
+			if match:
 				versionnumber=float(match.group(1))
-				detected_versions.append((versionnumber,version+vcvar,prefix+"\\"+version))
+			else:
+				continue
+			detected_versions.append((versionnumber,version+vcvar,prefix+'\\'+version))
 	def fun(tup):
 		return tup[0]
 	detected_versions.sort(key=fun)
 	return detected_versions
-def get_compiler_env(conf,compiler,version,bat_target,bat,select=None):
-	lazy=getattr(Options.options,'msvc_lazy_autodetect',False)or conf.env['MSVC_LAZY_AUTODETECT']
-	def msvc_thunk():
-		vs=conf.get_msvc_version(compiler,version,bat_target,bat)
-		if select:
-			return select(vs)
-		else:
-			return vs
-	return lazytup(msvc_thunk,lazy,([],[],[]))
-class lazytup(object):
-	def __init__(self,fn,lazy=True,default=None):
-		self.fn=fn
-		self.default=default
-		if not lazy:
-			self.evaluate()
-	def __len__(self):
-		self.evaluate()
-		return len(self.value)
-	def __iter__(self):
-		self.evaluate()
-		for i,v in enumerate(self.value):
-			yield v
-	def __getitem__(self,i):
-		self.evaluate()
-		return self.value[i]
-	def __repr__(self):
-		if hasattr(self,'value'):
-			return repr(self.value)
-		elif self.default:
-			return repr(self.default)
-		else:
-			self.evaluate()
-			return repr(self.value)
+class target_compiler(object):
+	def __init__(self,ctx,compiler,cpu,version,bat_target,bat,callback=None):
+		self.conf=ctx
+		self.name=None
+		self.is_valid=False
+		self.is_done=False
+		self.compiler=compiler
+		self.cpu=cpu
+		self.version=version
+		self.bat_target=bat_target
+		self.bat=bat
+		self.callback=callback
 	def evaluate(self):
-		if hasattr(self,'value'):
+		if self.is_done:
+			return
+		self.is_done=True
+		try:
+			vs=self.conf.get_msvc_version(self.compiler,self.version,self.bat_target,self.bat)
+		except Errors.ConfigurationError:
+			self.is_valid=False
 			return
-		self.value=self.fn()
+		if self.callback:
+			vs=self.callback(self,vs)
+		self.is_valid=True
+		(self.bindirs,self.incdirs,self.libdirs)=vs
+	def __str__(self):
+		return str((self.compiler,self.cpu,self.version,self.bat_target,self.bat))
+	def __repr__(self):
+		return repr((self.compiler,self.cpu,self.version,self.bat_target,self.bat))
+@conf
+def gather_wsdk_versions(conf,versions):
+	version_pattern=re.compile('^v..?.?\...?.?')
+	try:
+		all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Microsoft\\Microsoft SDKs\\Windows')
+	except OSError:
+		try:
+			all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Microsoft\\Microsoft SDKs\\Windows')
+		except OSError:
+			return
+	index=0
+	while 1:
+		try:
+			version=Utils.winreg.EnumKey(all_versions,index)
+		except OSError:
+			break
+		index+=1
+		if not version_pattern.match(version):
+			continue
+		try:
+			msvc_version=Utils.winreg.OpenKey(all_versions,version)
+			path,type=Utils.winreg.QueryValueEx(msvc_version,'InstallationFolder')
+		except OSError:
+			continue
+		if path and os.path.isfile(os.path.join(path,'bin','SetEnv.cmd')):
+			targets={}
+			for target,arch in all_msvc_platforms:
+				targets[target]=target_compiler(conf,'wsdk',arch,version,'/'+target,os.path.join(path,'bin','SetEnv.cmd'))
+			versions['wsdk '+version[1:]]=targets
 @conf
 def gather_msvc_targets(conf,versions,version,vc_path):
-	targets=[]
-	if os.path.isfile(os.path.join(vc_path,'vcvarsall.bat')):
+	targets={}
+	if os.path.isfile(os.path.join(vc_path,'VC','Auxiliary','Build','vcvarsall.bat')):
 		for target,realtarget in all_msvc_platforms[::-1]:
-			try:
-				targets.append((target,(realtarget,get_compiler_env(conf,'msvc',version,target,os.path.join(vc_path,'vcvarsall.bat')))))
-			except conf.errors.ConfigurationError:
-				pass
+			targets[target]=target_compiler(conf,'msvc',realtarget,version,target,os.path.join(vc_path,'VC','Auxiliary','Build','vcvarsall.bat'))
+	elif os.path.isfile(os.path.join(vc_path,'vcvarsall.bat')):
+		for target,realtarget in all_msvc_platforms[::-1]:
+			targets[target]=target_compiler(conf,'msvc',realtarget,version,target,os.path.join(vc_path,'vcvarsall.bat'))
 	elif os.path.isfile(os.path.join(vc_path,'Common7','Tools','vsvars32.bat')):
-		try:
-			targets.append(('x86',('x86',get_compiler_env(conf,'msvc',version,'x86',os.path.join(vc_path,'Common7','Tools','vsvars32.bat')))))
-		except conf.errors.ConfigurationError:
-			pass
+		targets['x86']=target_compiler(conf,'msvc','x86',version,'x86',os.path.join(vc_path,'Common7','Tools','vsvars32.bat'))
 	elif os.path.isfile(os.path.join(vc_path,'Bin','vcvars32.bat')):
-		try:
-			targets.append(('x86',('x86',get_compiler_env(conf,'msvc',version,'',os.path.join(vc_path,'Bin','vcvars32.bat')))))
-		except conf.errors.ConfigurationError:
-			pass
+		targets['x86']=target_compiler(conf,'msvc','x86',version,'',os.path.join(vc_path,'Bin','vcvars32.bat'))
 	if targets:
-		versions.append(('msvc '+version,targets))
+		versions['msvc %s'%version]=targets
 @conf
 def gather_wince_targets(conf,versions,version,vc_path,vsvars,supported_platforms):
 	for device,platforms in supported_platforms:
-		cetargets=[]
+		targets={}
 		for platform,compiler,include,lib in platforms:
 			winCEpath=os.path.join(vc_path,'ce')
 			if not os.path.isdir(winCEpath):
@@ -284,25 +279,44 @@ def gather_wince_targets(conf,versions,version,vc_path,vsvars,supported_platform
 			bindirs=[os.path.join(winCEpath,'bin',compiler),os.path.join(winCEpath,'bin','x86_'+compiler)]
 			incdirs=[os.path.join(winCEpath,'include'),os.path.join(winCEpath,'atlmfc','include'),include]
 			libdirs=[os.path.join(winCEpath,'lib',platform),os.path.join(winCEpath,'atlmfc','lib',platform),lib]
-			def combine_common(compiler_env):
+			def combine_common(obj,compiler_env):
 				(common_bindirs,_1,_2)=compiler_env
 				return(bindirs+common_bindirs,incdirs,libdirs)
-			try:
-				cetargets.append((platform,(platform,get_compiler_env(conf,'msvc',version,'x86',vsvars,combine_common))))
-			except conf.errors.ConfigurationError:
-				continue
-		if cetargets:
-			versions.append((device+' '+version,cetargets))
+			targets[platform]=target_compiler(conf,'msvc',platform,version,'x86',vsvars,combine_common)
+		if targets:
+			versions[device+' '+version]=targets
 @conf
 def gather_winphone_targets(conf,versions,version,vc_path,vsvars):
-	targets=[]
+	targets={}
 	for target,realtarget in all_msvc_platforms[::-1]:
-		try:
-			targets.append((target,(realtarget,get_compiler_env(conf,'winphone',version,target,vsvars))))
-		except conf.errors.ConfigurationError:
-			pass
+		targets[target]=target_compiler(conf,'winphone',realtarget,version,target,vsvars)
 	if targets:
-		versions.append(('winphone '+version,targets))
+		versions['winphone '+version]=targets
+@conf
+def gather_vswhere_versions(conf,versions):
+	try:
+		import json
+	except ImportError:
+		Logs.error('Visual Studio 2017 detection requires Python 2.6')
+		return
+	prg_path=os.environ.get('ProgramFiles(x86)',os.environ.get('ProgramFiles','C:\\Program Files (x86)'))
+	vswhere=os.path.join(prg_path,'Microsoft Visual Studio','Installer','vswhere.exe')
+	args=[vswhere,'-products','*','-legacy','-format','json']
+	try:
+		txt=conf.cmd_and_log(args)
+	except Errors.WafError as e:
+		Logs.debug('msvc: vswhere.exe failed %s',e)
+		return
+	if sys.version_info[0]<3:
+		txt=txt.decode(Utils.console_encoding())
+	arr=json.loads(txt)
+	arr.sort(key=lambda x:x['installationVersion'])
+	for entry in arr:
+		ver=entry['installationVersion']
+		ver=str('.'.join(ver.split('.')[:2]))
+		path=str(os.path.abspath(entry['installationPath']))
+		if os.path.exists(path)and('msvc %s'%ver)not in versions:
+			conf.gather_msvc_targets(versions,ver,path)
 @conf
 def gather_msvc_versions(conf,versions):
 	vc_paths=[]
@@ -310,12 +324,20 @@ def gather_msvc_versions(conf,versions):
 		try:
 			try:
 				msvc_version=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,reg+"\\Setup\\VC")
-			except WindowsError:
+			except OSError:
 				msvc_version=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,reg+"\\Setup\\Microsoft Visual C++")
 			path,type=Utils.winreg.QueryValueEx(msvc_version,'ProductDir')
-			vc_paths.append((version,os.path.abspath(str(path))))
-		except WindowsError:
+		except OSError:
+			try:
+				msvc_version=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,"SOFTWARE\\Wow6432node\\Microsoft\\VisualStudio\\SxS\\VS7")
+				path,type=Utils.winreg.QueryValueEx(msvc_version,version)
+			except OSError:
+				continue
+			else:
+				vc_paths.append((version,os.path.abspath(str(path))))
 			continue
+		else:
+			vc_paths.append((version,os.path.abspath(str(path))))
 	wince_supported_platforms=gather_wince_supported_platforms()
 	for version,vc_path in vc_paths:
 		vs_path=os.path.dirname(vc_path)
@@ -336,91 +358,91 @@ def gather_icl_versions(conf,versions):
 	version_pattern=re.compile('^...?.?\....?.?')
 	try:
 		all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Intel\\Compilers\\C++')
-	except WindowsError:
+	except OSError:
 		try:
 			all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Intel\\Compilers\\C++')
-		except WindowsError:
+		except OSError:
 			return
 	index=0
 	while 1:
 		try:
 			version=Utils.winreg.EnumKey(all_versions,index)
-		except WindowsError:
+		except OSError:
 			break
-		index=index+1
+		index+=1
 		if not version_pattern.match(version):
 			continue
-		targets=[]
+		targets={}
 		for target,arch in all_icl_platforms:
+			if target=='intel64':
+				targetDir='EM64T_NATIVE'
+			else:
+				targetDir=target
 			try:
-				if target=='intel64':targetDir='EM64T_NATIVE'
-				else:targetDir=target
 				Utils.winreg.OpenKey(all_versions,version+'\\'+targetDir)
 				icl_version=Utils.winreg.OpenKey(all_versions,version)
 				path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir')
+			except OSError:
+				pass
+			else:
 				batch_file=os.path.join(path,'bin','iclvars.bat')
 				if os.path.isfile(batch_file):
-					try:
-						targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file))))
-					except conf.errors.ConfigurationError:
-						pass
-			except WindowsError:
-				pass
+					targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file)
 		for target,arch in all_icl_platforms:
 			try:
 				icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+target)
 				path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir')
+			except OSError:
+				continue
+			else:
 				batch_file=os.path.join(path,'bin','iclvars.bat')
 				if os.path.isfile(batch_file):
-					try:
-						targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file))))
-					except conf.errors.ConfigurationError:
-						pass
-			except WindowsError:
-				continue
+					targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file)
 		major=version[0:2]
-		versions.append(('intel '+major,targets))
+		versions['intel '+major]=targets
 @conf
 def gather_intel_composer_versions(conf,versions):
 	version_pattern=re.compile('^...?.?\...?.?.?')
 	try:
 		all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Intel\\Suites')
-	except WindowsError:
+	except OSError:
 		try:
 			all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Intel\\Suites')
-		except WindowsError:
+		except OSError:
 			return
 	index=0
 	while 1:
 		try:
 			version=Utils.winreg.EnumKey(all_versions,index)
-		except WindowsError:
+		except OSError:
 			break
-		index=index+1
+		index+=1
 		if not version_pattern.match(version):
 			continue
-		targets=[]
+		targets={}
 		for target,arch in all_icl_platforms:
+			if target=='intel64':
+				targetDir='EM64T_NATIVE'
+			else:
+				targetDir=target
 			try:
-				if target=='intel64':targetDir='EM64T_NATIVE'
-				else:targetDir=target
 				try:
 					defaults=Utils.winreg.OpenKey(all_versions,version+'\\Defaults\\C++\\'+targetDir)
-				except WindowsError:
+				except OSError:
 					if targetDir=='EM64T_NATIVE':
 						defaults=Utils.winreg.OpenKey(all_versions,version+'\\Defaults\\C++\\EM64T')
 					else:
-						raise WindowsError
+						raise
 				uid,type=Utils.winreg.QueryValueEx(defaults,'SubKey')
 				Utils.winreg.OpenKey(all_versions,version+'\\'+uid+'\\C++\\'+targetDir)
 				icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+uid+'\\C++')
 				path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir')
+			except OSError:
+				pass
+			else:
 				batch_file=os.path.join(path,'bin','iclvars.bat')
 				if os.path.isfile(batch_file):
-					try:
-						targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file))))
-					except conf.errors.ConfigurationError:
-						pass
+					targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file)
 		compilervars_warning_attr='_compilervars_warning_key'
 		if version[0:2]=='13'and getattr(conf,compilervars_warning_attr,True):
 			setattr(conf,compilervars_warning_attr,False)
@@ -432,46 +454,25 @@ def gather_intel_composer_versions(conf,versions):
 			dev_env_path=os.environ[vscomntool]+r'..\IDE\devenv.exe'
 			if(r'if exist "%VS110COMNTOOLS%..\IDE\VSWinExpress.exe"'in Utils.readf(compilervars_arch)and not os.path.exists(vs_express_path)and not os.path.exists(dev_env_path)):
 				Logs.warn(('The Intel compilervar_arch.bat only checks for one Visual Studio SKU ''(VSWinExpress.exe) but it does not seem to be installed at %r. ''The intel command line set up will fail to configure unless the file %r''is patched. See: %s')%(vs_express_path,compilervars_arch,patch_url))
-		except WindowsError:
-			pass
 		major=version[0:2]
-		versions.append(('intel '+major,targets))
+		versions['intel '+major]=targets
 @conf
-def get_msvc_versions(conf,eval_and_save=True):
-	if conf.env['MSVC_INSTALLED_VERSIONS']:
-		return conf.env['MSVC_INSTALLED_VERSIONS']
-	lst=[]
-	conf.gather_icl_versions(lst)
-	conf.gather_intel_composer_versions(lst)
-	conf.gather_wsdk_versions(lst)
-	conf.gather_msvc_versions(lst)
-	if eval_and_save:
-		def checked_target(t):
-			target,(arch,paths)=t
-			try:
-				paths.evaluate()
-			except conf.errors.ConfigurationError:
-				return None
-			else:
-				return t
-		lst=[(version,list(filter(checked_target,targets)))for version,targets in lst]
-		conf.env['MSVC_INSTALLED_VERSIONS']=lst
-	return lst
-@conf
-def print_all_msvc_detected(conf):
-	for version,targets in conf.env['MSVC_INSTALLED_VERSIONS']:
-		Logs.info(version)
-		for target,l in targets:
-			Logs.info("\t"+target)
+def detect_msvc(self):
+	return self.setup_msvc(self.get_msvc_versions())
 @conf
-def detect_msvc(conf,arch=False):
-	lazy_detect=getattr(Options.options,'msvc_lazy_autodetect',False)or conf.env['MSVC_LAZY_AUTODETECT']
-	versions=get_msvc_versions(conf,not lazy_detect)
-	return setup_msvc(conf,versions,arch)
+def get_msvc_versions(self):
+	dct=Utils.ordered_iter_dict()
+	self.gather_icl_versions(dct)
+	self.gather_intel_composer_versions(dct)
+	self.gather_wsdk_versions(dct)
+	self.gather_msvc_versions(dct)
+	self.gather_vswhere_versions(dct)
+	Logs.debug('msvc: detected versions %r',list(dct.keys()))
+	return dct
 @conf
 def find_lt_names_msvc(self,libname,is_static=False):
 	lt_names=['lib%s.la'%libname,'%s.la'%libname,]
-	for path in self.env['LIBPATH']:
+	for path in self.env.LIBPATH:
 		for la in lt_names:
 			laf=os.path.join(path,la)
 			dll=None
@@ -507,12 +508,12 @@ def libname_msvc(self,libname,is_static=False):
 		return None
 	(lt_path,lt_libname,lt_static)=self.find_lt_names_msvc(lib,is_static)
 	if lt_path!=None and lt_libname!=None:
-		if lt_static==True:
+		if lt_static:
 			return os.path.join(lt_path,lt_libname)
 	if lt_path!=None:
-		_libpaths=[lt_path]+self.env['LIBPATH']
+		_libpaths=[lt_path]+self.env.LIBPATH
 	else:
-		_libpaths=self.env['LIBPATH']
+		_libpaths=self.env.LIBPATH
 	static_libs=['lib%ss.lib'%lib,'lib%s.lib'%lib,'%ss.lib'%lib,'%s.lib'%lib,]
 	dynamic_libs=['lib%s.dll.lib'%lib,'lib%s.dll.a'%lib,'%s.dll.lib'%lib,'%s.dll.a'%lib,'lib%s_d.lib'%lib,'%s_d.lib'%lib,'%s.lib'%lib,]
 	libnames=static_libs
@@ -521,9 +522,9 @@ def libname_msvc(self,libname,is_static=False):
 	for path in _libpaths:
 		for libn in libnames:
 			if os.path.exists(os.path.join(path,libn)):
-				debug('msvc: lib found: %s'%os.path.join(path,libn))
+				Logs.debug('msvc: lib found: %s',os.path.join(path,libn))
 				return re.sub('\.lib$','',libn)
-	self.fatal("The library %r could not be found"%libname)
+	self.fatal('The library %r could not be found'%libname)
 	return re.sub('\.lib$','',libname)
 @conf
 def check_lib_msvc(self,libname,is_static=False,uselib_store=None):
@@ -557,19 +558,17 @@ def autodetect(conf,arch=False):
 	v=conf.env
 	if v.NO_MSVC_DETECT:
 		return
+	compiler,version,path,includes,libdirs,cpu=conf.detect_msvc()
 	if arch:
-		compiler,version,path,includes,libdirs,arch=conf.detect_msvc(True)
-		v['DEST_CPU']=arch
-	else:
-		compiler,version,path,includes,libdirs=conf.detect_msvc()
-	v['PATH']=path
-	v['INCLUDES']=includes
-	v['LIBPATH']=libdirs
-	v['MSVC_COMPILER']=compiler
+		v.DEST_CPU=cpu
+	v.PATH=path
+	v.INCLUDES=includes
+	v.LIBPATH=libdirs
+	v.MSVC_COMPILER=compiler
 	try:
-		v['MSVC_VERSION']=float(version)
-	except Exception:
-		v['MSVC_VERSION']=float(version[:-3])
+		v.MSVC_VERSION=float(version)
+	except ValueError:
+		v.MSVC_VERSION=float(version[:-3])
 def _get_prog_names(conf,compiler):
 	if compiler=='intel':
 		compiler_name='ICL'
@@ -585,84 +584,79 @@ def find_msvc(conf):
 	if sys.platform=='cygwin':
 		conf.fatal('MSVC module does not work under cygwin Python!')
 	v=conf.env
-	path=v['PATH']
-	compiler=v['MSVC_COMPILER']
-	version=v['MSVC_VERSION']
+	path=v.PATH
+	compiler=v.MSVC_COMPILER
+	version=v.MSVC_VERSION
 	compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler)
 	v.MSVC_MANIFEST=(compiler=='msvc'and version>=8)or(compiler=='wsdk'and version>=6)or(compiler=='intel'and version>=11)
 	cxx=conf.find_program(compiler_name,var='CXX',path_list=path)
 	env=dict(conf.environ)
-	if path:env.update(PATH=';'.join(path))
+	if path:
+		env.update(PATH=';'.join(path))
 	if not conf.cmd_and_log(cxx+['/nologo','/help'],env=env):
 		conf.fatal('the msvc compiler could not be identified')
-	v['CC']=v['CXX']=cxx
-	v['CC_NAME']=v['CXX_NAME']='msvc'
-	if not v['LINK_CXX']:
-		link=conf.find_program(linker_name,path_list=path)
-		if link:v['LINK_CXX']=link
-		else:conf.fatal('%s was not found (linker)'%linker_name)
-		v['LINK']=link
-	if not v['LINK_CC']:
-		v['LINK_CC']=v['LINK_CXX']
-	if not v['AR']:
+	v.CC=v.CXX=cxx
+	v.CC_NAME=v.CXX_NAME='msvc'
+	if not v.LINK_CXX:
+		conf.find_program(linker_name,path_list=path,errmsg='%s was not found (linker)'%linker_name,var='LINK_CXX')
+	if not v.LINK_CC:
+		v.LINK_CC=v.LINK_CXX
+	if not v.AR:
 		stliblink=conf.find_program(lib_name,path_list=path,var='AR')
-		if not stliblink:return
-		v['ARFLAGS']=['/NOLOGO']
+		if not stliblink:
+			return
+		v.ARFLAGS=['/nologo']
 	if v.MSVC_MANIFEST:
 		conf.find_program('MT',path_list=path,var='MT')
-		v['MTFLAGS']=['/NOLOGO']
+		v.MTFLAGS=['/nologo']
 	try:
 		conf.load('winres')
-	except Errors.WafError:
-		warn('Resource compiler not found. Compiling resource file is disabled')
+	except Errors.ConfigurationError:
+		Logs.warn('Resource compiler not found. Compiling resource file is disabled')
 @conf
 def visual_studio_add_flags(self):
 	v=self.env
-	try:v.prepend_value('INCLUDES',[x for x in self.environ['INCLUDE'].split(';')if x])
-	except Exception:pass
-	try:v.prepend_value('LIBPATH',[x for x in self.environ['LIB'].split(';')if x])
-	except Exception:pass
+	if self.environ.get('INCLUDE'):
+		v.prepend_value('INCLUDES',[x for x in self.environ['INCLUDE'].split(';')if x])
+	if self.environ.get('LIB'):
+		v.prepend_value('LIBPATH',[x for x in self.environ['LIB'].split(';')if x])
 @conf
 def msvc_common_flags(conf):
 	v=conf.env
-	v['DEST_BINFMT']='pe'
+	v.DEST_BINFMT='pe'
 	v.append_value('CFLAGS',['/nologo'])
 	v.append_value('CXXFLAGS',['/nologo'])
-	v['DEFINES_ST']='/D%s'
-	v['CC_SRC_F']=''
-	v['CC_TGT_F']=['/c','/Fo']
-	v['CXX_SRC_F']=''
-	v['CXX_TGT_F']=['/c','/Fo']
+	v.append_value('LINKFLAGS',['/nologo'])
+	v.DEFINES_ST='/D%s'
+	v.CC_SRC_F=''
+	v.CC_TGT_F=['/c','/Fo']
+	v.CXX_SRC_F=''
+	v.CXX_TGT_F=['/c','/Fo']
 	if(v.MSVC_COMPILER=='msvc'and v.MSVC_VERSION>=8)or(v.MSVC_COMPILER=='wsdk'and v.MSVC_VERSION>=6):
-		v['CC_TGT_F']=['/FC']+v['CC_TGT_F']
-		v['CXX_TGT_F']=['/FC']+v['CXX_TGT_F']
-	v['CPPPATH_ST']='/I%s'
-	v['AR_TGT_F']=v['CCLNK_TGT_F']=v['CXXLNK_TGT_F']='/OUT:'
-	v['CFLAGS_CONSOLE']=v['CXXFLAGS_CONSOLE']=['/SUBSYSTEM:CONSOLE']
-	v['CFLAGS_NATIVE']=v['CXXFLAGS_NATIVE']=['/SUBSYSTEM:NATIVE']
-	v['CFLAGS_POSIX']=v['CXXFLAGS_POSIX']=['/SUBSYSTEM:POSIX']
-	v['CFLAGS_WINDOWS']=v['CXXFLAGS_WINDOWS']=['/SUBSYSTEM:WINDOWS']
-	v['CFLAGS_WINDOWSCE']=v['CXXFLAGS_WINDOWSCE']=['/SUBSYSTEM:WINDOWSCE']
-	v['CFLAGS_CRT_MULTITHREADED']=v['CXXFLAGS_CRT_MULTITHREADED']=['/MT']
-	v['CFLAGS_CRT_MULTITHREADED_DLL']=v['CXXFLAGS_CRT_MULTITHREADED_DLL']=['/MD']
-	v['CFLAGS_CRT_MULTITHREADED_DBG']=v['CXXFLAGS_CRT_MULTITHREADED_DBG']=['/MTd']
-	v['CFLAGS_CRT_MULTITHREADED_DLL_DBG']=v['CXXFLAGS_CRT_MULTITHREADED_DLL_DBG']=['/MDd']
-	v['LIB_ST']='%s.lib'
-	v['LIBPATH_ST']='/LIBPATH:%s'
-	v['STLIB_ST']='%s.lib'
-	v['STLIBPATH_ST']='/LIBPATH:%s'
-	v.append_value('LINKFLAGS',['/NOLOGO'])
-	if v['MSVC_MANIFEST']:
+		v.CC_TGT_F=['/FC']+v.CC_TGT_F
+		v.CXX_TGT_F=['/FC']+v.CXX_TGT_F
+	v.CPPPATH_ST='/I%s'
+	v.AR_TGT_F=v.CCLNK_TGT_F=v.CXXLNK_TGT_F='/OUT:'
+	v.CFLAGS_CRT_MULTITHREADED=v.CXXFLAGS_CRT_MULTITHREADED=['/MT']
+	v.CFLAGS_CRT_MULTITHREADED_DLL=v.CXXFLAGS_CRT_MULTITHREADED_DLL=['/MD']
+	v.CFLAGS_CRT_MULTITHREADED_DBG=v.CXXFLAGS_CRT_MULTITHREADED_DBG=['/MTd']
+	v.CFLAGS_CRT_MULTITHREADED_DLL_DBG=v.CXXFLAGS_CRT_MULTITHREADED_DLL_DBG=['/MDd']
+	v.LIB_ST='%s.lib'
+	v.LIBPATH_ST='/LIBPATH:%s'
+	v.STLIB_ST='%s.lib'
+	v.STLIBPATH_ST='/LIBPATH:%s'
+	if v.MSVC_MANIFEST:
 		v.append_value('LINKFLAGS',['/MANIFEST'])
-	v['CFLAGS_cshlib']=[]
-	v['CXXFLAGS_cxxshlib']=[]
-	v['LINKFLAGS_cshlib']=v['LINKFLAGS_cxxshlib']=['/DLL']
-	v['cshlib_PATTERN']=v['cxxshlib_PATTERN']='%s.dll'
-	v['implib_PATTERN']='%s.lib'
-	v['IMPLIB_ST']='/IMPLIB:%s'
-	v['LINKFLAGS_cstlib']=[]
-	v['cstlib_PATTERN']=v['cxxstlib_PATTERN']='%s.lib'
-	v['cprogram_PATTERN']=v['cxxprogram_PATTERN']='%s.exe'
+	v.CFLAGS_cshlib=[]
+	v.CXXFLAGS_cxxshlib=[]
+	v.LINKFLAGS_cshlib=v.LINKFLAGS_cxxshlib=['/DLL']
+	v.cshlib_PATTERN=v.cxxshlib_PATTERN='%s.dll'
+	v.implib_PATTERN='%s.lib'
+	v.IMPLIB_ST='/IMPLIB:%s'
+	v.LINKFLAGS_cstlib=[]
+	v.cstlib_PATTERN=v.cxxstlib_PATTERN='%s.lib'
+	v.cprogram_PATTERN=v.cxxprogram_PATTERN='%s.exe'
+	v.def_PATTERN='/def:%s'
 @after_method('apply_link')
 @feature('c','cxx')
 def apply_flags_msvc(self):
@@ -681,7 +675,7 @@ def apply_flags_msvc(self):
 			pdbnode=self.link_task.outputs[0].change_ext('.pdb')
 			self.link_task.outputs.append(pdbnode)
 			if getattr(self,'install_task',None):
-				self.pdb_install_task=self.bld.install_files(self.install_task.dest,pdbnode,env=self.env)
+				self.pdb_install_task=self.add_install_files(install_to=self.install_task.install_to,install_from=pdbnode)
 			break
 @feature('cprogram','cshlib','cxxprogram','cxxshlib')
 @after_method('apply_link')
@@ -690,109 +684,11 @@ def apply_manifest(self):
 	out_node=self.link_task.outputs[0]
 	man_node=out_node.parent.find_or_declare(out_node.name+'.manifest')
 	self.link_task.outputs.append(man_node)
-	self.link_task.do_manifest=True
-def exec_mf(self):
-	env=self.env
-	mtool=env['MT']
-	if not mtool:
-		return 0
-	self.do_manifest=False
-	outfile=self.outputs[0].abspath()
-	manifest=None
-	for out_node in self.outputs:
-		if out_node.name.endswith('.manifest'):
-			manifest=out_node.abspath()
-			break
-	if manifest is None:
-		return 0
-	mode=''
-	if'cprogram'in self.generator.features or'cxxprogram'in self.generator.features:
-		mode='1'
-	elif'cshlib'in self.generator.features or'cxxshlib'in self.generator.features:
-		mode='2'
-	debug('msvc: embedding manifest in mode %r'%mode)
-	lst=[]+mtool
-	lst.extend(Utils.to_list(env['MTFLAGS']))
-	lst.extend(['-manifest',manifest])
-	lst.append('-outputresource:%s;%s'%(outfile,mode))
-	return self.exec_command(lst)
-def quote_response_command(self,flag):
-	if flag.find(' ')>-1:
-		for x in('/LIBPATH:','/IMPLIB:','/OUT:','/I'):
-			if flag.startswith(x):
-				flag='%s"%s"'%(x,flag[len(x):])
-				break
-		else:
-			flag='"%s"'%flag
-	return flag
-def exec_response_command(self,cmd,**kw):
-	try:
-		tmp=None
-		if sys.platform.startswith('win')and isinstance(cmd,list)and len(' '.join(cmd))>=8192:
-			program=cmd[0]
-			cmd=[self.quote_response_command(x)for x in cmd]
-			(fd,tmp)=tempfile.mkstemp()
-			os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:]))
-			os.close(fd)
-			cmd=[program,'@'+tmp]
-		ret=self.generator.bld.exec_command(cmd,**kw)
-	finally:
-		if tmp:
-			try:
-				os.remove(tmp)
-			except OSError:
-				pass
-	return ret
-def exec_command_msvc(self,*k,**kw):
-	if isinstance(k[0],list):
-		lst=[]
-		carry=''
-		for a in k[0]:
-			if a=='/Fo'or a=='/doc'or a[-1]==':':
-				carry=a
-			else:
-				lst.append(carry+a)
-				carry=''
-		k=[lst]
-	if self.env['PATH']:
-		env=dict(self.env.env or os.environ)
-		env.update(PATH=';'.join(self.env['PATH']))
-		kw['env']=env
-	bld=self.generator.bld
-	try:
-		if not kw.get('cwd',None):
-			kw['cwd']=bld.cwd
-	except AttributeError:
-		bld.cwd=kw['cwd']=bld.variant_dir
-	ret=self.exec_response_command(k[0],**kw)
-	if not ret and getattr(self,'do_manifest',None):
-		ret=self.exec_mf()
-	return ret
-def wrap_class(class_name):
-	cls=Task.classes.get(class_name,None)
-	if not cls:
-		return None
-	derived_class=type(class_name,(cls,),{})
-	def exec_command(self,*k,**kw):
-		if self.env['CC_NAME']=='msvc':
-			return self.exec_command_msvc(*k,**kw)
-		else:
-			return super(derived_class,self).exec_command(*k,**kw)
-	derived_class.exec_command=exec_command
-	derived_class.exec_response_command=exec_response_command
-	derived_class.quote_response_command=quote_response_command
-	derived_class.exec_command_msvc=exec_command_msvc
-	derived_class.exec_mf=exec_mf
-	if hasattr(cls,'hcode'):
-		derived_class.hcode=cls.hcode
-	return derived_class
-for k in'c cxx cprogram cxxprogram cshlib cxxshlib cstlib cxxstlib'.split():
-	wrap_class(k)
+	self.env.DO_MANIFEST=True
 def make_winapp(self,family):
 	append=self.env.append_unique
 	append('DEFINES','WINAPI_FAMILY=%s'%family)
-	append('CXXFLAGS','/ZW')
-	append('CXXFLAGS','/TP')
+	append('CXXFLAGS',['/ZW','/TP'])
 	for lib_path in self.env.LIBPATH:
 		append('CXXFLAGS','/AI%s'%lib_path)
 @feature('winphoneapp')
@@ -800,8 +696,7 @@ def make_winapp(self,family):
 @after_method('propagate_uselib_vars')
 def make_winphone_app(self):
 	make_winapp(self,'WINAPI_FAMILY_PHONE_APP')
-	conf.env.append_unique('LINKFLAGS','/NODEFAULTLIB:ole32.lib')
-	conf.env.append_unique('LINKFLAGS','PhoneAppModelHost.lib')
+	self.env.append_unique('LINKFLAGS',['/NODEFAULTLIB:ole32.lib','PhoneAppModelHost.lib'])
 @feature('winapp')
 @after_method('process_use')
 @after_method('propagate_uselib_vars')
diff --git a/waflib/Tools/nobuild.py b/waflib/Tools/nobuild.py
new file mode 100644
index 0000000..beb2217
--- /dev/null
+++ b/waflib/Tools/nobuild.py
@@ -0,0 +1,11 @@
+#! /usr/bin/env python
+# encoding: utf-8
+# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
+
+from waflib import Task
+def build(bld):
+	def run(self):
+		for x in self.outputs:
+			x.write('')
+	for(name,cls)in Task.classes.items():
+		cls.run=run
diff --git a/waflib/Tools/perl.py b/waflib/Tools/perl.py
index 47506d8..ee86113 100644
--- a/waflib/Tools/perl.py
+++ b/waflib/Tools/perl.py
@@ -3,15 +3,16 @@
 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
 
 import os
-from waflib import Task,Options,Utils
+from waflib import Task,Options,Utils,Errors
 from waflib.Configure import conf
 from waflib.TaskGen import extension,feature,before_method
 @before_method('apply_incpaths','apply_link','propagate_uselib_vars')
 @feature('perlext')
 def init_perlext(self):
 	self.uselib=self.to_list(getattr(self,'uselib',[]))
-	if not'PERLEXT'in self.uselib:self.uselib.append('PERLEXT')
-	self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['perlext_PATTERN']
+	if not'PERLEXT'in self.uselib:
+		self.uselib.append('PERLEXT')
+	self.env.cshlib_PATTERN=self.env.cxxshlib_PATTERN=self.env.perlext_PATTERN
 @extension('.xs')
 def xsubpp_file(self,node):
 	outnode=node.change_ext('.c')
@@ -29,14 +30,8 @@ def check_perl_version(self,minver=None):
 	else:
 		cver=''
 	self.start_msg('Checking for minimum perl version %s'%cver)
-	perl=Utils.to_list(getattr(Options.options,'perlbinary',None))
-	if not perl:
-		perl=self.find_program('perl',var='PERL')
-	if not perl:
-		self.end_msg("Perl not found",color="YELLOW")
-		return False
-	self.env['PERL']=perl
-	version=self.cmd_and_log(self.env.PERL+["-e",'printf \"%vd\", $^V'])
+	perl=self.find_program('perl',var='PERL',value=getattr(Options.options,'perlbinary',None))
+	version=self.cmd_and_log(perl+["-e",'printf \"%vd\", $^V'])
 	if not version:
 		res=False
 		version="Unknown"
@@ -44,7 +39,7 @@ def check_perl_version(self,minver=None):
 		ver=tuple(map(int,version.split(".")))
 		if ver<minver:
 			res=False
-	self.end_msg(version,color=res and"GREEN"or"YELLOW")
+	self.end_msg(version,color=res and'GREEN'or'YELLOW')
 	return res
 @conf
 def check_perl_module(self,module):
@@ -52,7 +47,7 @@ def check_perl_module(self,module):
 	self.start_msg('perl module %s'%module)
 	try:
 		r=self.cmd_and_log(cmd)
-	except Exception:
+	except Errors.WafError:
 		self.end_msg(False)
 		return None
 	self.end_msg(r or True)
@@ -75,16 +70,16 @@ def check_perl_ext_devel(self):
 		if xsubpp and os.path.isfile(xsubpp[0]):
 			return xsubpp
 		return self.find_program('xsubpp')
-	env['LINKFLAGS_PERLEXT']=cfg_lst('$Config{lddlflags}')
-	env['INCLUDES_PERLEXT']=cfg_lst('$Config{archlib}/CORE')
-	env['CFLAGS_PERLEXT']=cfg_lst('$Config{ccflags} $Config{cccdlflags}')
-	env['EXTUTILS_TYPEMAP']=cfg_lst('$Config{privlib}/ExtUtils/typemap')
-	env['XSUBPP']=find_xsubpp()
+	env.LINKFLAGS_PERLEXT=cfg_lst('$Config{lddlflags}')
+	env.INCLUDES_PERLEXT=cfg_lst('$Config{archlib}/CORE')
+	env.CFLAGS_PERLEXT=cfg_lst('$Config{ccflags} $Config{cccdlflags}')
+	env.EXTUTILS_TYPEMAP=cfg_lst('$Config{privlib}/ExtUtils/typemap')
+	env.XSUBPP=find_xsubpp()
 	if not getattr(Options.options,'perlarchdir',None):
-		env['ARCHDIR_PERL']=cfg_str('$Config{sitearch}')
+		env.ARCHDIR_PERL=cfg_str('$Config{sitearch}')
 	else:
-		env['ARCHDIR_PERL']=getattr(Options.options,'perlarchdir')
-	env['perlext_PATTERN']='%s.'+cfg_str('$Config{dlext}')
+		env.ARCHDIR_PERL=getattr(Options.options,'perlarchdir')
+	env.perlext_PATTERN='%s.'+cfg_str('$Config{dlext}')
 def options(opt):
 	opt.add_option('--with-perl-binary',type='string',dest='perlbinary',help='Specify alternate perl binary',default=None)
 	opt.add_option('--with-perl-archdir',type='string',dest='perlarchdir',help='Specify directory where to install arch specific files',default=None)
diff --git a/waflib/Tools/python.py b/waflib/Tools/python.py
index 17c054e..f41f15c 100644
--- a/waflib/Tools/python.py
+++ b/waflib/Tools/python.py
@@ -3,7 +3,7 @@
 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
 
 import os,sys
-from waflib import Utils,Options,Errors,Logs,Task,Node
+from waflib import Errors,Logs,Node,Options,Task,Utils
 from waflib.TaskGen import extension,before_method,after_method,feature
 from waflib.Configure import conf
 FRAG='''
@@ -44,12 +44,12 @@ def feature_py(self):
 		self.install_32=True
 @extension('.py')
 def process_py(self,node):
-	assert(getattr(self,'install_path')),'add features="py"'
+	assert(hasattr(self,'install_path')),'add features="py"'
 	if self.install_path:
 		if self.install_from:
-			self.bld.install_files(self.install_path,[node],cwd=self.install_from,relative_trick=True)
+			self.add_install_files(install_to=self.install_path,install_from=node,cwd=self.install_from,relative_trick=True)
 		else:
-			self.bld.install_files(self.install_path,[node],relative_trick=True)
+			self.add_install_files(install_to=self.install_path,install_from=node,relative_trick=True)
 	lst=[]
 	if self.env.PYC:
 		lst.append('pyc')
@@ -63,7 +63,7 @@ def process_py(self,node):
 	else:
 		pyd=node.abspath()
 	for ext in lst:
-		if self.env.PYTAG:
+		if self.env.PYTAG and not self.env.NOPYCACHE:
 			name=node.name[:-3]
 			pyobj=node.parent.get_bld().make_node('__pycache__').make_node("%s.%s.%s"%(name,self.env.PYTAG,ext))
 			pyobj.parent.mkdir()
@@ -72,15 +72,21 @@ def process_py(self,node):
 		tsk=self.create_task(ext,node,pyobj)
 		tsk.pyd=pyd
 		if self.install_path:
-			self.bld.install_files(os.path.dirname(pyd),pyobj,cwd=node.parent.get_bld(),relative_trick=True)
+			self.add_install_files(install_to=os.path.dirname(pyd),install_from=pyobj,cwd=node.parent.get_bld(),relative_trick=True)
 class pyc(Task.Task):
 	color='PINK'
+	def __str__(self):
+		node=self.outputs[0]
+		return node.path_from(node.ctx.launch_node())
 	def run(self):
 		cmd=[Utils.subst_vars('${PYTHON}',self.env),'-c',INST,self.inputs[0].abspath(),self.outputs[0].abspath(),self.pyd]
 		ret=self.generator.bld.exec_command(cmd)
 		return ret
class pyo(Task.Task):
 	color='PINK'
+	def __str__(self):
+		node=self.outputs[0]
+		return node.path_from(node.ctx.launch_node())
 	def run(self):
 		cmd=[Utils.subst_vars('${PYTHON}',self.env),Utils.subst_vars('${PYFLAGS_OPT}',self.env),'-c',INST,self.inputs[0].abspath(),self.outputs[0].abspath(),self.pyd]
 		ret=self.generator.bld.exec_command(cmd)
@@ -162,14 +168,14 @@ def python_cross_compile(self,features='pyembed pyext'):
 			self.env[x]=self.environ[x]
 	xx=self.env.CXX_NAME and'cxx'or'c'
 	if'pyext'in features:
-		flags=self.environ.get('PYTHON_PYEXT_LDFLAGS',self.environ.get('PYTHON_LDFLAGS',None))
+		flags=self.environ.get('PYTHON_PYEXT_LDFLAGS',self.environ.get('PYTHON_LDFLAGS'))
 		if flags is None:
 			self.fatal('No flags provided through PYTHON_PYEXT_LDFLAGS as required')
 		else:
 			self.parse_flags(flags,'PYEXT')
 		self.test_pyext(xx)
 	if'pyembed'in features:
-		flags=self.environ.get('PYTHON_PYEMBED_LDFLAGS',self.environ.get('PYTHON_LDFLAGS',None))
+		flags=self.environ.get('PYTHON_PYEMBED_LDFLAGS',self.environ.get('PYTHON_LDFLAGS'))
 		if flags is None:
 			self.fatal('No flags provided through PYTHON_PYEMBED_LDFLAGS as required')
 		else:
@@ -181,11 +187,11 @@ def check_python_headers(conf,features='pyembed pyext'):
 	features=Utils.to_list(features)
 	assert('pyembed'in features)or('pyext'in features),"check_python_headers features must include 'pyembed' and/or 'pyext'"
 	env=conf.env
-	if not env['CC_NAME']and not env['CXX_NAME']:
+	if not env.CC_NAME and not env.CXX_NAME:
 		conf.fatal('load a compiler first (gcc, g++, ..)')
 	if conf.python_cross_compile(features):
 		return
-	if not env['PYTHON_VERSION']:
+	if not env.PYTHON_VERSION:
 		conf.check_python_version()
 	pybin=env.PYTHON
 	if not pybin:
@@ -201,10 +207,12 @@ def check_python_headers(conf,features='pyembed pyext'):
 	x='MACOSX_DEPLOYMENT_TARGET'
 	if dct[x]:
 		env[x]=conf.environ[x]=dct[x]
-	env['pyext_PATTERN']='%s'+dct['SO']
-	num='.'.join(env['PYTHON_VERSION'].split('.')[:2])
+	env.pyext_PATTERN='%s'+dct['SO']
+	num='.'.join(env.PYTHON_VERSION.split('.')[:2])
 	conf.find_program([''.join(pybin)+'-config','python%s-config'%num,'python-config-%s'%num,'python%sm-config'%num],var='PYTHON_CONFIG',msg="python-config",mandatory=False)
 	if env.PYTHON_CONFIG:
+		if conf.env.HAVE_PYTHON_H:
+			return
 		all_flags=[['--cflags','--libs','--ldflags']]
 		if sys.hexversion<0x2070000:
 			all_flags=[[k]for k in all_flags[0]]
@@ -239,10 +247,10 @@ def check_python_headers(conf,features='pyembed pyext'):
 	conf.parse_flags(all_flags,'PYEXT')
 	result=None
 	if not dct["LDVERSION"]:
-		dct["LDVERSION"]=env['PYTHON_VERSION']
-	for name in('python'+dct['LDVERSION'],'python'+env['PYTHON_VERSION']+'m','python'+env['PYTHON_VERSION'].replace('.','')):
-		if not result and env['LIBPATH_PYEMBED']:
-			path=env['LIBPATH_PYEMBED']
+		dct["LDVERSION"]=env.PYTHON_VERSION
+	for name in('python'+dct['LDVERSION'],'python'+env.PYTHON_VERSION+'m','python'+env.PYTHON_VERSION.replace('.','')):
+		if not result and env.LIBPATH_PYEMBED:
+			path=env.LIBPATH_PYEMBED
 			conf.to_log("\n\n# Trying default LIBPATH_PYEMBED: %r\n"%path)
 			result=conf.check(lib=name,uselib='PYEMBED',libpath=path,mandatory=False,msg='Checking for library %s in LIBPATH_PYEMBED'%name)
 		if not result and dct['LIBDIR']:
@@ -260,20 +268,20 @@ def check_python_headers(conf,features='pyembed pyext'):
 		if result:
 			break
 	if result:
-		env['LIBPATH_PYEMBED']=path
+		env.LIBPATH_PYEMBED=path
 		env.append_value('LIB_PYEMBED',[name])
 	else:
 		conf.to_log("\n\n### LIB NOT FOUND\n")
 	if Utils.is_win32 or dct['Py_ENABLE_SHARED']:
-		env['LIBPATH_PYEXT']=env['LIBPATH_PYEMBED']
-		env['LIB_PYEXT']=env['LIB_PYEMBED']
+		env.LIBPATH_PYEXT=env.LIBPATH_PYEMBED
+		env.LIB_PYEXT=env.LIB_PYEMBED
 	conf.to_log("Include path for Python extensions (found via distutils module): %r\n"%(dct['INCLUDEPY'],))
-	env['INCLUDES_PYEXT']=[dct['INCLUDEPY']]
-	env['INCLUDES_PYEMBED']=[dct['INCLUDEPY']]
-	if env['CC_NAME']=='gcc':
+	env.INCLUDES_PYEXT=[dct['INCLUDEPY']]
+	env.INCLUDES_PYEMBED=[dct['INCLUDEPY']]
+	if env.CC_NAME=='gcc':
 		env.append_value('CFLAGS_PYEMBED',['-fno-strict-aliasing'])
 		env.append_value('CFLAGS_PYEXT',['-fno-strict-aliasing'])
-	if env['CXX_NAME']=='gcc':
+	if env.CXX_NAME=='gcc':
 		env.append_value('CXXFLAGS_PYEMBED',['-fno-strict-aliasing'])
 		env.append_value('CXXFLAGS_PYEXT',['-fno-strict-aliasing'])
 	if env.CC_NAME=="msvc":
@@ -287,20 +295,20 @@ def check_python_headers(conf,features='pyembed pyext'):
 @conf
 def check_python_version(conf,minver=None):
 	assert minver is None or isinstance(minver,tuple)
-	pybin=conf.env['PYTHON']
+	pybin=conf.env.PYTHON
 	if not pybin:
 		conf.fatal('could not find the python executable')
 	cmd=pybin+['-c','import sys\nfor x in sys.version_info: print(str(x))']
-	Logs.debug('python: Running python command %r'%cmd)
+	Logs.debug('python: Running python command %r',cmd)
 	lines=conf.cmd_and_log(cmd).split()
-	assert len(lines)==5,"found %i lines, expected 5: %r"%(len(lines),lines)
+	assert len(lines)==5,"found %r lines, expected 5: %r"%(len(lines),lines)
 	pyver_tuple=(int(lines[0]),int(lines[1]),int(lines[2]),lines[3],int(lines[4]))
 	result=(minver is None)or(pyver_tuple>=minver)
 	if result:
 		pyver='.'.join([str(x)for x in pyver_tuple[:2]])
-		conf.env['PYTHON_VERSION']=pyver
+		conf.env.PYTHON_VERSION=pyver
 	if'PYTHONDIR'in conf.env:
-		pydir=conf.env['PYTHONDIR']
+		pydir=conf.env.PYTHONDIR
 	elif'PYTHONDIR'in conf.environ:
 		pydir=conf.environ['PYTHONDIR']
 	else:
@@ -310,12 +318,12 @@ def check_python_version(conf,minver=None):
 		python_LIBDEST=None
 		(pydir,)=conf.get_python_variables(["get_python_lib(standard_lib=0, prefix=%r) or ''"%conf.env.PREFIX])
 	if python_LIBDEST is None:
-		if conf.env['LIBDIR']:
-			python_LIBDEST=os.path.join(conf.env['LIBDIR'],"python"+pyver)
+		if conf.env.LIBDIR:
+			python_LIBDEST=os.path.join(conf.env.LIBDIR,'python'+pyver)
 		else:
-			python_LIBDEST=os.path.join(conf.env['PREFIX'],"lib","python"+pyver)
+			python_LIBDEST=os.path.join(conf.env.PREFIX,'lib','python'+pyver)
 	if'PYTHONARCHDIR'in conf.env:
-		pyarchdir=conf.env['PYTHONARCHDIR']
+		pyarchdir=conf.env.PYTHONARCHDIR
 	elif'PYTHONARCHDIR'in conf.environ:
 		pyarchdir=conf.environ['PYTHONARCHDIR']
 	else:
@@ -325,8 +333,8 @@ def check_python_version(conf,minver=None):
 	if hasattr(conf,'define'):
 		conf.define('PYTHONDIR',pydir)
 		conf.define('PYTHONARCHDIR',pyarchdir)
-	conf.env['PYTHONDIR']=pydir
-	conf.env['PYTHONARCHDIR']=pyarchdir
+	conf.env.PYTHONDIR=pydir
+	conf.env.PYTHONARCHDIR=pyarchdir
 	pyver_full='.'.join(map(str,pyver_tuple[:3]))
 	if minver is None:
 		conf.msg('Checking for python version',pyver_full)
@@ -345,13 +353,13 @@ else:
 '''
 @conf
 def check_python_module(conf,module_name,condition=''):
-	msg="Checking for python module '%s'"%module_name
+	msg="Checking for python module %r"%module_name
 	if condition:
 		msg='%s (%s)'%(msg,condition)
 	conf.start_msg(msg)
 	try:
-		ret=conf.cmd_and_log(conf.env['PYTHON']+['-c',PYTHON_MODULE_TEMPLATE%module_name])
-	except Exception:
+		ret=conf.cmd_and_log(conf.env.PYTHON+['-c',PYTHON_MODULE_TEMPLATE%module_name])
+	except Errors.WafError:
 		conf.end_msg(False)
 		conf.fatal('Could not find the python module %r'%module_name)
 	ret=ret.strip()
@@ -376,16 +384,20 @@ def check_python_module(conf,module_name,condition=''):
 	conf.end_msg(ret)
 def configure(conf):
 	v=conf.env
-	v['PYTHON']=Options.options.python or os.environ.get('PYTHON',sys.executable)
-	if Options.options.pythondir:
-		v['PYTHONDIR']=Options.options.pythondir
-	if Options.options.pythonarchdir:
-		v['PYTHONARCHDIR']=Options.options.pythonarchdir
+	if getattr(Options.options,'pythondir',None):
+		v.PYTHONDIR=Options.options.pythondir
+	if getattr(Options.options,'pythonarchdir',None):
+		v.PYTHONARCHDIR=Options.options.pythonarchdir
+	if getattr(Options.options,'nopycache',None):
+		v.NOPYCACHE=Options.options.nopycache
+	if not v.PYTHON:
+		v.PYTHON=[getattr(Options.options,'python',None)or sys.executable]
+	v.PYTHON=Utils.to_list(v.PYTHON)
 	conf.find_program('python',var='PYTHON')
-	v['PYFLAGS']=''
-	v['PYFLAGS_OPT']='-O'
-	v['PYC']=getattr(Options.options,'pyc',1)
-
v['PYO']=getattr(Options.options,'pyo',1) + v.PYFLAGS='' + v.PYFLAGS_OPT='-O' + v.PYC=getattr(Options.options,'pyc',1) + v.PYO=getattr(Options.options,'pyo',1) try: v.PYTAG=conf.cmd_and_log(conf.env.PYTHON+['-c',"import imp;print(imp.get_tag())"]).strip() except Errors.WafError: @@ -394,6 +406,7 @@ def options(opt): pyopt=opt.add_option_group("Python Options") pyopt.add_option('--nopyc',dest='pyc',action='store_false',default=1,help='Do not install bytecode compiled .pyc files (configuration) [Default:install]') pyopt.add_option('--nopyo',dest='pyo',action='store_false',default=1,help='Do not install optimised compiled .pyo files (configuration) [Default:install]') + pyopt.add_option('--nopycache',dest='nopycache',action='store_true',help='Do not use __pycache__ directory to install objects [Default:auto]') pyopt.add_option('--python',dest="python",help='python binary to be used [Default: %s]'%sys.executable) pyopt.add_option('--pythondir',dest='pythondir',help='Installation path for python modules (py, platform-independent .py and .pyc files)') pyopt.add_option('--pythonarchdir',dest='pythonarchdir',help='Installation path for python extension (pyext, platform-dependent .so or .dylib files)') diff --git a/waflib/Tools/qt4.py b/waflib/Tools/qt4.py deleted file mode 100644 index 896c5b4..0000000 --- a/waflib/Tools/qt4.py +++ /dev/null @@ -1,442 +0,0 @@ -#! /usr/bin/env python -# encoding: utf-8 -# WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file - -try: - from xml.sax import make_parser - from xml.sax.handler import ContentHandler -except ImportError: - has_xml=False - ContentHandler=object -else: - has_xml=True -import os,sys -from waflib.Tools import cxx -from waflib import Task,Utils,Options,Errors,Context -from waflib.TaskGen import feature,after_method,extension -from waflib.Configure import conf -from waflib import Logs -MOC_H=['.h','.hpp','.hxx','.hh'] -EXT_RCC=['.qrc'] -EXT_UI=['.ui'] -EXT_QT4=['.cpp','.cc','.cxx','.C'] -QT4_LIBS="QtCore QtGui QtUiTools QtNetwork QtOpenGL QtSql QtSvg QtTest QtXml QtXmlPatterns QtWebKit Qt3Support QtHelp QtScript QtDeclarative QtDesigner" -class qxx(Task.classes['cxx']): - def __init__(self,*k,**kw): - Task.Task.__init__(self,*k,**kw) - self.moc_done=0 - def runnable_status(self): - if self.moc_done: - return Task.Task.runnable_status(self) - else: - for t in self.run_after: - if not t.hasrun: - return Task.ASK_LATER - self.add_moc_tasks() - return Task.Task.runnable_status(self) - def create_moc_task(self,h_node,m_node): - try: - moc_cache=self.generator.bld.moc_cache - except AttributeError: - moc_cache=self.generator.bld.moc_cache={} - try: - return moc_cache[h_node] - except KeyError: - tsk=moc_cache[h_node]=Task.classes['moc'](env=self.env,generator=self.generator) - tsk.set_inputs(h_node) - tsk.set_outputs(m_node) - if self.generator: - self.generator.tasks.append(tsk) - gen=self.generator.bld.producer - gen.outstanding.insert(0,tsk) - gen.total+=1 - return tsk - def moc_h_ext(self): - ext=[] - try: - ext=Options.options.qt_header_ext.split() - except AttributeError: - pass - if not ext: - ext=MOC_H - return ext - def add_moc_tasks(self): - node=self.inputs[0] - bld=self.generator.bld - try: - self.signature() - except KeyError: - pass - else: - delattr(self,'cache_sig') - include_nodes=[node.parent]+self.generator.includes_nodes - moctasks=[] - mocfiles=set([]) - for d in bld.raw_deps.get(self.uid(),[]): 
- if not d.endswith('.moc'): - continue - if d in mocfiles: - continue - mocfiles.add(d) - h_node=None - base2=d[:-4] - for x in include_nodes: - for e in self.moc_h_ext(): - h_node=x.find_node(base2+e) - if h_node: - break - if h_node: - m_node=h_node.change_ext('.moc') - break - else: - for k in EXT_QT4: - if base2.endswith(k): - for x in include_nodes: - h_node=x.find_node(base2) - if h_node: - break - if h_node: - m_node=h_node.change_ext(k+'.moc') - break - if not h_node: - raise Errors.WafError('No source found for %r which is a moc file'%d) - task=self.create_moc_task(h_node,m_node) - moctasks.append(task) - self.run_after.update(set(moctasks)) - self.moc_done=1 -class trans_update(Task.Task): - run_str='${QT_LUPDATE} ${SRC} -ts ${TGT}' - color='BLUE' -Task.update_outputs(trans_update) -class XMLHandler(ContentHandler): - def __init__(self): - self.buf=[] - self.files=[] - def startElement(self,name,attrs): - if name=='file': - self.buf=[] - def endElement(self,name): - if name=='file': - self.files.append(str(''.join(self.buf))) - def characters(self,cars): - self.buf.append(cars) -@extension(*EXT_RCC) -def create_rcc_task(self,node): - rcnode=node.change_ext('_rc.cpp') - self.create_task('rcc',node,rcnode) - cpptask=self.create_task('cxx',rcnode,rcnode.change_ext('.o')) - try: - self.compiled_tasks.append(cpptask) - except AttributeError: - self.compiled_tasks=[cpptask] - return cpptask -@extension(*EXT_UI) -def create_uic_task(self,node): - uictask=self.create_task('ui4',node) - uictask.outputs=[self.path.find_or_declare(self.env['ui_PATTERN']%node.name[:-3])] -@extension('.ts') -def add_lang(self,node): - self.lang=self.to_list(getattr(self,'lang',[]))+[node] -@feature('qt4') -@after_method('apply_link') -def apply_qt4(self): - if getattr(self,'lang',None): - qmtasks=[] - for x in self.to_list(self.lang): - if isinstance(x,str): - x=self.path.find_resource(x+'.ts') - qmtasks.append(self.create_task('ts2qm',x,x.change_ext('.qm'))) - if 
getattr(self,'update',None)and Options.options.trans_qt4: - cxxnodes=[a.inputs[0]for a in self.compiled_tasks]+[a.inputs[0]for a in self.tasks if getattr(a,'inputs',None)and a.inputs[0].name.endswith('.ui')] - for x in qmtasks: - self.create_task('trans_update',cxxnodes,x.inputs) - if getattr(self,'langname',None): - qmnodes=[x.outputs[0]for x in qmtasks] - rcnode=self.langname - if isinstance(rcnode,str): - rcnode=self.path.find_or_declare(rcnode+'.qrc') - t=self.create_task('qm2rcc',qmnodes,rcnode) - k=create_rcc_task(self,t.outputs[0]) - self.link_task.inputs.append(k.outputs[0]) - lst=[] - for flag in self.to_list(self.env['CXXFLAGS']): - if len(flag)<2:continue - f=flag[0:2] - if f in('-D','-I','/D','/I'): - if(f[0]=='/'): - lst.append('-'+flag[1:]) - else: - lst.append(flag) - self.env.append_value('MOC_FLAGS',lst) -@extension(*EXT_QT4) -def cxx_hook(self,node): - return self.create_compiled_task('qxx',node) -class rcc(Task.Task): - color='BLUE' - run_str='${QT_RCC} -name ${tsk.rcname()} ${SRC[0].abspath()} ${RCC_ST} -o ${TGT}' - ext_out=['.h'] - def rcname(self): - return os.path.splitext(self.inputs[0].name)[0] - def scan(self): - if not has_xml: - Logs.error('no xml support was found, the rcc dependencies will be incomplete!') - return([],[]) - parser=make_parser() - curHandler=XMLHandler() - parser.setContentHandler(curHandler) - fi=open(self.inputs[0].abspath(),'r') - try: - parser.parse(fi) - finally: - fi.close() - nodes=[] - names=[] - root=self.inputs[0].parent - for x in curHandler.files: - nd=root.find_resource(x) - if nd:nodes.append(nd) - else:names.append(x) - return(nodes,names) -class moc(Task.Task): - color='BLUE' - run_str='${QT_MOC} ${MOC_FLAGS} ${MOCCPPPATH_ST:INCPATHS} ${MOCDEFINES_ST:DEFINES} ${SRC} ${MOC_ST} ${TGT}' - def keyword(self): - return"Creating" - def __str__(self): - return self.outputs[0].path_from(self.generator.bld.launch_node()) -class ui4(Task.Task): - color='BLUE' - run_str='${QT_UIC} ${SRC} -o ${TGT}' - ext_out=['.h'] 
-class ts2qm(Task.Task): - color='BLUE' - run_str='${QT_LRELEASE} ${QT_LRELEASE_FLAGS} ${SRC} -qm ${TGT}' -class qm2rcc(Task.Task): - color='BLUE' - after='ts2qm' - def run(self): - txt='\n'.join(['<file>%s</file>'%k.path_from(self.outputs[0].parent)for k in self.inputs]) - code='<!DOCTYPE RCC><RCC version="1.0">\n<qresource>\n%s\n</qresource>\n</RCC>'%txt - self.outputs[0].write(code) -def configure(self): - self.find_qt4_binaries() - self.set_qt4_libs_to_check() - self.set_qt4_defines() - self.find_qt4_libraries() - self.add_qt4_rpath() - self.simplify_qt4_libs() -@conf -def find_qt4_binaries(self): - env=self.env - opt=Options.options - qtdir=getattr(opt,'qtdir','') - qtbin=getattr(opt,'qtbin','') - paths=[] - if qtdir: - qtbin=os.path.join(qtdir,'bin') - if not qtdir: - qtdir=os.environ.get('QT4_ROOT','') - qtbin=os.environ.get('QT4_BIN',None)or os.path.join(qtdir,'bin') - if qtbin: - paths=[qtbin] - if not qtdir: - paths=os.environ.get('PATH','').split(os.pathsep) - paths.append('/usr/share/qt4/bin/') - try: - lst=Utils.listdir('/usr/local/Trolltech/') - except OSError: - pass - else: - if lst: - lst.sort() - lst.reverse() - qtdir='/usr/local/Trolltech/%s/'%lst[0] - qtbin=os.path.join(qtdir,'bin') - paths.append(qtbin) - cand=None - prev_ver=['4','0','0'] - for qmk in('qmake-qt4','qmake4','qmake'): - try: - qmake=self.find_program(qmk,path_list=paths) - except self.errors.ConfigurationError: - pass - else: - try: - version=self.cmd_and_log(qmake+['-query','QT_VERSION']).strip() - except self.errors.WafError: - pass - else: - if version: - new_ver=version.split('.') - if new_ver>prev_ver: - cand=qmake - prev_ver=new_ver - if cand: - self.env.QMAKE=cand - else: - self.fatal('Could not find qmake for qt4') - qtbin=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_BINS']).strip()+os.sep - def find_bin(lst,var): - if var in env: - return - for f in lst: - try: - ret=self.find_program(f,path_list=paths) - except self.errors.ConfigurationError: - pass - else: - 
env[var]=ret - break - find_bin(['uic-qt3','uic3'],'QT_UIC3') - find_bin(['uic-qt4','uic'],'QT_UIC') - if not env.QT_UIC: - self.fatal('cannot find the uic compiler for qt4') - self.start_msg('Checking for uic version') - uicver=self.cmd_and_log(env.QT_UIC+["-version"],output=Context.BOTH) - uicver=''.join(uicver).strip() - uicver=uicver.replace('Qt User Interface Compiler ','').replace('User Interface Compiler for Qt','') - self.end_msg(uicver) - if uicver.find(' 3.')!=-1: - self.fatal('this uic compiler is for qt3, add uic for qt4 to your path') - find_bin(['moc-qt4','moc'],'QT_MOC') - find_bin(['rcc-qt4','rcc'],'QT_RCC') - find_bin(['lrelease-qt4','lrelease'],'QT_LRELEASE') - find_bin(['lupdate-qt4','lupdate'],'QT_LUPDATE') - env['UIC3_ST']='%s -o %s' - env['UIC_ST']='%s -o %s' - env['MOC_ST']='-o' - env['ui_PATTERN']='ui_%s.h' - env['QT_LRELEASE_FLAGS']=['-silent'] - env.MOCCPPPATH_ST='-I%s' - env.MOCDEFINES_ST='-D%s' -@conf -def find_qt4_libraries(self): - qtlibs=getattr(Options.options,'qtlibs',None)or os.environ.get("QT4_LIBDIR",None) - if not qtlibs: - try: - qtlibs=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_LIBS']).strip() - except Errors.WafError: - qtdir=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_PREFIX']).strip()+os.sep - qtlibs=os.path.join(qtdir,'lib') - self.msg('Found the Qt4 libraries in',qtlibs) - qtincludes=os.environ.get("QT4_INCLUDES",None)or self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_HEADERS']).strip() - env=self.env - if not'PKG_CONFIG_PATH'in os.environ: - os.environ['PKG_CONFIG_PATH']='%s:%s/pkgconfig:/usr/lib/qt4/lib/pkgconfig:/opt/qt4/lib/pkgconfig:/usr/lib/qt4/lib:/opt/qt4/lib'%(qtlibs,qtlibs) - try: - if os.environ.get("QT4_XCOMPILE",None): - raise self.errors.ConfigurationError() - self.check_cfg(atleast_pkgconfig_version='0.1') - except self.errors.ConfigurationError: - for i in self.qt4_vars: - uselib=i.upper() - if Utils.unversioned_sys_platform()=="darwin": - frameworkName=i+".framework" - 
qtDynamicLib=os.path.join(qtlibs,frameworkName,i) - if os.path.exists(qtDynamicLib): - env.append_unique('FRAMEWORK_'+uselib,i) - self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') - else: - self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('INCLUDES_'+uselib,os.path.join(qtlibs,frameworkName,'Headers')) - elif env.DEST_OS!="win32": - qtDynamicLib=os.path.join(qtlibs,"lib"+i+".so") - qtStaticLib=os.path.join(qtlibs,"lib"+i+".a") - if os.path.exists(qtDynamicLib): - env.append_unique('LIB_'+uselib,i) - self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') - elif os.path.exists(qtStaticLib): - env.append_unique('LIB_'+uselib,i) - self.msg('Checking for %s'%i,qtStaticLib,'GREEN') - else: - self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('LIBPATH_'+uselib,qtlibs) - env.append_unique('INCLUDES_'+uselib,qtincludes) - env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) - else: - for k in("lib%s.a","lib%s4.a","%s.lib","%s4.lib"): - lib=os.path.join(qtlibs,k%i) - if os.path.exists(lib): - env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) - self.msg('Checking for %s'%i,lib,'GREEN') - break - else: - self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('LIBPATH_'+uselib,qtlibs) - env.append_unique('INCLUDES_'+uselib,qtincludes) - env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) - uselib=i.upper()+"_debug" - for k in("lib%sd.a","lib%sd4.a","%sd.lib","%sd4.lib"): - lib=os.path.join(qtlibs,k%i) - if os.path.exists(lib): - env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) - self.msg('Checking for %s'%i,lib,'GREEN') - break - else: - self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('LIBPATH_'+uselib,qtlibs) - env.append_unique('INCLUDES_'+uselib,qtincludes) - env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) - else: - for i in self.qt4_vars_debug+self.qt4_vars: - self.check_cfg(package=i,args='--cflags --libs',mandatory=False) -@conf -def 
simplify_qt4_libs(self): - env=self.env - def process_lib(vars_,coreval): - for d in vars_: - var=d.upper() - if var=='QTCORE': - continue - value=env['LIBPATH_'+var] - if value: - core=env[coreval] - accu=[] - for lib in value: - if lib in core: - continue - accu.append(lib) - env['LIBPATH_'+var]=accu - process_lib(self.qt4_vars,'LIBPATH_QTCORE') - process_lib(self.qt4_vars_debug,'LIBPATH_QTCORE_DEBUG') -@conf -def add_qt4_rpath(self): - env=self.env - if getattr(Options.options,'want_rpath',False): - def process_rpath(vars_,coreval): - for d in vars_: - var=d.upper() - value=env['LIBPATH_'+var] - if value: - core=env[coreval] - accu=[] - for lib in value: - if var!='QTCORE': - if lib in core: - continue - accu.append('-Wl,--rpath='+lib) - env['RPATH_'+var]=accu - process_rpath(self.qt4_vars,'LIBPATH_QTCORE') - process_rpath(self.qt4_vars_debug,'LIBPATH_QTCORE_DEBUG') -@conf -def set_qt4_libs_to_check(self): - if not hasattr(self,'qt4_vars'): - self.qt4_vars=QT4_LIBS - self.qt4_vars=Utils.to_list(self.qt4_vars) - if not hasattr(self,'qt4_vars_debug'): - self.qt4_vars_debug=[a+'_debug'for a in self.qt4_vars] - self.qt4_vars_debug=Utils.to_list(self.qt4_vars_debug) -@conf -def set_qt4_defines(self): - if sys.platform!='win32': - return - for x in self.qt4_vars: - y=x[2:].upper() - self.env.append_unique('DEFINES_%s'%x.upper(),'QT_%s_LIB'%y) - self.env.append_unique('DEFINES_%s_DEBUG'%x.upper(),'QT_%s_LIB'%y) -def options(opt): - opt.add_option('--want-rpath',action='store_true',default=False,dest='want_rpath',help='enable the rpath for qt libraries') - opt.add_option('--header-ext',type='string',default='',help='header extension for moc files',dest='qt_header_ext') - for i in'qtdir qtbin qtlibs'.split(): - opt.add_option('--'+i,type='string',default='',dest=i) - opt.add_option('--translate',action="store_true",help="collect translation strings",dest="trans_qt4",default=False) diff --git a/waflib/Tools/qt5.py b/waflib/Tools/qt5.py index f69c79d..6f5f136 100644 --- 
a/waflib/Tools/qt5.py +++ b/waflib/Tools/qt5.py @@ -2,6 +2,7 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file +from __future__ import with_statement try: from xml.sax import make_parser from xml.sax.handler import ContentHandler @@ -10,54 +11,16 @@ except ImportError: ContentHandler=object else: has_xml=True -import os,sys +import os,sys,re from waflib.Tools import cxx from waflib import Task,Utils,Options,Errors,Context -from waflib.TaskGen import feature,after_method,extension +from waflib.TaskGen import feature,after_method,extension,before_method from waflib.Configure import conf from waflib import Logs MOC_H=['.h','.hpp','.hxx','.hh'] EXT_RCC=['.qrc'] EXT_UI=['.ui'] EXT_QT5=['.cpp','.cc','.cxx','.C'] -QT5_LIBS=''' -qtmain -Qt5Bluetooth -Qt5CLucene -Qt5Concurrent -Qt5Core -Qt5DBus -Qt5Declarative -Qt5DesignerComponents -Qt5Designer -Qt5Gui -Qt5Help -Qt5MultimediaQuick_p -Qt5Multimedia -Qt5MultimediaWidgets -Qt5Network -Qt5Nfc -Qt5OpenGL -Qt5Positioning -Qt5PrintSupport -Qt5Qml -Qt5QuickParticles -Qt5Quick -Qt5QuickTest -Qt5Script -Qt5ScriptTools -Qt5Sensors -Qt5SerialPort -Qt5Sql -Qt5Svg -Qt5Test -Qt5WebKit -Qt5WebKitWidgets -Qt5Widgets -Qt5WinExtras -Qt5X11Extras -Qt5XmlPatterns -Qt5Xml''' class qxx(Task.classes['cxx']): def __init__(self,*k,**kw): Task.Task.__init__(self,*k,**kw) @@ -82,23 +45,15 @@ class qxx(Task.classes['cxx']): tsk=moc_cache[h_node]=Task.classes['moc'](env=self.env,generator=self.generator) tsk.set_inputs(h_node) tsk.set_outputs(m_node) + tsk.env.append_unique('MOC_FLAGS','-i') if self.generator: self.generator.tasks.append(tsk) gen=self.generator.bld.producer - gen.outstanding.insert(0,tsk) + gen.outstanding.append(tsk) gen.total+=1 return tsk else: delattr(self,'cache_sig') - def moc_h_ext(self): - ext=[] - try: - ext=Options.options.qt_header_ext.split() - except AttributeError: - pass - if not ext: - ext=MOC_H - return ext def add_moc_tasks(self): node=self.inputs[0] 
bld=self.generator.bld @@ -110,7 +65,7 @@ class qxx(Task.classes['cxx']): delattr(self,'cache_sig') include_nodes=[node.parent]+self.generator.includes_nodes moctasks=[] - mocfiles=set([]) + mocfiles=set() for d in bld.raw_deps.get(self.uid(),[]): if not d.endswith('.moc'): continue @@ -119,25 +74,21 @@ class qxx(Task.classes['cxx']): mocfiles.add(d) h_node=None base2=d[:-4] - for x in include_nodes: - for e in self.moc_h_ext(): - h_node=x.find_node(base2+e) - if h_node: - break - if h_node: - m_node=h_node.change_ext('.moc') - break + prefix=node.name[:node.name.rfind('.')] + if base2==prefix: + h_node=node else: - for k in EXT_QT5: - if base2.endswith(k): - for x in include_nodes: - h_node=x.find_node(base2) - if h_node: - break + for x in include_nodes: + for e in MOC_H: + h_node=x.find_node(base2+e) if h_node: - m_node=h_node.change_ext(k+'.moc') break - if not h_node: + else: + continue + break + if h_node: + m_node=h_node.change_ext('.moc') + else: raise Errors.WafError('No source found for %r which is a moc file'%d) task=self.create_moc_task(h_node,m_node) moctasks.append(task) @@ -146,9 +97,9 @@ class qxx(Task.classes['cxx']): class trans_update(Task.Task): run_str='${QT_LUPDATE} ${SRC} -ts ${TGT}' color='BLUE' -Task.update_outputs(trans_update) class XMLHandler(ContentHandler): def __init__(self): + ContentHandler.__init__(self) self.buf=[] self.files=[] def startElement(self,name,attrs): @@ -161,7 +112,7 @@ class XMLHandler(ContentHandler): self.buf.append(cars) @extension(*EXT_RCC) def create_rcc_task(self,node): - rcnode=node.change_ext('_rc.cpp') + rcnode=node.change_ext('_rc.%d.cpp'%self.idx) self.create_task('rcc',node,rcnode) cpptask=self.create_task('cxx',rcnode,rcnode.change_ext('.o')) try: @@ -171,12 +122,28 @@ def create_rcc_task(self,node): return cpptask @extension(*EXT_UI) def create_uic_task(self,node): - uictask=self.create_task('ui5',node) - uictask.outputs=[self.path.find_or_declare(self.env['ui_PATTERN']%node.name[:-3])] + try: + 
uic_cache=self.bld.uic_cache + except AttributeError: + uic_cache=self.bld.uic_cache={} + if node not in uic_cache: + uictask=uic_cache[node]=self.create_task('ui5',node) + uictask.outputs=[node.parent.find_or_declare(self.env.ui_PATTERN%node.name[:-3])] @extension('.ts') def add_lang(self,node): self.lang=self.to_list(getattr(self,'lang',[]))+[node] @feature('qt5') +@before_method('process_source') +def process_mocs(self): + lst=self.to_nodes(getattr(self,'moc',[])) + self.source=self.to_list(getattr(self,'source',[])) + for x in lst: + prefix=x.name[:x.name.rfind('.')] + moc_target='moc_%s.%d.cpp'%(prefix,self.idx) + moc_node=x.parent.find_or_declare(moc_target) + self.source.append(moc_node) + self.create_task('moc',x,moc_node) +@feature('qt5') @after_method('apply_link') def apply_qt5(self): if getattr(self,'lang',None): @@ -184,22 +151,23 @@ def apply_qt5(self): for x in self.to_list(self.lang): if isinstance(x,str): x=self.path.find_resource(x+'.ts') - qmtasks.append(self.create_task('ts2qm',x,x.change_ext('.qm'))) + qmtasks.append(self.create_task('ts2qm',x,x.change_ext('.%d.qm'%self.idx))) if getattr(self,'update',None)and Options.options.trans_qt5: - cxxnodes=[a.inputs[0]for a in self.compiled_tasks]+[a.inputs[0]for a in self.tasks if getattr(a,'inputs',None)and a.inputs[0].name.endswith('.ui')] + cxxnodes=[a.inputs[0]for a in self.compiled_tasks]+[a.inputs[0]for a in self.tasks if a.inputs and a.inputs[0].name.endswith('.ui')] for x in qmtasks: self.create_task('trans_update',cxxnodes,x.inputs) if getattr(self,'langname',None): qmnodes=[x.outputs[0]for x in qmtasks] rcnode=self.langname if isinstance(rcnode,str): - rcnode=self.path.find_or_declare(rcnode+'.qrc') + rcnode=self.path.find_or_declare(rcnode+('.%d.qrc'%self.idx)) t=self.create_task('qm2rcc',qmnodes,rcnode) k=create_rcc_task(self,t.outputs[0]) self.link_task.inputs.append(k.outputs[0]) lst=[] - for flag in self.to_list(self.env['CXXFLAGS']): - if len(flag)<2:continue + for flag in 
self.to_list(self.env.CXXFLAGS): + if len(flag)<2: + continue f=flag[0:2] if f in('-D','-I','/D','/I'): if(f[0]=='/'): @@ -218,27 +186,30 @@ class rcc(Task.Task): return os.path.splitext(self.inputs[0].name)[0] def scan(self): if not has_xml: - Logs.error('no xml support was found, the rcc dependencies will be incomplete!') + Logs.error('No xml.sax support was found, rcc dependencies will be incomplete!') return([],[]) parser=make_parser() curHandler=XMLHandler() parser.setContentHandler(curHandler) - fi=open(self.inputs[0].abspath(),'r') - try: - parser.parse(fi) - finally: - fi.close() + with open(self.inputs[0].abspath(),'r')as f: + parser.parse(f) nodes=[] names=[] root=self.inputs[0].parent for x in curHandler.files: nd=root.find_resource(x) - if nd:nodes.append(nd) - else:names.append(x) + if nd: + nodes.append(nd) + else: + names.append(x) return(nodes,names) + def quote_flag(self,x): + return x class moc(Task.Task): color='BLUE' run_str='${QT_MOC} ${MOC_FLAGS} ${MOCCPPPATH_ST:INCPATHS} ${MOCDEFINES_ST:DEFINES} ${SRC} ${MOC_ST} ${TGT}' + def quote_flag(self,x): + return x class ui5(Task.Task): color='BLUE' run_str='${QT_UIC} ${SRC} -o ${TGT}' @@ -255,11 +226,36 @@ class qm2rcc(Task.Task): self.outputs[0].write(code) def configure(self): self.find_qt5_binaries() + self.set_qt5_libs_dir() self.set_qt5_libs_to_check() self.set_qt5_defines() self.find_qt5_libraries() self.add_qt5_rpath() self.simplify_qt5_libs() + if not has_xml: + Logs.error('No xml.sax support was found, rcc dependencies will be incomplete!') + if'COMPILER_CXX'not in self.env: + self.fatal('No CXX compiler defined: did you forget to configure compiler_cxx first?') + frag='#include <QApplication>\nint main(int argc, char **argv) {return 0;}\n' + uses='QT5CORE QT5WIDGETS QT5GUI' + for flag in[[],'-fPIE','-fPIC','-std=c++11',['-std=c++11','-fPIE'],['-std=c++11','-fPIC']]: + msg='See if Qt files compile ' + if flag: + msg+='with %s'%flag + try: + self.check(features='qt5 
cxx',use=uses,uselib_store='qt5',cxxflags=flag,fragment=frag,msg=msg) + except self.errors.ConfigurationError: + pass + else: + break + else: + self.fatal('Could not build a simple Qt application') + if Utils.unversioned_sys_platform()=='freebsd': + frag='#include <QApplication>\nint main(int argc, char **argv) { QApplication app(argc, argv); return NULL != (void*) (&app);}\n' + try: + self.check(features='qt5 cxx cxxprogram',use=uses,fragment=frag,msg='Can we link Qt programs on FreeBSD directly?') + except self.errors.ConfigurationError: + self.check(features='qt5 cxx cxxprogram',use=uses,uselib_store='qt5',libpath='/usr/local/lib',fragment=frag,msg='Is /usr/local/lib required?') @conf def find_qt5_binaries(self): env=self.env @@ -270,13 +266,13 @@ def find_qt5_binaries(self): if qtdir: qtbin=os.path.join(qtdir,'bin') if not qtdir: - qtdir=os.environ.get('QT5_ROOT','') - qtbin=os.environ.get('QT5_BIN',None)or os.path.join(qtdir,'bin') + qtdir=self.environ.get('QT5_ROOT','') + qtbin=self.environ.get('QT5_BIN')or os.path.join(qtdir,'bin') if qtbin: paths=[qtbin] if not qtdir: - paths=os.environ.get('PATH','').split(os.pathsep) - paths.append('/usr/share/qt5/bin/') + paths=self.environ.get('PATH','').split(os.pathsep) + paths.extend(['/usr/share/qt5/bin','/usr/local/lib/qt5/bin']) try: lst=Utils.listdir('/usr/local/Trolltech/') except OSError: @@ -323,7 +319,7 @@ def find_qt5_binaries(self): self.env.QMAKE=cand else: self.fatal('Could not find qmake for qt5') - self.env.QT_INSTALL_BINS=qtbin=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_BINS']).strip()+os.sep + self.env.QT_HOST_BINS=qtbin=self.cmd_and_log(self.env.QMAKE+['-query','QT_HOST_BINS']).strip() paths.insert(0,qtbin) def find_bin(lst,var): if var in env: @@ -345,88 +341,89 @@ def find_qt5_binaries(self): uicver=uicver.replace('Qt User Interface Compiler ','').replace('User Interface Compiler for Qt','') self.end_msg(uicver) if uicver.find(' 3.')!=-1 or uicver.find(' 4.')!=-1: - self.fatal('this uic 
compiler is for qt3 or qt5, add uic for qt5 to your path') + self.fatal('this uic compiler is for qt3 or qt4, add uic for qt5 to your path') find_bin(['moc-qt5','moc'],'QT_MOC') find_bin(['rcc-qt5','rcc'],'QT_RCC') find_bin(['lrelease-qt5','lrelease'],'QT_LRELEASE') find_bin(['lupdate-qt5','lupdate'],'QT_LUPDATE') - env['UIC_ST']='%s -o %s' - env['MOC_ST']='-o' - env['ui_PATTERN']='ui_%s.h' - env['QT_LRELEASE_FLAGS']=['-silent'] + env.UIC_ST='%s -o %s' + env.MOC_ST='-o' + env.ui_PATTERN='ui_%s.h' + env.QT_LRELEASE_FLAGS=['-silent'] env.MOCCPPPATH_ST='-I%s' env.MOCDEFINES_ST='-D%s' @conf -def find_qt5_libraries(self): - qtlibs=getattr(Options.options,'qtlibs',None)or os.environ.get("QT5_LIBDIR",None) +def set_qt5_libs_dir(self): + env=self.env + qtlibs=getattr(Options.options,'qtlibs',None)or self.environ.get('QT5_LIBDIR') if not qtlibs: try: - qtlibs=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_LIBS']).strip() + qtlibs=self.cmd_and_log(env.QMAKE+['-query','QT_INSTALL_LIBS']).strip() except Errors.WafError: - qtdir=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_PREFIX']).strip()+os.sep + qtdir=self.cmd_and_log(env.QMAKE+['-query','QT_INSTALL_PREFIX']).strip() qtlibs=os.path.join(qtdir,'lib') self.msg('Found the Qt5 libraries in',qtlibs) - qtincludes=os.environ.get("QT5_INCLUDES",None)or self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_HEADERS']).strip() + env.QTLIBS=qtlibs +@conf +def find_single_qt5_lib(self,name,uselib,qtlibs,qtincludes,force_static): + env=self.env + if force_static: + exts=('.a','.lib') + prefix='STLIB' + else: + exts=('.so','.lib') + prefix='LIB' + def lib_names(): + for x in exts: + for k in('','5')if Utils.is_win32 else['']: + for p in('lib',''): + yield(p,name,k,x) + for tup in lib_names(): + k=''.join(tup) + path=os.path.join(qtlibs,k) + if os.path.exists(path): + if env.DEST_OS=='win32': + libval=''.join(tup[:-1]) + else: + libval=name + env.append_unique(prefix+'_'+uselib,libval) + 
env.append_unique('%sPATH_%s'%(prefix,uselib),qtlibs) + env.append_unique('INCLUDES_'+uselib,qtincludes) + env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,name.replace('Qt5','Qt'))) + return k + return False +@conf +def find_qt5_libraries(self): env=self.env - if not'PKG_CONFIG_PATH'in os.environ: - os.environ['PKG_CONFIG_PATH']='%s:%s/pkgconfig:/usr/lib/qt5/lib/pkgconfig:/opt/qt5/lib/pkgconfig:/usr/lib/qt5/lib:/opt/qt5/lib'%(qtlibs,qtlibs) + qtincludes=self.environ.get('QT5_INCLUDES')or self.cmd_and_log(env.QMAKE+['-query','QT_INSTALL_HEADERS']).strip() + force_static=self.environ.get('QT5_FORCE_STATIC') try: - if os.environ.get("QT5_XCOMPILE",None): - raise self.errors.ConfigurationError() + if self.environ.get('QT5_XCOMPILE'): + self.fatal('QT5_XCOMPILE Disables pkg-config detection') self.check_cfg(atleast_pkgconfig_version='0.1') except self.errors.ConfigurationError: for i in self.qt5_vars: uselib=i.upper() - if Utils.unversioned_sys_platform()=="darwin": - frameworkName=i+".framework" - qtDynamicLib=os.path.join(qtlibs,frameworkName,i) + if Utils.unversioned_sys_platform()=='darwin': + fwk=i.replace('Qt5','Qt') + frameworkName=fwk+'.framework' + qtDynamicLib=os.path.join(env.QTLIBS,frameworkName,fwk) if os.path.exists(qtDynamicLib): - env.append_unique('FRAMEWORK_'+uselib,i) + env.append_unique('FRAMEWORK_'+uselib,fwk) + env.append_unique('FRAMEWORKPATH_'+uselib,env.QTLIBS) self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') else: self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('INCLUDES_'+uselib,os.path.join(qtlibs,frameworkName,'Headers')) - elif env.DEST_OS!="win32": - qtDynamicLib=os.path.join(qtlibs,"lib"+i+".so") - qtStaticLib=os.path.join(qtlibs,"lib"+i+".a") - if os.path.exists(qtDynamicLib): - env.append_unique('LIB_'+uselib,i) - self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') - elif os.path.exists(qtStaticLib): - env.append_unique('LIB_'+uselib,i) - self.msg('Checking for %s'%i,qtStaticLib,'GREEN') - else: - 
self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('LIBPATH_'+uselib,qtlibs) - env.append_unique('INCLUDES_'+uselib,qtincludes) - env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) + env.append_unique('INCLUDES_'+uselib,os.path.join(env.QTLIBS,frameworkName,'Headers')) else: - for k in("lib%s.a","lib%s5.a","%s.lib","%s5.lib"): - lib=os.path.join(qtlibs,k%i) - if os.path.exists(lib): - env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) - self.msg('Checking for %s'%i,lib,'GREEN') - break - else: - self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('LIBPATH_'+uselib,qtlibs) - env.append_unique('INCLUDES_'+uselib,qtincludes) - env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i.replace('Qt5','Qt'))) - uselib=i.upper()+"_debug" - for k in("lib%sd.a","lib%sd5.a","%sd.lib","%sd5.lib"): - lib=os.path.join(qtlibs,k%i) - if os.path.exists(lib): - env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) - self.msg('Checking for %s'%i,lib,'GREEN') - break - else: - self.msg('Checking for %s'%i,False,'YELLOW') - env.append_unique('LIBPATH_'+uselib,qtlibs) - env.append_unique('INCLUDES_'+uselib,qtincludes) - env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i.replace('Qt5','Qt'))) + ret=self.find_single_qt5_lib(i,uselib,env.QTLIBS,qtincludes,force_static) + if not force_static and not ret: + ret=self.find_single_qt5_lib(i,uselib,env.QTLIBS,qtincludes,True) + self.msg('Checking for %s'%i,ret,'GREEN'if ret else'YELLOW') else: - for i in self.qt5_vars_debug+self.qt5_vars: - self.check_cfg(package=i,args='--cflags --libs',mandatory=False) + path='%s:%s:%s/pkgconfig:/usr/lib/qt5/lib/pkgconfig:/opt/qt5/lib/pkgconfig:/usr/lib/qt5/lib:/opt/qt5/lib'%(self.environ.get('PKG_CONFIG_PATH',''),env.QTLIBS,env.QTLIBS) + for i in self.qt5_vars: + self.check_cfg(package=i,args='--cflags --libs',mandatory=False,force_static=force_static,pkg_config_path=path) @conf def simplify_qt5_libs(self): env=self.env @@ 
-445,7 +442,6 @@ def simplify_qt5_libs(self): accu.append(lib) env['LIBPATH_'+var]=accu process_lib(self.qt5_vars,'LIBPATH_QTCORE') - process_lib(self.qt5_vars_debug,'LIBPATH_QTCORE_DEBUG') @conf def add_qt5_rpath(self): env=self.env @@ -464,15 +460,28 @@ def add_qt5_rpath(self): accu.append('-Wl,--rpath='+lib) env['RPATH_'+var]=accu process_rpath(self.qt5_vars,'LIBPATH_QTCORE') - process_rpath(self.qt5_vars_debug,'LIBPATH_QTCORE_DEBUG') @conf def set_qt5_libs_to_check(self): - if not hasattr(self,'qt5_vars'): - self.qt5_vars=QT5_LIBS - self.qt5_vars=Utils.to_list(self.qt5_vars) - if not hasattr(self,'qt5_vars_debug'): - self.qt5_vars_debug=[a+'_debug'for a in self.qt5_vars] - self.qt5_vars_debug=Utils.to_list(self.qt5_vars_debug) + self.qt5_vars=Utils.to_list(getattr(self,'qt5_vars',[])) + if not self.qt5_vars: + dirlst=Utils.listdir(self.env.QTLIBS) + pat=self.env.cxxshlib_PATTERN + if Utils.is_win32: + pat=pat.replace('.dll','.lib') + if self.environ.get('QT5_FORCE_STATIC'): + pat=self.env.cxxstlib_PATTERN + if Utils.unversioned_sys_platform()=='darwin': + pat="%s\.framework" + re_qt=re.compile(pat%'Qt5?(?P<name>.*)'+'$') + for x in dirlst: + m=re_qt.match(x) + if m: + self.qt5_vars.append("Qt5%s"%m.group('name')) + if not self.qt5_vars: + self.fatal('cannot find any Qt5 library (%r)'%self.env.QTLIBS) + qtextralibs=getattr(Options.options,'qtextralibs',None) + if qtextralibs: + self.qt5_vars.extend(qtextralibs.split(',')) @conf def set_qt5_defines(self): if sys.platform!='win32': @@ -480,10 +489,9 @@ def set_qt5_defines(self): for x in self.qt5_vars: y=x.replace('Qt5','Qt')[2:].upper() self.env.append_unique('DEFINES_%s'%x.upper(),'QT_%s_LIB'%y) - self.env.append_unique('DEFINES_%s_DEBUG'%x.upper(),'QT_%s_LIB'%y) def options(opt): opt.add_option('--want-rpath',action='store_true',default=False,dest='want_rpath',help='enable the rpath for qt libraries') - opt.add_option('--header-ext',type='string',default='',help='header extension for moc 
files',dest='qt_header_ext') for i in'qtdir qtbin qtlibs'.split(): opt.add_option('--'+i,type='string',default='',dest=i) - opt.add_option('--translate',action="store_true",help="collect translation strings",dest="trans_qt5",default=False) + opt.add_option('--translate',action='store_true',help='collect translation strings',dest='trans_qt5',default=False) + opt.add_option('--qtextralibs',type='string',default='',dest='qtextralibs',help='additional qt libraries on the system to add to default ones, comma separated') diff --git a/waflib/Tools/ruby.py b/waflib/Tools/ruby.py index e81a6ef..887234f 100644 --- a/waflib/Tools/ruby.py +++ b/waflib/Tools/ruby.py @@ -3,11 +3,11 @@ # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file import os -from waflib import Options,Utils,Task +from waflib import Errors,Options,Task,Utils from waflib.TaskGen import before_method,feature,extension from waflib.Configure import conf @feature('rubyext') -@before_method('apply_incpaths','apply_lib_vars','apply_bundle','apply_link') +@before_method('apply_incpaths','process_source','apply_bundle','apply_link') def init_rubyext(self): self.install_path='${ARCHDIR_RUBY}' self.uselib=self.to_list(getattr(self,'uselib','')) @@ -16,24 +16,20 @@ def init_rubyext(self): if not'RUBYEXT'in self.uselib: self.uselib.append('RUBYEXT') @feature('rubyext') -@before_method('apply_link','propagate_uselib') +@before_method('apply_link','propagate_uselib_vars') def apply_ruby_so_name(self): - self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['rubyext_PATTERN'] + self.env.cshlib_PATTERN=self.env.cxxshlib_PATTERN=self.env.rubyext_PATTERN @conf def check_ruby_version(self,minver=()): - if Options.options.rubybinary: - self.env.RUBY=Options.options.rubybinary - else: - self.find_program('ruby',var='RUBY') - ruby=self.env.RUBY + ruby=self.find_program('ruby',var='RUBY',value=Options.options.rubybinary) try: version=self.cmd_and_log(ruby+['-e','puts defined?(VERSION) ? 
VERSION : RUBY_VERSION']).strip() - except Exception: + except Errors.WafError: self.fatal('could not determine ruby version') self.env.RUBY_VERSION=version try: - ver=tuple(map(int,version.split("."))) - except Exception: + ver=tuple(map(int,version.split('.'))) + except Errors.WafError: self.fatal('unsupported ruby version %r'%version) cver='' if minver: @@ -86,7 +82,7 @@ def check_ruby_module(self,module_name): self.start_msg('Ruby module %s'%module_name) try: self.cmd_and_log(self.env.RUBY+['-e','require \'%s\';puts 1'%module_name]) - except Exception: + except Errors.WafError: self.end_msg(False) self.fatal('Could not find the ruby module %r'%module_name) self.end_msg(True) diff --git a/waflib/Tools/suncc.py b/waflib/Tools/suncc.py index f014abf..676c884 100644 --- a/waflib/Tools/suncc.py +++ b/waflib/Tools/suncc.py @@ -2,6 +2,7 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file +from waflib import Errors from waflib.Tools import ccroot,ar from waflib.Configure import conf @conf @@ -10,33 +11,34 @@ def find_scc(conf): cc=conf.find_program('cc',var='CC') try: conf.cmd_and_log(cc+['-flags']) - except Exception: + except Errors.WafError: conf.fatal('%r is not a Sun compiler'%cc) v.CC_NAME='sun' conf.get_suncc_version(cc) @conf def scc_common_flags(conf): v=conf.env - v['CC_SRC_F']=[] - v['CC_TGT_F']=['-c','-o'] - if not v['LINK_CC']:v['LINK_CC']=v['CC'] - v['CCLNK_SRC_F']='' - v['CCLNK_TGT_F']=['-o'] - v['CPPPATH_ST']='-I%s' - v['DEFINES_ST']='-D%s' - v['LIB_ST']='-l%s' - v['LIBPATH_ST']='-L%s' - v['STLIB_ST']='-l%s' - v['STLIBPATH_ST']='-L%s' - v['SONAME_ST']='-Wl,-h,%s' - v['SHLIB_MARKER']='-Bdynamic' - v['STLIB_MARKER']='-Bstatic' - v['cprogram_PATTERN']='%s' - v['CFLAGS_cshlib']=['-xcode=pic32','-DPIC'] - v['LINKFLAGS_cshlib']=['-G'] - v['cshlib_PATTERN']='lib%s.so' - v['LINKFLAGS_cstlib']=['-Bstatic'] - v['cstlib_PATTERN']='lib%s.a' + v.CC_SRC_F=[] + v.CC_TGT_F=['-c','-o',''] + if not v.LINK_CC: + 
v.LINK_CC=v.CC + v.CCLNK_SRC_F='' + v.CCLNK_TGT_F=['-o',''] + v.CPPPATH_ST='-I%s' + v.DEFINES_ST='-D%s' + v.LIB_ST='-l%s' + v.LIBPATH_ST='-L%s' + v.STLIB_ST='-l%s' + v.STLIBPATH_ST='-L%s' + v.SONAME_ST='-Wl,-h,%s' + v.SHLIB_MARKER='-Bdynamic' + v.STLIB_MARKER='-Bstatic' + v.cprogram_PATTERN='%s' + v.CFLAGS_cshlib=['-xcode=pic32','-DPIC'] + v.LINKFLAGS_cshlib=['-G'] + v.cshlib_PATTERN='lib%s.so' + v.LINKFLAGS_cstlib=['-Bstatic'] + v.cstlib_PATTERN='lib%s.a' def configure(conf): conf.find_scc() conf.find_ar() diff --git a/waflib/Tools/suncxx.py b/waflib/Tools/suncxx.py index 7130fdf..0047098 100644 --- a/waflib/Tools/suncxx.py +++ b/waflib/Tools/suncxx.py @@ -2,6 +2,7 @@ # encoding: utf-8 # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file +from waflib import Errors from waflib.Tools import ccroot,ar from waflib.Configure import conf @conf @@ -10,33 +11,34 @@ def find_sxx(conf): cc=conf.find_program(['CC','c++'],var='CXX') try: conf.cmd_and_log(cc+['-flags']) - except Exception: + except Errors.WafError: conf.fatal('%r is not a Sun compiler'%cc) v.CXX_NAME='sun' conf.get_suncc_version(cc) @conf def sxx_common_flags(conf): v=conf.env - v['CXX_SRC_F']=[] - v['CXX_TGT_F']=['-c','-o'] - if not v['LINK_CXX']:v['LINK_CXX']=v['CXX'] - v['CXXLNK_SRC_F']=[] - v['CXXLNK_TGT_F']=['-o'] - v['CPPPATH_ST']='-I%s' - v['DEFINES_ST']='-D%s' - v['LIB_ST']='-l%s' - v['LIBPATH_ST']='-L%s' - v['STLIB_ST']='-l%s' - v['STLIBPATH_ST']='-L%s' - v['SONAME_ST']='-Wl,-h,%s' - v['SHLIB_MARKER']='-Bdynamic' - v['STLIB_MARKER']='-Bstatic' - v['cxxprogram_PATTERN']='%s' - v['CXXFLAGS_cxxshlib']=['-xcode=pic32','-DPIC'] - v['LINKFLAGS_cxxshlib']=['-G'] - v['cxxshlib_PATTERN']='lib%s.so' - v['LINKFLAGS_cxxstlib']=['-Bstatic'] - v['cxxstlib_PATTERN']='lib%s.a' + v.CXX_SRC_F=[] + v.CXX_TGT_F=['-c','-o',''] + if not v.LINK_CXX: + v.LINK_CXX=v.CXX + v.CXXLNK_SRC_F=[] + v.CXXLNK_TGT_F=['-o',''] + v.CPPPATH_ST='-I%s' + v.DEFINES_ST='-D%s' + v.LIB_ST='-l%s' + v.LIBPATH_ST='-L%s' + 
v.STLIB_ST='-l%s' + v.STLIBPATH_ST='-L%s' + v.SONAME_ST='-Wl,-h,%s' + v.SHLIB_MARKER='-Bdynamic' + v.STLIB_MARKER='-Bstatic' + v.cxxprogram_PATTERN='%s' + v.CXXFLAGS_cxxshlib=['-xcode=pic32','-DPIC'] + v.LINKFLAGS_cxxshlib=['-G'] + v.cxxshlib_PATTERN='lib%s.so' + v.LINKFLAGS_cxxstlib=['-Bstatic'] + v.cxxstlib_PATTERN='lib%s.a' def configure(conf): conf.find_sxx() conf.find_ar() diff --git a/waflib/Tools/tex.py b/waflib/Tools/tex.py index a91fd91..3a208d8 100644 --- a/waflib/Tools/tex.py +++ b/waflib/Tools/tex.py @@ -9,19 +9,22 @@ re_bibunit=re.compile(r'\\(?P<type>putbib)\[(?P<file>[^\[\]]*)\]',re.M) def bibunitscan(self): node=self.inputs[0] nodes=[] - if not node:return nodes + if not node: + return nodes code=node.read() for match in re_bibunit.finditer(code): path=match.group('file') if path: + found=None for k in('','.bib'): - Logs.debug('tex: trying %s%s'%(path,k)) + Logs.debug('tex: trying %s%s',path,k) fi=node.parent.find_resource(path+k) if fi: + found=True nodes.append(fi) - else: - Logs.debug('tex: could not find %s'%path) - Logs.debug("tex: found the following bibunit files: %s"%nodes) + if not found: + Logs.debug('tex: could not find %s',path) + Logs.debug('tex: found the following bibunit files: %s',nodes) return nodes exts_deps_tex=['','.ltx','.tex','.bib','.pdf','.png','.eps','.ps','.sty'] exts_tex=['.ltx','.tex'] @@ -42,14 +45,9 @@ class tex(Task.Task): Execute the program **makeglossaries** """ def exec_command(self,cmd,**kw): - bld=self.generator.bld - Logs.info('runner: %r'%cmd) - try: - if not kw.get('cwd',None): - kw['cwd']=bld.cwd - except AttributeError: - bld.cwd=kw['cwd']=bld.variant_dir - return Utils.subprocess.Popen(cmd,**kw).wait() + if self.env.PROMPT_LATEX: + kw['stdout']=kw['stderr']=None + return super(tex,self).exec_command(cmd,**kw) def scan_aux(self,node): nodes=[node] re_aux=re.compile(r'\\@input{(?P<file>[^{}]*)}',re.M) @@ -59,7 +57,7 @@ class tex(Task.Task): path=match.group('file') found=node.parent.find_or_declare(path) if 
found and found not in nodes: - Logs.debug('tex: found aux node '+found.abspath()) + Logs.debug('tex: found aux node %r',found) nodes.append(found) parse_node(found) parse_node(node) @@ -69,13 +67,13 @@ class tex(Task.Task): nodes=[] names=[] seen=[] - if not node:return(nodes,names) + if not node: + return(nodes,names) def parse_node(node): if node in seen: return seen.append(node) code=node.read() - global re_tex for match in re_tex.finditer(code): multibib=match.group('type') if multibib and multibib.startswith('bibliography'): @@ -90,7 +88,7 @@ class tex(Task.Task): found=None for k in exts_deps_tex: for up in self.texinputs_nodes: - Logs.debug('tex: trying %s%s'%(path,k)) + Logs.debug('tex: trying %s%s',path,k) found=up.find_resource(path+k) if found: break @@ -114,20 +112,26 @@ class tex(Task.Task): parse_node(node) for x in nodes: x.parent.get_bld().mkdir() - Logs.debug("tex: found the following : %s and names %s"%(nodes,names)) + Logs.debug("tex: found the following : %s and names %s",nodes,names) return(nodes,names) def check_status(self,msg,retcode): if retcode!=0: - raise Errors.WafError("%r command exit status %r"%(msg,retcode)) + raise Errors.WafError('%r command exit status %r'%(msg,retcode)) + def info(self,*k,**kw): + try: + info=self.generator.bld.conf.logger.info + except AttributeError: + info=Logs.info + info(*k,**kw) def bibfile(self): for aux_node in self.aux_nodes: try: ct=aux_node.read() except EnvironmentError: - Logs.error('Error reading %s: %r'%aux_node.abspath()) + Logs.error('Error reading %s: %r',aux_node.abspath()) continue if g_bibtex_re.findall(ct): - Logs.info('calling bibtex') + self.info('calling bibtex') self.env.env={} self.env.env.update(os.environ) self.env.env.update({'BIBINPUTS':self.texinputs(),'BSTINPUTS':self.texinputs()}) @@ -148,7 +152,7 @@ class tex(Task.Task): if bibunits: fn=['bu'+str(i)for i in range(1,len(bibunits)+1)] if fn: - Logs.info('calling bibtex on bibunits') + self.info('calling bibtex on bibunits') for f 
in fn: self.env.env={'BIBINPUTS':self.texinputs(),'BSTINPUTS':self.texinputs()} self.env.SRCFILE=f @@ -159,9 +163,9 @@ class tex(Task.Task): idx_path=self.idx_node.abspath() os.stat(idx_path) except OSError: - Logs.info('index file %s absent, not calling makeindex'%idx_path) + self.info('index file %s absent, not calling makeindex',idx_path) else: - Logs.info('calling makeindex') + self.info('calling makeindex') self.env.SRCFILE=self.idx_node.name self.env.env={} self.check_status('error when calling makeindex %s'%idx_path,self.makeindex_fun()) @@ -177,7 +181,7 @@ class tex(Task.Task): try: ct=aux_node.read() except EnvironmentError: - Logs.error('Error reading %s: %r'%aux_node.abspath()) + Logs.error('Error reading %s: %r',aux_node.abspath()) continue if g_glossaries_re.findall(ct): if not self.env.MAKEGLOSSARIES: @@ -190,12 +194,12 @@ class tex(Task.Task): return os.pathsep.join([k.abspath()for k in self.texinputs_nodes])+os.pathsep def run(self): env=self.env - if not env['PROMPT_LATEX']: + if not env.PROMPT_LATEX: env.append_value('LATEXFLAGS','-interaction=batchmode') env.append_value('PDFLATEXFLAGS','-interaction=batchmode') env.append_value('XELATEXFLAGS','-interaction=batchmode') - self.cwd=self.inputs[0].parent.get_bld().abspath() - Logs.info('first pass on %s'%self.__class__.__name__) + self.cwd=self.inputs[0].parent.get_bld() + self.info('first pass on %s',self.__class__.__name__) cur_hash=self.hash_aux_nodes() self.call_latex() self.hash_aux_nodes() @@ -211,7 +215,7 @@ class tex(Task.Task): Logs.error('No aux.h to process') if cur_hash and cur_hash==prev_hash: break - Logs.info('calling %s'%self.__class__.__name__) + self.info('calling %s',self.__class__.__name__) self.call_latex() def hash_aux_nodes(self): try: @@ -252,7 +256,13 @@ def apply_tex(self): if not getattr(self,'type',None)in('latex','pdflatex','xelatex'): self.type='pdflatex' outs=Utils.to_list(getattr(self,'outs',[])) - self.env['PROMPT_LATEX']=getattr(self,'prompt',1) + try: + 
self.generator.bld.conf + except AttributeError: + default_prompt=False + else: + default_prompt=True + self.env.PROMPT_LATEX=getattr(self,'prompt',default_prompt) deps_lst=[] if getattr(self,'deps',None): deps=self.to_list(self.deps) @@ -293,9 +303,9 @@ def apply_tex(self): if p: task.texinputs_nodes.append(p) else: - Logs.error('Invalid TEXINPUTS folder %s'%x) + Logs.error('Invalid TEXINPUTS folder %s',x) else: - Logs.error('Cannot resolve relative paths in TEXINPUTS %s'%x) + Logs.error('Cannot resolve relative paths in TEXINPUTS %s',x) if self.type=='latex': if'ps'in outs: tsk=self.create_task('dvips',task.outputs,node.change_ext('.ps')) @@ -314,4 +324,4 @@ def configure(self): self.find_program(p,var=p.upper()) except self.errors.ConfigurationError: pass - v['DVIPSFLAGS']='-Ppdf' + v.DVIPSFLAGS='-Ppdf' diff --git a/waflib/Tools/vala.py b/waflib/Tools/vala.py index 0521cbb..2f5a30d 100644 --- a/waflib/Tools/vala.py +++ b/waflib/Tools/vala.py @@ -3,7 +3,7 @@ # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file import re -from waflib import Context,Task,Utils,Logs,Options,Errors,Node +from waflib import Build,Context,Errors,Logs,Node,Options,Task,Utils from waflib.TaskGen import extension,taskgen_method from waflib.Configure import conf class valac(Task.Task): @@ -19,14 +19,16 @@ class valac(Task.Task): if self.generator.dump_deps_node: self.generator.dump_deps_node.write('\n'.join(self.generator.packages)) return ret -valac=Task.update_outputs(valac) @taskgen_method def init_vala_task(self): self.profile=getattr(self,'profile','gobject') + self.packages=packages=Utils.to_list(getattr(self,'packages',[])) + self.use=Utils.to_list(getattr(self,'use',[])) + if packages and not self.use: + self.use=packages[:] if self.profile=='gobject': - self.uselib=Utils.to_list(getattr(self,'uselib',[])) - if not'GOBJECT'in self.uselib: - self.uselib.append('GOBJECT') + if not'GOBJECT'in self.use: + self.use.append('GOBJECT') def addflags(flags): 
self.env.append_value('VALAFLAGS',flags) if self.profile: @@ -46,10 +48,10 @@ def init_vala_task(self): addflags('--directory=%s'%valatask.vala_dir_node.abspath()) if hasattr(self,'thread'): if self.profile=='gobject': - if not'GTHREAD'in self.uselib: - self.uselib.append('GTHREAD') + if not'GTHREAD'in self.use: + self.use.append('GTHREAD') else: - Logs.warn("Profile %s means no threading support"%self.profile) + Logs.warn('Profile %s means no threading support',self.profile) self.thread=False if self.thread: addflags('--thread') @@ -80,13 +82,11 @@ def init_vala_task(self): api_version=version[0]+".0" return api_version self.includes=Utils.to_list(getattr(self,'includes',[])) - self.uselib=self.to_list(getattr(self,'uselib',[])) valatask.install_path=getattr(self,'install_path','') valatask.vapi_path=getattr(self,'vapi_path','${DATAROOTDIR}/vala/vapi') - valatask.pkg_name=getattr(self,'pkg_name',self.env['PACKAGE']) + valatask.pkg_name=getattr(self,'pkg_name',self.env.PACKAGE) valatask.header_path=getattr(self,'header_path','${INCLUDEDIR}/%s-%s'%(valatask.pkg_name,_get_api_version())) valatask.install_binding=getattr(self,'install_binding',True) - self.packages=packages=Utils.to_list(getattr(self,'packages',[])) self.vapi_dirs=vapi_dirs=Utils.to_list(getattr(self,'vapi_dirs',[])) if hasattr(self,'use'): local_packages=Utils.to_list(self.use)[:] @@ -100,8 +100,10 @@ def init_vala_task(self): package_obj=self.bld.get_tgen_by_name(package) except Errors.WafError: continue + package_obj.post() package_name=package_obj.target - for task in package_obj.tasks: + task=getattr(package_obj,'valatask',None) + if task: for output in task.outputs: if output.name==package_name+".vapi": valatask.set_run_after(task) @@ -122,31 +124,23 @@ def init_vala_task(self): else: v_node=self.path.find_dir(vapi_dir) if not v_node: - Logs.warn('Unable to locate Vala API directory: %r'%vapi_dir) + Logs.warn('Unable to locate Vala API directory: %r',vapi_dir) else: 
addflags('--vapidir=%s'%v_node.abspath()) self.dump_deps_node=None if self.is_lib and self.packages: self.dump_deps_node=valatask.vala_dir_node.find_or_declare('%s.deps'%self.target) valatask.outputs.append(self.dump_deps_node) - self.includes.append(self.bld.srcnode.abspath()) - self.includes.append(self.bld.bldnode.abspath()) if self.is_lib and valatask.install_binding: headers_list=[o for o in valatask.outputs if o.suffix()==".h"] - try: - self.install_vheader.source=headers_list - except AttributeError: - self.install_vheader=self.bld.install_files(valatask.header_path,headers_list,self.env) + if headers_list: + self.install_vheader=self.add_install_files(install_to=valatask.header_path,install_from=headers_list) vapi_list=[o for o in valatask.outputs if(o.suffix()in(".vapi",".deps"))] - try: - self.install_vapi.source=vapi_list - except AttributeError: - self.install_vapi=self.bld.install_files(valatask.vapi_path,vapi_list,self.env) + if vapi_list: + self.install_vapi=self.add_install_files(install_to=valatask.vapi_path,install_from=vapi_list) gir_list=[o for o in valatask.outputs if o.suffix()=='.gir'] - try: - self.install_gir.source=gir_list - except AttributeError: - self.install_gir=self.bld.install_files(getattr(self,'gir_path','${DATAROOTDIR}/gir-1.0'),gir_list,self.env) + if gir_list: + self.install_gir=self.add_install_files(install_to=getattr(self,'gir_path','${DATAROOTDIR}/gir-1.0'),install_from=gir_list) if hasattr(self,'vala_resources'): nodes=self.to_nodes(self.vala_resources) valatask.vala_exclude=getattr(valatask,'vala_exclude',[])+nodes @@ -165,23 +159,35 @@ def vala_file(self,node): c_node=valatask.vala_dir_node.find_or_declare(name) valatask.outputs.append(c_node) self.source.append(c_node) +@extension('.vapi') +def vapi_file(self,node): + try: + valatask=self.valatask + except AttributeError: + valatask=self.valatask=self.create_task('valac') + self.init_vala_task() + valatask.inputs.append(node) @conf def 
find_valac(self,valac_name,min_version): valac=self.find_program(valac_name,var='VALAC') try: output=self.cmd_and_log(valac+['--version']) - except Exception: + except Errors.WafError: valac_version=None else: - ver=re.search(r'\d+.\d+.\d+',output).group(0).split('.') + ver=re.search(r'\d+.\d+.\d+',output).group().split('.') valac_version=tuple([int(x)for x in ver]) self.msg('Checking for %s version >= %r'%(valac_name,min_version),valac_version,valac_version and valac_version>=min_version) if valac and valac_version<min_version: self.fatal("%s version %r is too old, need >= %r"%(valac_name,valac_version,min_version)) - self.env['VALAC_VERSION']=valac_version + self.env.VALAC_VERSION=valac_version return valac @conf def check_vala(self,min_version=(0,8,0),branch=None): + if self.env.VALA_MINVER: + min_version=self.env.VALA_MINVER + if self.env.VALA_MINVER_BRANCH: + branch=self.env.VALA_MINVER_BRANCH if not branch: branch=min_version[:2] try: @@ -190,12 +196,12 @@ def check_vala(self,min_version=(0,8,0),branch=None): find_valac(self,'valac',min_version) @conf def check_vala_deps(self): - if not self.env['HAVE_GOBJECT']: + if not self.env.HAVE_GOBJECT: pkg_args={'package':'gobject-2.0','uselib_store':'GOBJECT','args':'--cflags --libs'} if getattr(Options.options,'vala_target_glib',None): pkg_args['atleast_version']=Options.options.vala_target_glib self.check_cfg(**pkg_args) - if not self.env['HAVE_GTHREAD']: + if not self.env.HAVE_GTHREAD: pkg_args={'package':'gthread-2.0','uselib_store':'GTHREAD','args':'--cflags --libs'} if getattr(Options.options,'vala_target_glib',None): pkg_args['atleast_version']=Options.options.vala_target_glib diff --git a/waflib/Tools/waf_unit_test.py b/waflib/Tools/waf_unit_test.py index 5727879..fde877d 100644 --- a/waflib/Tools/waf_unit_test.py +++ b/waflib/Tools/waf_unit_test.py @@ -2,23 +2,99 @@ # encoding: utf-8 # WARNING! Do not edit! 
https://waf.io/book/index.html#_obtaining_the_waf_file -import os +import os,shlex,sys from waflib.TaskGen import feature,after_method,taskgen_method from waflib import Utils,Task,Logs,Options +from waflib.Tools import ccroot testlock=Utils.threading.Lock() +SCRIPT_TEMPLATE="""#! %(python)s +import subprocess, sys +cmd = %(cmd)r +# if you want to debug with gdb: +#cmd = ['gdb', '-args'] + cmd +env = %(env)r +status = subprocess.call(cmd, env=env, cwd=%(cwd)r, shell=isinstance(cmd, str)) +sys.exit(status) +""" +@taskgen_method +def handle_ut_cwd(self,key): + cwd=getattr(self,key,None) + if cwd: + if isinstance(cwd,str): + if os.path.isabs(cwd): + self.ut_cwd=self.bld.root.make_node(cwd) + else: + self.ut_cwd=self.path.make_node(cwd) +@feature('test_scripts') +def make_interpreted_test(self): + for x in['test_scripts_source','test_scripts_template']: + if not hasattr(self,x): + Logs.warn('a test_scripts taskgen i missing %s'%x) + return + self.ut_run,lst=Task.compile_fun(self.test_scripts_template,shell=getattr(self,'test_scripts_shell',False)) + script_nodes=self.to_nodes(self.test_scripts_source) + for script_node in script_nodes: + tsk=self.create_task('utest',[script_node]) + tsk.vars=lst+tsk.vars + tsk.env['SCRIPT']=script_node.path_from(tsk.get_cwd()) + self.handle_ut_cwd('test_scripts_cwd') + env=getattr(self,'test_scripts_env',None) + if env: + self.ut_env=env + else: + self.ut_env=dict(os.environ) + paths=getattr(self,'test_scripts_paths',{}) + for(k,v)in paths.items(): + p=self.ut_env.get(k,'').split(os.pathsep) + if isinstance(v,str): + v=v.split(os.pathsep) + self.ut_env[k]=os.pathsep.join(p+v) @feature('test') -@after_method('apply_link') +@after_method('apply_link','process_use') def make_test(self): - if getattr(self,'link_task',None): - self.create_task('utest',self.link_task.outputs) + if not getattr(self,'link_task',None): + return + tsk=self.create_task('utest',self.link_task.outputs) + if getattr(self,'ut_str',None): + 
self.ut_run,lst=Task.compile_fun(self.ut_str,shell=getattr(self,'ut_shell',False)) + tsk.vars=lst+tsk.vars + self.handle_ut_cwd('ut_cwd') + if not hasattr(self,'ut_paths'): + paths=[] + for x in self.tmp_use_sorted: + try: + y=self.bld.get_tgen_by_name(x).link_task + except AttributeError: + pass + else: + if not isinstance(y,ccroot.stlink_task): + paths.append(y.outputs[0].parent.abspath()) + self.ut_paths=os.pathsep.join(paths)+os.pathsep + if not hasattr(self,'ut_env'): + self.ut_env=dct=dict(os.environ) + def add_path(var): + dct[var]=self.ut_paths+dct.get(var,'') + if Utils.is_win32: + add_path('PATH') + elif Utils.unversioned_sys_platform()=='darwin': + add_path('DYLD_LIBRARY_PATH') + add_path('LD_LIBRARY_PATH') + else: + add_path('LD_LIBRARY_PATH') + if not hasattr(self,'ut_cmd'): + self.ut_cmd=getattr(Options.options,'testcmd',False) @taskgen_method def add_test_results(self,tup): Logs.debug("ut: %r",tup) - self.utest_result=tup + try: + self.utest_results.append(tup) + except AttributeError: + self.utest_results=[tup] try: self.bld.utest_results.append(tup) except AttributeError: self.bld.utest_results=[tup] +@Task.deep_inputs class utest(Task.Task): color='PINK' after=['vnum','inst'] @@ -31,64 +107,53 @@ class utest(Task.Task): if getattr(Options.options,'all_tests',False): return Task.RUN_ME return ret - def add_path(self,dct,path,var): - dct[var]=os.pathsep.join(Utils.to_list(path)+[os.environ.get(var,'')]) def get_test_env(self): - try: - fu=getattr(self.generator.bld,'all_test_paths') - except AttributeError: - fu=os.environ.copy() - lst=[] - for g in self.generator.bld.groups: - for tg in g: - if getattr(tg,'link_task',None): - s=tg.link_task.outputs[0].parent.abspath() - if s not in lst: - lst.append(s) - if Utils.is_win32: - self.add_path(fu,lst,'PATH') - elif Utils.unversioned_sys_platform()=='darwin': - self.add_path(fu,lst,'DYLD_LIBRARY_PATH') - self.add_path(fu,lst,'LD_LIBRARY_PATH') - else: - self.add_path(fu,lst,'LD_LIBRARY_PATH') - 
self.generator.bld.all_test_paths=fu - return fu + return self.generator.ut_env + def post_run(self): + super(utest,self).post_run() + if getattr(Options.options,'clear_failed_tests',False)and self.waf_unit_test_results[1]: + self.generator.bld.task_sigs[self.uid()]=None def run(self): - filename=self.inputs[0].abspath() - self.ut_exec=getattr(self.generator,'ut_exec',[filename]) - if getattr(self.generator,'ut_fun',None): - self.generator.ut_fun(self) - cwd=getattr(self.generator,'ut_cwd','')or self.inputs[0].parent.abspath() - testcmd=getattr(self.generator,'ut_cmd',False)or getattr(Options.options,'testcmd',False) - if testcmd: - self.ut_exec=(testcmd%" ".join(self.ut_exec)).split(' ') - proc=Utils.subprocess.Popen(self.ut_exec,cwd=cwd,env=self.get_test_env(),stderr=Utils.subprocess.PIPE,stdout=Utils.subprocess.PIPE) + if hasattr(self.generator,'ut_run'): + return self.generator.ut_run(self) + self.ut_exec=getattr(self.generator,'ut_exec',[self.inputs[0].abspath()]) + ut_cmd=getattr(self.generator,'ut_cmd',False) + if ut_cmd: + self.ut_exec=shlex.split(ut_cmd%' '.join(self.ut_exec)) + return self.exec_command(self.ut_exec) + def exec_command(self,cmd,**kw): + self.generator.bld.log_command(cmd,kw) + if getattr(Options.options,'dump_test_scripts',False): + script_code=SCRIPT_TEMPLATE%{'python':sys.executable,'env':self.get_test_env(),'cwd':self.get_cwd().abspath(),'cmd':cmd} + script_file=self.inputs[0].abspath()+'_run.py' + Utils.writef(script_file,script_code) + os.chmod(script_file,Utils.O755) + if Logs.verbose>1: + Logs.info('Test debug file written as %r'%script_file) + proc=Utils.subprocess.Popen(cmd,cwd=self.get_cwd().abspath(),env=self.get_test_env(),stderr=Utils.subprocess.PIPE,stdout=Utils.subprocess.PIPE,shell=isinstance(cmd,str)) (stdout,stderr)=proc.communicate() - self.waf_unit_test_results=tup=(filename,proc.returncode,stdout,stderr) + self.waf_unit_test_results=tup=(self.inputs[0].abspath(),proc.returncode,stdout,stderr) testlock.acquire() try: 
			return self.generator.add_test_results(tup)
		finally:
			testlock.release()
-	def post_run(self):
-		super(utest,self).post_run()
-		if getattr(Options.options,'clear_failed_tests',False)and self.waf_unit_test_results[1]:
-			self.generator.bld.task_sigs[self.uid()]=None
+	def get_cwd(self):
+		return getattr(self.generator,'ut_cwd',self.inputs[0].parent)
def summary(bld):
	lst=getattr(bld,'utest_results',[])
	if lst:
		Logs.pprint('CYAN','execution summary')
		total=len(lst)
		tfail=len([x for x in lst if x[1]])
-		Logs.pprint('CYAN',' tests that pass %d/%d'%(total-tfail,total))
+		Logs.pprint('GREEN',' tests that pass %d/%d'%(total-tfail,total))
		for(f,code,out,err)in lst:
			if not code:
-				Logs.pprint('CYAN',' %s'%f)
-		Logs.pprint('CYAN',' tests that fail %d/%d'%(tfail,total))
+				Logs.pprint('GREEN',' %s'%f)
+		Logs.pprint('GREEN'if tfail==0 else'RED',' tests that fail %d/%d'%(tfail,total))
		for(f,code,out,err)in lst:
			if code:
-				Logs.pprint('CYAN',' %s'%f)
+				Logs.pprint('RED',' %s'%f)
def set_exit_code(bld):
	lst=getattr(bld,'utest_results',[])
	for(f,code,out,err)in lst:
@@ -103,4 +168,5 @@ def options(opt):
	opt.add_option('--notests',action='store_true',default=False,help='Exec no unit tests',dest='no_tests')
	opt.add_option('--alltests',action='store_true',default=False,help='Exec all unit tests',dest='all_tests')
	opt.add_option('--clear-failed',action='store_true',default=False,help='Force failed unit tests to run again next time',dest='clear_failed_tests')
-	opt.add_option('--testcmd',action='store',default=False,help='Run the unit tests using the test-cmd string'' example "--test-cmd="valgrind --error-exitcode=1'' %s" to run under valgrind',dest='testcmd')
+	opt.add_option('--testcmd',action='store',default=False,dest='testcmd',help='Run the unit tests using the test-cmd string example "--testcmd="valgrind --error-exitcode=1 %s" to run under valgrind')
+	opt.add_option('--dump-test-scripts',action='store_true',default=False,help='Create python scripts to help debug tests',dest='dump_test_scripts')
diff --git a/waflib/Tools/winres.py b/waflib/Tools/winres.py
index a055887..ecb362b 100644
--- a/waflib/Tools/winres.py
+++ b/waflib/Tools/winres.py
@@ -2,14 +2,14 @@
# encoding: utf-8
# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
-import re,traceback
-from waflib import Task,Logs,Utils
+import re
+from waflib import Task
from waflib.TaskGen import extension
from waflib.Tools import c_preproc
@extension('.rc')
def rc_file(self,node):
	obj_ext='.rc.o'
-	if self.env['WINRC_TGT_F']=='/fo':
+	if self.env.WINRC_TGT_F=='/fo':
		obj_ext='.res'
	rctask=self.create_task('winrc',node,node.change_ext(obj_ext))
	try:
@@ -18,10 +18,11 @@ def rc_file(self,node):
		self.compiled_tasks=[rctask]
re_lines=re.compile('(?:^[ \t]*(#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*?)\s*$)|''(?:^\w+[ \t]*(ICON|BITMAP|CURSOR|HTML|FONT|MESSAGETABLE|TYPELIB|REGISTRY|D3DFX)[ \t]*(.*?)\s*$)',re.IGNORECASE|re.MULTILINE)
class rc_parser(c_preproc.c_parser):
-	def filter_comments(self,filepath):
-		code=Utils.readf(filepath)
+	def filter_comments(self,node):
+		code=node.read()
		if c_preproc.use_trigraphs:
-			for(a,b)in c_preproc.trig_def:code=code.split(a).join(b)
+			for(a,b)in c_preproc.trig_def:
+				code=code.split(a).join(b)
		code=c_preproc.re_nl.sub('',code)
		code=c_preproc.re_cpp.sub(c_preproc.repl,code)
		ret=[]
@@ -31,55 +32,21 @@ class rc_parser(c_preproc.c_parser):
			else:
				ret.append(('include',m.group(5)))
		return ret
-	def addlines(self,node):
-		self.currentnode_stack.append(node.parent)
-		filepath=node.abspath()
-		self.count_files+=1
-		if self.count_files>c_preproc.recursion_limit:
-			raise c_preproc.PreprocError("recursion limit exceeded")
-		pc=self.parse_cache
-		Logs.debug('preproc: reading file %r',filepath)
-		try:
-			lns=pc[filepath]
-		except KeyError:
-			pass
-		else:
-			self.lines.extend(lns)
-			return
-		try:
-			lines=self.filter_comments(filepath)
-			lines.append((c_preproc.POPFILE,''))
-			lines.reverse()
-			pc[filepath]=lines
-			self.lines.extend(lines)
-		except IOError:
-			raise c_preproc.PreprocError("could not read the file %s"%filepath)
-		except Exception:
-			if Logs.verbose>0:
-				Logs.error("parsing %s failed"%filepath)
-				traceback.print_exc()
class winrc(Task.Task):
	run_str='${WINRC} ${WINRCFLAGS} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${WINRC_TGT_F} ${TGT} ${WINRC_SRC_F} ${SRC}'
	color='BLUE'
	def scan(self):
		tmp=rc_parser(self.generator.includes_nodes)
		tmp.start(self.inputs[0],self.env)
-		nodes=tmp.nodes
-		names=tmp.names
-		if Logs.verbose:
-			Logs.debug('deps: deps for %s: %r; unresolved %r'%(str(self),nodes,names))
-		return(nodes,names)
+		return(tmp.nodes,tmp.names)
def configure(conf):
	v=conf.env
-	v['WINRC_TGT_F']='-o'
-	v['WINRC_SRC_F']='-i'
-	if not conf.env.WINRC:
+	if not v.WINRC:
		if v.CC_NAME=='msvc':
-			conf.find_program('RC',var='WINRC',path_list=v['PATH'])
-			v['WINRC_TGT_F']='/fo'
-			v['WINRC_SRC_F']=''
+			conf.find_program('RC',var='WINRC',path_list=v.PATH)
+			v.WINRC_TGT_F='/fo'
+			v.WINRC_SRC_F=''
		else:
-			conf.find_program('windres',var='WINRC',path_list=v['PATH'])
-	if not conf.env.WINRC:
-		conf.fatal('winrc was not found!')
-	v['WINRCFLAGS']=[]
+			conf.find_program('windres',var='WINRC',path_list=v.PATH)
+			v.WINRC_TGT_F='-o'
+			v.WINRC_SRC_F='-i'
diff --git a/waflib/Tools/xlc.py b/waflib/Tools/xlc.py
index c56443b..a86010d 100644
--- a/waflib/Tools/xlc.py
+++ b/waflib/Tools/xlc.py
@@ -12,28 +12,29 @@ def find_xlc(conf):
@conf
def xlc_common_flags(conf):
	v=conf.env
-	v['CC_SRC_F']=[]
-	v['CC_TGT_F']=['-c','-o']
-	if not v['LINK_CC']:v['LINK_CC']=v['CC']
-	v['CCLNK_SRC_F']=[]
-	v['CCLNK_TGT_F']=['-o']
-	v['CPPPATH_ST']='-I%s'
-	v['DEFINES_ST']='-D%s'
-	v['LIB_ST']='-l%s'
-	v['LIBPATH_ST']='-L%s'
-	v['STLIB_ST']='-l%s'
-	v['STLIBPATH_ST']='-L%s'
-	v['RPATH_ST']='-Wl,-rpath,%s'
-	v['SONAME_ST']=[]
-	v['SHLIB_MARKER']=[]
-	v['STLIB_MARKER']=[]
-	v['LINKFLAGS_cprogram']=['-Wl,-brtl']
-	v['cprogram_PATTERN']='%s'
-	v['CFLAGS_cshlib']=['-fPIC']
-	v['LINKFLAGS_cshlib']=['-G','-Wl,-brtl,-bexpfull']
-	v['cshlib_PATTERN']='lib%s.so'
-	v['LINKFLAGS_cstlib']=[]
-	v['cstlib_PATTERN']='lib%s.a'
+	v.CC_SRC_F=[]
+	v.CC_TGT_F=['-c','-o']
+	if not v.LINK_CC:
+		v.LINK_CC=v.CC
+	v.CCLNK_SRC_F=[]
+	v.CCLNK_TGT_F=['-o']
+	v.CPPPATH_ST='-I%s'
+	v.DEFINES_ST='-D%s'
+	v.LIB_ST='-l%s'
+	v.LIBPATH_ST='-L%s'
+	v.STLIB_ST='-l%s'
+	v.STLIBPATH_ST='-L%s'
+	v.RPATH_ST='-Wl,-rpath,%s'
+	v.SONAME_ST=[]
+	v.SHLIB_MARKER=[]
+	v.STLIB_MARKER=[]
+	v.LINKFLAGS_cprogram=['-Wl,-brtl']
+	v.cprogram_PATTERN='%s'
+	v.CFLAGS_cshlib=['-fPIC']
+	v.LINKFLAGS_cshlib=['-G','-Wl,-brtl,-bexpfull']
+	v.cshlib_PATTERN='lib%s.so'
+	v.LINKFLAGS_cstlib=[]
+	v.cstlib_PATTERN='lib%s.a'
def configure(conf):
	conf.find_xlc()
	conf.find_ar()
diff --git a/waflib/Tools/xlcxx.py b/waflib/Tools/xlcxx.py
index f348bbf..8a081b6 100644
--- a/waflib/Tools/xlcxx.py
+++ b/waflib/Tools/xlcxx.py
@@ -12,28 +12,29 @@ def find_xlcxx(conf):
@conf
def xlcxx_common_flags(conf):
	v=conf.env
-	v['CXX_SRC_F']=[]
-	v['CXX_TGT_F']=['-c','-o']
-	if not v['LINK_CXX']:v['LINK_CXX']=v['CXX']
-	v['CXXLNK_SRC_F']=[]
-	v['CXXLNK_TGT_F']=['-o']
-	v['CPPPATH_ST']='-I%s'
-	v['DEFINES_ST']='-D%s'
-	v['LIB_ST']='-l%s'
-	v['LIBPATH_ST']='-L%s'
-	v['STLIB_ST']='-l%s'
-	v['STLIBPATH_ST']='-L%s'
-	v['RPATH_ST']='-Wl,-rpath,%s'
-	v['SONAME_ST']=[]
-	v['SHLIB_MARKER']=[]
-	v['STLIB_MARKER']=[]
-	v['LINKFLAGS_cxxprogram']=['-Wl,-brtl']
-	v['cxxprogram_PATTERN']='%s'
-	v['CXXFLAGS_cxxshlib']=['-fPIC']
-	v['LINKFLAGS_cxxshlib']=['-G','-Wl,-brtl,-bexpfull']
-	v['cxxshlib_PATTERN']='lib%s.so'
-	v['LINKFLAGS_cxxstlib']=[]
-	v['cxxstlib_PATTERN']='lib%s.a'
+	v.CXX_SRC_F=[]
+	v.CXX_TGT_F=['-c','-o']
+	if not v.LINK_CXX:
+		v.LINK_CXX=v.CXX
+	v.CXXLNK_SRC_F=[]
+	v.CXXLNK_TGT_F=['-o']
+	v.CPPPATH_ST='-I%s'
+	v.DEFINES_ST='-D%s'
+	v.LIB_ST='-l%s'
+	v.LIBPATH_ST='-L%s'
+	v.STLIB_ST='-l%s'
+	v.STLIBPATH_ST='-L%s'
+	v.RPATH_ST='-Wl,-rpath,%s'
+	v.SONAME_ST=[]
+	v.SHLIB_MARKER=[]
+	v.STLIB_MARKER=[]
+	v.LINKFLAGS_cxxprogram=['-Wl,-brtl']
+	v.cxxprogram_PATTERN='%s'
+	v.CXXFLAGS_cxxshlib=['-fPIC']
+	v.LINKFLAGS_cxxshlib=['-G','-Wl,-brtl,-bexpfull']
+	v.cxxshlib_PATTERN='lib%s.so'
+	v.LINKFLAGS_cxxstlib=[]
+	v.cxxstlib_PATTERN='lib%s.a'
def configure(conf):
	conf.find_xlcxx()
	conf.find_ar()
diff --git a/waflib/Utils.py b/waflib/Utils.py
index 6c2a8e0..273ebb0 100644
--- a/waflib/Utils.py
+++ b/waflib/Utils.py
@@ -2,8 +2,24 @@
# encoding: utf-8
# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
-import os,sys,errno,traceback,inspect,re,shutil,datetime,gc,platform
-import subprocess
+from __future__ import with_statement
+import atexit,os,sys,errno,inspect,re,datetime,platform,base64,signal,functools,time
+try:
+	import cPickle
+except ImportError:
+	import pickle as cPickle
+if os.name=='posix'and sys.version_info[0]<3:
+	try:
+		import subprocess32 as subprocess
+	except ImportError:
+		import subprocess
+else:
+	import subprocess
+try:
+	TimeoutExpired=subprocess.TimeoutExpired
+except AttributeError:
+	class TimeoutExpired(Exception):
+		pass
from collections import deque,defaultdict
try:
	import _winreg as winreg
@@ -14,16 +30,17 @@ except ImportError:
	winreg=None
from waflib import Errors
try:
-	from collections import UserDict
-except ImportError:
-	from UserDict import UserDict
-try:
	from hashlib import md5
except ImportError:
	try:
-		from md5 import md5
+		from hashlib import sha1 as md5
	except ImportError:
		pass
+else:
+	try:
+		md5().digest()
+	except ValueError:
+		from hashlib import sha1 as md5
try:
	import threading
except ImportError:
@@ -37,89 +54,119 @@ except ImportError:
		def release(self):
			pass
	threading.Lock=threading.Thread=Lock
-else:
-	run_old=threading.Thread.run
-	def run(*args,**kwargs):
-		try:
-			run_old(*args,**kwargs)
-		except(KeyboardInterrupt,SystemExit):
-			raise
-		except Exception:
-			sys.excepthook(*sys.exc_info())
-	threading.Thread.run=run
-SIG_NIL='iluvcuteoverload'
+SIG_NIL='SIG_NIL_SIG_NIL_'.encode()
O644=420
O755=493
rot_chr=['\\','|','/','-']
rot_idx=0
-try:
-	from collections import OrderedDict as ordered_iter_dict
-except ImportError:
-	class ordered_iter_dict(dict):
-		def __init__(self,*k,**kw):
-			self.lst=[]
-			dict.__init__(self,*k,**kw)
-		def clear(self):
-			dict.clear(self)
-			self.lst=[]
-		def __setitem__(self,key,value):
-			dict.__setitem__(self,key,value)
-			try:
-				self.lst.remove(key)
-			except ValueError:
-				pass
-			self.lst.append(key)
-		def __delitem__(self,key):
-			dict.__delitem__(self,key)
-			try:
-				self.lst.remove(key)
-			except ValueError:
-				pass
-		def __iter__(self):
-			for x in self.lst:
-				yield x
-		def keys(self):
-			return self.lst
-is_win32=os.sep=='\\'or sys.platform=='win32'
-def readf(fname,m='r',encoding='ISO8859-1'):
+class ordered_iter_dict(dict):
+	def __init__(self,*k,**kw):
+		self.lst=deque()
+		dict.__init__(self,*k,**kw)
+	def clear(self):
+		dict.clear(self)
+		self.lst=deque()
+	def __setitem__(self,key,value):
+		if key in dict.keys(self):
+			self.lst.remove(key)
+		dict.__setitem__(self,key,value)
+		self.lst.append(key)
+	def __delitem__(self,key):
+		dict.__delitem__(self,key)
+		try:
+			self.lst.remove(key)
+		except ValueError:
+			pass
+	def __iter__(self):
+		return reversed(self.lst)
+	def keys(self):
+		return reversed(self.lst)
+class lru_node(object):
+	__slots__=('next','prev','key','val')
+	def __init__(self):
+		self.next=self
+		self.prev=self
+		self.key=None
+		self.val=None
+class lru_cache(object):
+	__slots__=('maxlen','table','head')
+	def __init__(self,maxlen=100):
+		self.maxlen=maxlen
+		self.table={}
+		self.head=lru_node()
+		self.head.next=self.head
+		self.head.prev=self.head
+	def __getitem__(self,key):
+		node=self.table[key]
+		if node is self.head:
+			return node.val
+		node.prev.next=node.next
+		node.next.prev=node.prev
+		node.next=self.head.next
+		node.prev=self.head
+		self.head=node.next.prev=node.prev.next=node
+		return node.val
+	def __setitem__(self,key,val):
+		if key in self.table:
+			node=self.table[key]
+			node.val=val
+			self.__getitem__(key)
+		else:
+			if len(self.table)<self.maxlen:
+				node=lru_node()
+				node.prev=self.head
+				node.next=self.head.next
+				node.prev.next=node.next.prev=node
+			else:
+				node=self.head=self.head.next
+				try:
+					del self.table[node.key]
+				except KeyError:
+					pass
+			node.key=key
+			node.val=val
+			self.table[key]=node
+class lazy_generator(object):
+	def __init__(self,fun,params):
+		self.fun=fun
+		self.params=params
+	def __iter__(self):
+		return self
+	def __next__(self):
+		try:
+			it=self.it
+		except AttributeError:
+			it=self.it=self.fun(*self.params)
+		return next(it)
+	next=__next__
+is_win32=os.sep=='\\'or sys.platform=='win32'or os.name=='nt'
+def readf(fname,m='r',encoding='latin-1'):
	if sys.hexversion>0x3000000 and not'b'in m:
		m+='b'
-		f=open(fname,m)
-		try:
+		with open(fname,m)as f:
			txt=f.read()
-		finally:
-			f.close()
		if encoding:
			txt=txt.decode(encoding)
		else:
			txt=txt.decode()
	else:
-		f=open(fname,m)
-		try:
+		with open(fname,m)as f:
			txt=f.read()
-		finally:
-			f.close()
	return txt
-def writef(fname,data,m='w',encoding='ISO8859-1'):
+def writef(fname,data,m='w',encoding='latin-1'):
	if sys.hexversion>0x3000000 and not'b'in m:
		data=data.encode(encoding)
		m+='b'
-	f=open(fname,m)
-	try:
+	with open(fname,m)as f:
		f.write(data)
-	finally:
-		f.close()
def h_file(fname):
-	f=open(fname,'rb')
	m=md5()
-	try:
+	with open(fname,'rb')as f:
		while fname:
			fname=f.read(200000)
			m.update(fname)
-	finally:
-		f.close()
	return m.digest()
-def readf_win32(f,m='r',encoding='ISO8859-1'):
+def readf_win32(f,m='r',encoding='latin-1'):
	flags=os.O_NOINHERIT|os.O_RDONLY
	if'b'in m:
		flags|=os.O_BINARY
@@ -131,23 +178,17 @@ def readf_win32(f,m='r',encoding='ISO8859-1'):
		raise IOError('Cannot read from %r'%f)
	if sys.hexversion>0x3000000 and not'b'in m:
		m+='b'
-		f=os.fdopen(fd,m)
-		try:
+		with os.fdopen(fd,m)as f:
			txt=f.read()
-		finally:
-			f.close()
		if encoding:
			txt=txt.decode(encoding)
		else:
			txt=txt.decode()
	else:
-		f=os.fdopen(fd,m)
-		try:
+		with os.fdopen(fd,m)as f:
			txt=f.read()
-		finally:
-			f.close()
	return txt
-def writef_win32(f,data,m='w',encoding='ISO8859-1'):
+def writef_win32(f,data,m='w',encoding='latin-1'):
	if sys.hexversion>0x3000000 and not'b'in m:
		data=data.encode(encoding)
		m+='b'
@@ -159,25 +200,19 @@ def writef_win32(f,data,m='w',encoding='latin-1'):
	try:
		fd=os.open(f,flags)
	except OSError:
-		raise IOError('Cannot write to %r'%f)
-	f=os.fdopen(fd,m)
-	try:
+		raise OSError('Cannot write to %r'%f)
+	with os.fdopen(fd,m)as f:
		f.write(data)
-	finally:
-		f.close()
def h_file_win32(fname):
	try:
		fd=os.open(fname,os.O_BINARY|os.O_RDONLY|os.O_NOINHERIT)
	except OSError:
-		raise IOError('Cannot read from %r'%fname)
-	f=os.fdopen(fd,'rb')
+		raise OSError('Cannot read from %r'%fname)
	m=md5()
-	try:
+	with os.fdopen(fd,'rb')as f:
		while fname:
			fname=f.read(200000)
			m.update(fname)
-	finally:
-		f.close()
	return m.digest()
readf_unix=readf
writef_unix=writef
@@ -209,7 +244,7 @@ def listdir_win32(s):
	try:
		import ctypes
	except ImportError:
-		return[x+':\\'for x in list('ABCDEFGHIJKLMNOPQRSTUVWXYZ')]
+		return[x+':\\'for x in'ABCDEFGHIJKLMNOPQRSTUVWXYZ']
	else:
		dlen=4
		maxdrives=26
@@ -236,15 +271,25 @@ def num2ver(ver):
			ret+=256**(3-i)*int(ver[i])
		return ret
	return ver
-def ex_stack():
-	exc_type,exc_value,tb=sys.exc_info()
-	exc_lines=traceback.format_exception(exc_type,exc_value,tb)
-	return''.join(exc_lines)
-def to_list(sth):
-	if isinstance(sth,str):
-		return sth.split()
+def to_list(val):
+	if isinstance(val,str):
+		return val.split()
+	else:
+		return val
+def console_encoding():
+	try:
+		import ctypes
+	except ImportError:
+		pass
	else:
-		return sth
+		try:
+			codepage=ctypes.windll.kernel32.GetConsoleCP()
+		except AttributeError:
+			pass
+		else:
+			if codepage:
+				return'cp%d'%codepage
+	return sys.stdout.encoding or('cp1252'if is_win32 else'latin-1')
def split_path_unix(path):
	return path.split('/')
def split_path_cygwin(path):
@@ -253,43 +298,45 @@ def split_path_cygwin(path):
		ret[0]='/'+ret[0]
		return ret
	return path.split('/')
-re_sp=re.compile('[/\\\\]')
+re_sp=re.compile('[/\\\\]+')
def split_path_win32(path):
	if path.startswith('\\\\'):
-		ret=re.split(re_sp,path)[2:]
-		ret[0]='\\'+ret[0]
+		ret=re_sp.split(path)[1:]
+		ret[0]='\\\\'+ret[0]
+		if ret[0]=='\\\\?':
+			return ret[1:]
		return ret
-	return re.split(re_sp,path)
+	return re_sp.split(path)
msysroot=None
def split_path_msys(path):
-	if(path.startswith('/')or path.startswith('\\'))and not path.startswith('//')and not path.startswith('\\\\'):
+	if path.startswith(('/','\\'))and not path.startswith(('//','\\\\')):
		global msysroot
		if not msysroot:
-			msysroot=subprocess.check_output(['cygpath','-w','/']).decode(sys.stdout.encoding or'iso8859-1')
+			msysroot=subprocess.check_output(['cygpath','-w','/']).decode(sys.stdout.encoding or'latin-1')
			msysroot=msysroot.strip()
		path=os.path.normpath(msysroot+os.sep+path)
	return split_path_win32(path)
if sys.platform=='cygwin':
	split_path=split_path_cygwin
elif is_win32:
-	if os.environ.get('MSYSTEM',None):
+	if os.environ.get('MSYSTEM'):
		split_path=split_path_msys
	else:
		split_path=split_path_win32
else:
	split_path=split_path_unix
split_path.__doc__="""
-Split a path by / or \\. This function is not like os.path.split
+Splits a path by / or \\; do not confuse this function with with ``os.path.split``
:type path: string
:param path: path to split
-:return: list of strings
+:return: list of string
"""
def check_dir(path):
	if not os.path.isdir(path):
		try:
			os.makedirs(path)
-		except OSError ,e:
+		except OSError as e:
			if not os.path.isdir(path):
				raise Errors.WafError('Cannot create the folder %r'%path,ex=e)
def check_exe(name,env=None):
@@ -302,7 +349,7 @@ def check_exe(name,env=None):
		return os.path.abspath(name)
	else:
		env=env or os.environ
-		for path in env["PATH"].split(os.pathsep):
+		for path in env['PATH'].split(os.pathsep):
			path=path.strip('"')
			exe_file=os.path.join(path,name)
			if is_exe(exe_file):
@@ -317,18 +364,27 @@ def quote_define_name(s):
	fu=re.sub('_+','_',fu)
	fu=fu.upper()
	return fu
+re_sh=re.compile('\\s|\'|"')
+def shell_escape(cmd):
+	if isinstance(cmd,str):
+		return cmd
+	return' '.join(repr(x)if re_sh.search(x)else x for x in cmd)
def h_list(lst):
-	m=md5()
-	m.update(str(lst))
-	return m.digest()
+	return md5(repr(lst).encode()).digest()
def h_fun(fun):
	try:
		return fun.code
	except AttributeError:
+		if isinstance(fun,functools.partial):
+			code=list(fun.args)
+			code.extend(sorted(fun.keywords.items()))
+			code.append(h_fun(fun.func))
+			fun.code=h_list(code)
+			return fun.code
		try:
			h=inspect.getsource(fun)
-		except IOError:
-			h="nocode"
+		except EnvironmentError:
+			h='nocode'
		try:
			fun.code=h
		except AttributeError:
@@ -342,7 +398,7 @@ def h_cmd(ins):
	else:
		ret=str(h_fun(ins))
	if sys.hexversion>0x3000000:
-		ret=ret.encode('iso8859-1','xmlcharrefreplace')
+		ret=ret.encode('latin-1','xmlcharrefreplace')
	return ret
reg_subst=re.compile(r"(\\\\)|(\$\$)|\$\{([^}]+)\}")
def subst_vars(expr,params):
@@ -389,9 +445,11 @@ def nada(*k,**kw):
	pass
class Timer(object):
	def __init__(self):
-		self.start_time=datetime.datetime.utcnow()
+		self.start_time=self.now()
	def __str__(self):
-		delta=datetime.datetime.utcnow()-self.start_time
+		delta=self.now()-self.start_time
+		if not isinstance(delta,datetime.timedelta):
+			delta=datetime.timedelta(seconds=delta)
		days=delta.days
		hours,rem=divmod(delta.seconds,3600)
		minutes,seconds=divmod(rem,60)
@@ -404,18 +462,11 @@ class Timer(object):
		if days or hours or minutes:
			result+='%dm'%minutes
		return'%s%.3fs'%(result,seconds)
-if is_win32:
-	old=shutil.copy2
-	def copy2(src,dst):
-		old(src,dst)
-		shutil.copystat(src,dst)
-	setattr(shutil,'copy2',copy2)
-if os.name=='java':
-	try:
-		gc.disable()
-		gc.enable()
-	except NotImplementedError:
-		gc.disable=gc.enable
+	def now(self):
+		return datetime.datetime.utcnow()
+	if hasattr(time,'perf_counter'):
+		def now(self):
+			return time.perf_counter()
def read_la_file(path):
	sp=re.compile(r'^([^=]+)=\'(.*)\'$')
	dc={}
@@ -426,23 +477,13 @@ def read_la_file(path):
		except ValueError:
			pass
	return dc
-def nogc(fun):
-	def f(*k,**kw):
-		try:
-			gc.disable()
-			ret=fun(*k,**kw)
-		finally:
-			gc.enable()
-		return ret
-	f.__doc__=fun.__doc__
-	return f
def run_once(fun):
	cache={}
-	def wrap(k):
+	def wrap(*k):
		try:
			return cache[k]
		except KeyError:
-			ret=fun(k)
+			ret=fun(*k)
			cache[k]=ret
			return ret
	wrap.__cache__=cache
@@ -453,7 +494,7 @@ def get_registry_app_path(key,filename):
		return None
	try:
		result=winreg.QueryValue(key,"Software\\Microsoft\\Windows\\CurrentVersion\\App Paths\\%s.exe"%filename[0])
-	except WindowsError:
+	except OSError:
		pass
	else:
		if os.path.isfile(result):
@@ -466,3 +507,114 @@ def lib64():
		return''
def sane_path(p):
	return os.path.abspath(os.path.expanduser(p))
+process_pool=[]
+def get_process():
+	try:
+		return process_pool.pop()
+	except IndexError:
+		filepath=os.path.dirname(os.path.abspath(__file__))+os.sep+'processor.py'
+		cmd=[sys.executable,'-c',readf(filepath)]
+		return subprocess.Popen(cmd,stdout=subprocess.PIPE,stdin=subprocess.PIPE,bufsize=0)
+def run_prefork_process(cmd,kwargs,cargs):
+	if not'env'in kwargs:
+		kwargs['env']=dict(os.environ)
+	try:
+		obj=base64.b64encode(cPickle.dumps([cmd,kwargs,cargs]))
+	except(TypeError,AttributeError):
+		return run_regular_process(cmd,kwargs,cargs)
+	proc=get_process()
+	if not proc:
+		return run_regular_process(cmd,kwargs,cargs)
+	proc.stdin.write(obj)
+	proc.stdin.write('\n'.encode())
+	proc.stdin.flush()
+	obj=proc.stdout.readline()
+	if not obj:
+		raise OSError('Preforked sub-process %r died'%proc.pid)
+	process_pool.append(proc)
+	lst=cPickle.loads(base64.b64decode(obj))
+	assert len(lst)==5
+	ret,out,err,ex,trace=lst
+	if ex:
+		if ex=='OSError':
+			raise OSError(trace)
+		elif ex=='ValueError':
+			raise ValueError(trace)
+		elif ex=='TimeoutExpired':
+			exc=TimeoutExpired(cmd,timeout=cargs['timeout'],output=out)
+			exc.stderr=err
+			raise exc
+		else:
+			raise Exception(trace)
+	return ret,out,err
+def lchown(path,user=-1,group=-1):
+	if isinstance(user,str):
+		import pwd
+		entry=pwd.getpwnam(user)
+		if not entry:
+			raise OSError('Unknown user %r'%user)
+		user=entry[2]
+	if isinstance(group,str):
+		import grp
+		entry=grp.getgrnam(group)
+		if not entry:
+			raise OSError('Unknown group %r'%group)
+		group=entry[2]
+	return os.lchown(path,user,group)
+def run_regular_process(cmd,kwargs,cargs={}):
+	proc=subprocess.Popen(cmd,**kwargs)
+	if kwargs.get('stdout')or kwargs.get('stderr'):
+		try:
+			out,err=proc.communicate(**cargs)
+		except TimeoutExpired:
+			if kwargs.get('start_new_session')and hasattr(os,'killpg'):
+				os.killpg(proc.pid,signal.SIGKILL)
+			else:
+				proc.kill()
+			out,err=proc.communicate()
+			exc=TimeoutExpired(proc.args,timeout=cargs['timeout'],output=out)
+			exc.stderr=err
+			raise exc
+		status=proc.returncode
+	else:
+		out,err=(None,None)
+		try:
+			status=proc.wait(**cargs)
+		except TimeoutExpired as e:
+			if kwargs.get('start_new_session')and hasattr(os,'killpg'):
+				os.killpg(proc.pid,signal.SIGKILL)
+			else:
+				proc.kill()
+				proc.wait()
+			raise e
+	return status,out,err
+def run_process(cmd,kwargs,cargs={}):
+	if kwargs.get('stdout')and kwargs.get('stderr'):
+		return run_prefork_process(cmd,kwargs,cargs)
+	else:
+		return run_regular_process(cmd,kwargs,cargs)
+def alloc_process_pool(n,force=False):
+	global run_process,get_process,alloc_process_pool
+	if not force:
+		n=max(n-len(process_pool),0)
+	try:
+		lst=[get_process()for x in range(n)]
+	except OSError:
+		run_process=run_regular_process
+		get_process=alloc_process_pool=nada
+	else:
+		for x in lst:
+			process_pool.append(x)
+def atexit_pool():
+	for k in process_pool:
+		try:
+			os.kill(k.pid,9)
+		except OSError:
+			pass
+		else:
+			k.wait()
+if(sys.hexversion<0x207000f and not is_win32)or sys.hexversion>=0x306000f:
+	atexit.register(atexit_pool)
+if os.environ.get('WAF_NO_PREFORK')or sys.platform=='cli'or not sys.executable:
+	run_process=run_regular_process
+	get_process=alloc_process_pool=nada
diff --git a/waflib/ansiterm.py b/waflib/ansiterm.py
index 8de767d..1d8bc78 100644
--- a/waflib/ansiterm.py
+++ b/waflib/ansiterm.py
@@ -232,7 +232,7 @@ else:
			return struct.unpack("HHHH",fcntl.ioctl(FD,termios.TIOCGWINSZ,struct.pack("HHHH",0,0,0,0)))[1]
		try:
			fun()
-		except Exception ,e:
+		except Exception as e:
			pass
		else:
			get_term_cols=fun
diff --git a/waflib/extras/c_emscripten.py b/waflib/extras/c_emscripten.py
new file mode 100644
index 0000000..272e896
--- /dev/null
+++ b/waflib/extras/c_emscripten.py
@@ -0,0 +1,73 @@
+#! /usr/bin/env python
+# encoding: utf-8
+# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
+
+import subprocess,shlex,sys
+from waflib.Tools import ccroot,gcc,gxx
+from waflib.Configure import conf
+from waflib.TaskGen import after_method,feature
+from waflib.Tools.compiler_c import c_compiler
+from waflib.Tools.compiler_cxx import cxx_compiler
+for supported_os in('linux','darwin','gnu','aix'):
+	c_compiler[supported_os].append('c_emscripten')
+	cxx_compiler[supported_os].append('c_emscripten')
+@conf
+def get_emscripten_version(conf,cc):
+	dummy=conf.cachedir.parent.make_node("waf-emscripten.c")
+	dummy.write("")
+	cmd=cc+['-dM','-E','-x','c',dummy.abspath()]
+	env=conf.env.env or None
+	try:
+		p=subprocess.Popen(cmd,stdout=subprocess.PIPE,stderr=subprocess.PIPE,env=env)
+		out=p.communicate()[0]
+	except Exception as e:
+		conf.fatal('Could not determine emscripten version %r: %s'%(cmd,e))
+	if not isinstance(out,str):
+		out=out.decode(sys.stdout.encoding or'latin-1')
+	k={}
+	out=out.splitlines()
+	for line in out:
+		lst=shlex.split(line)
+		if len(lst)>2:
+			key=lst[1]
+			val=lst[2]
+			k[key]=val
+	if not('__clang__'in k and'EMSCRIPTEN'in k):
+		conf.fatal('Could not determine the emscripten compiler version.')
+	conf.env.DEST_OS='generic'
+	conf.env.DEST_BINFMT='elf'
+	conf.env.DEST_CPU='asm-js'
+	conf.env.CC_VERSION=(k['__clang_major__'],k['__clang_minor__'],k['__clang_patchlevel__'])
+	return k
+@conf
+def find_emscripten(conf):
+	cc=conf.find_program(['emcc'],var='CC')
+	conf.get_emscripten_version(cc)
+	conf.env.CC=cc
+	conf.env.CC_NAME='emscripten'
+	cxx=conf.find_program(['em++'],var='CXX')
+	conf.env.CXX=cxx
+	conf.env.CXX_NAME='emscripten'
+	conf.find_program(['emar'],var='AR')
+def configure(conf):
+	conf.find_emscripten()
+	conf.find_ar()
+	conf.gcc_common_flags()
+	conf.gxx_common_flags()
+	conf.cc_load_tools()
+	conf.cc_add_flags()
+	conf.cxx_load_tools()
+	conf.cxx_add_flags()
+	conf.link_add_flags()
+	conf.env.ARFLAGS=['rcs']
+	conf.env.cshlib_PATTERN='%s.js'
+	conf.env.cxxshlib_PATTERN='%s.js'
+	conf.env.cstlib_PATTERN='%s.a'
+	conf.env.cxxstlib_PATTERN='%s.a'
+	conf.env.cprogram_PATTERN='%s.html'
+	conf.env.cxxprogram_PATTERN='%s.html'
+	conf.env.CXX_TGT_F=['-c','-o','']
+	conf.env.CC_TGT_F=['-c','-o','']
+	conf.env.CXXLNK_TGT_F=['-o','']
+	conf.env.CCLNK_TGT_F=['-o','']
+	conf.env.append_value('LINKFLAGS',['-Wl,--enable-auto-import'])
diff --git a/waflib/extras/compat15.py b/waflib/extras/compat15.py
deleted file mode 100644
index aa30079..0000000
--- a/waflib/extras/compat15.py
+++ /dev/null
@@ -1,301 +0,0 @@
-#! /usr/bin/env python
-# encoding: utf-8
-# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
-
-import sys
-from waflib import ConfigSet,Logs,Options,Scripting,Task,Build,Configure,Node,Runner,TaskGen,Utils,Errors,Context
-sys.modules['Environment']=ConfigSet
-ConfigSet.Environment=ConfigSet.ConfigSet
-sys.modules['Logs']=Logs
-sys.modules['Options']=Options
-sys.modules['Scripting']=Scripting
-sys.modules['Task']=Task
-sys.modules['Build']=Build
-sys.modules['Configure']=Configure
-sys.modules['Node']=Node
-sys.modules['Runner']=Runner
-sys.modules['TaskGen']=TaskGen
-sys.modules['Utils']=Utils
-sys.modules['Constants']=Context
-Context.SRCDIR=''
-Context.BLDDIR=''
-from waflib.Tools import c_preproc
-sys.modules['preproc']=c_preproc
-from waflib.Tools import c_config
-sys.modules['config_c']=c_config
-ConfigSet.ConfigSet.copy=ConfigSet.ConfigSet.derive
-ConfigSet.ConfigSet.set_variant=Utils.nada
-Utils.pproc=Utils.subprocess
-Build.BuildContext.add_subdirs=Build.BuildContext.recurse
-Build.BuildContext.new_task_gen=Build.BuildContext.__call__
-Build.BuildContext.is_install=0
-Node.Node.relpath_gen=Node.Node.path_from
-Utils.pproc=Utils.subprocess
-Utils.get_term_cols=Logs.get_term_cols
-def cmd_output(cmd,**kw):
-	silent=False
-	if'silent'in kw:
-		silent=kw['silent']
-		del(kw['silent'])
-	if'e'in kw:
-		tmp=kw['e']
-		del(kw['e'])
-		kw['env']=tmp
-	kw['shell']=isinstance(cmd,str)
-	kw['stdout']=Utils.subprocess.PIPE
-	if silent:
-		kw['stderr']=Utils.subprocess.PIPE
-	try:
-		p=Utils.subprocess.Popen(cmd,**kw)
-		output=p.communicate()[0]
-	except OSError ,e:
-		raise ValueError(str(e))
-	if p.returncode:
-		if not silent:
-			msg="command execution failed: %s -> %r"%(cmd,str(output))
-			raise ValueError(msg)
-		output=''
-	return output
-Utils.cmd_output=cmd_output
-def name_to_obj(self,s,env=None):
-	if Logs.verbose:
-		Logs.warn('compat: change "name_to_obj(name, env)" by "get_tgen_by_name(name)"')
-	return self.get_tgen_by_name(s)
-Build.BuildContext.name_to_obj=name_to_obj
-def env_of_name(self,name):
-	try:
-		return self.all_envs[name]
-	except KeyError:
-		Logs.error('no such environment: '+name)
-		return None
-Build.BuildContext.env_of_name=env_of_name
-def set_env_name(self,name,env):
-	self.all_envs[name]=env
-	return env
-Configure.ConfigurationContext.set_env_name=set_env_name
-def retrieve(self,name,fromenv=None):
-	try:
-		env=self.all_envs[name]
-	except KeyError:
-		env=ConfigSet.ConfigSet()
-		self.prepare_env(env)
-		self.all_envs[name]=env
-	else:
-		if fromenv:
-			Logs.warn("The environment %s may have been configured already"%name)
-	return env
-Configure.ConfigurationContext.retrieve=retrieve
-Configure.ConfigurationContext.sub_config=Configure.ConfigurationContext.recurse
-Configure.ConfigurationContext.check_tool=Configure.ConfigurationContext.load
-Configure.conftest=Configure.conf
-Configure.ConfigurationError=Errors.ConfigurationError
-Utils.WafError=Errors.WafError
-Options.OptionsContext.sub_options=Options.OptionsContext.recurse
-Options.OptionsContext.tool_options=Context.Context.load
-Options.Handler=Options.OptionsContext
-Task.simple_task_type=Task.task_type_from_func=Task.task_factory
-Task.TaskBase.classes=Task.classes
-def setitem(self,key,value):
-	if key.startswith('CCFLAGS'):
-		key=key[1:]
-	self.table[key]=value
-ConfigSet.ConfigSet.__setitem__=setitem
-@TaskGen.feature('d')
-@TaskGen.before('apply_incpaths')
-def old_importpaths(self):
-	if getattr(self,'importpaths',[]):
-		self.includes=self.importpaths
-from waflib import Context
-eld=Context.load_tool
-def load_tool(*k,**kw):
-	ret=eld(*k,**kw)
-	if'set_options'in ret.__dict__:
-		if Logs.verbose:
-			Logs.warn('compat: rename "set_options" to options')
-		ret.options=ret.set_options
-	if'detect'in ret.__dict__:
-		if Logs.verbose:
-			Logs.warn('compat: rename "detect" to "configure"')
-		ret.configure=ret.detect
-	return ret
-Context.load_tool=load_tool
-def get_curdir(self):
-	return self.path.abspath()
-Context.Context.curdir=property(get_curdir,Utils.nada)
-def get_srcdir(self):
-	return self.srcnode.abspath()
-Configure.ConfigurationContext.srcdir=property(get_srcdir,Utils.nada)
-def get_blddir(self):
-	return self.bldnode.abspath()
-Configure.ConfigurationContext.blddir=property(get_blddir,Utils.nada)
-Configure.ConfigurationContext.check_message_1=Configure.ConfigurationContext.start_msg
-Configure.ConfigurationContext.check_message_2=Configure.ConfigurationContext.end_msg
-rev=Context.load_module
-def load_module(path,encoding=None):
-	ret=rev(path,encoding)
-	if'set_options'in ret.__dict__:
-		if Logs.verbose:
-			Logs.warn('compat: rename "set_options" to "options" (%r)'%path)
-		ret.options=ret.set_options
-	if'srcdir'in ret.__dict__:
-		if Logs.verbose:
-			Logs.warn('compat: rename "srcdir" to "top" (%r)'%path)
-		ret.top=ret.srcdir
-	if'blddir'in ret.__dict__:
-		if Logs.verbose:
-			Logs.warn('compat: rename "blddir" to "out" (%r)'%path)
-		ret.out=ret.blddir
-	Utils.g_module=Context.g_module
-	Options.launch_dir=Context.launch_dir
-	return ret
-Context.load_module=load_module
-old_post=TaskGen.task_gen.post
-def post(self):
-	self.features=self.to_list(self.features)
-	if'cc'in self.features:
-		if Logs.verbose:
-			Logs.warn('compat: the feature cc does not exist anymore (use "c")')
-		self.features.remove('cc')
-		self.features.append('c')
-	if'cstaticlib'in self.features:
-		if Logs.verbose:
-			Logs.warn('compat: the feature cstaticlib does not exist anymore (use "cstlib" or "cxxstlib")')
-		self.features.remove('cstaticlib')
-		self.features.append(('cxx'in self.features)and'cxxstlib'or'cstlib')
-	if getattr(self,'ccflags',None):
-		if Logs.verbose:
-			Logs.warn('compat: "ccflags" was renamed to "cflags"')
-		self.cflags=self.ccflags
-	return old_post(self)
-TaskGen.task_gen.post=post
-def waf_version(*k,**kw):
-	Logs.warn('wrong version (waf_version was removed in waf 1.6)')
-Utils.waf_version=waf_version
-import os
-@TaskGen.feature('c','cxx','d')
-@TaskGen.before('apply_incpaths','propagate_uselib_vars')
-@TaskGen.after('apply_link','process_source')
-def apply_uselib_local(self):
-	env=self.env
-	from waflib.Tools.ccroot import stlink_task
-	self.uselib=self.to_list(getattr(self,'uselib',[]))
-	self.includes=self.to_list(getattr(self,'includes',[]))
-	names=self.to_list(getattr(self,'uselib_local',[]))
-	get=self.bld.get_tgen_by_name
-	seen=set([])
-	seen_uselib=set([])
-	tmp=Utils.deque(names)
-	if tmp:
-		if Logs.verbose:
-			Logs.warn('compat: "uselib_local" is deprecated, replace by "use"')
-	while tmp:
-		lib_name=tmp.popleft()
-		if lib_name in seen:
-			continue
-		y=get(lib_name)
-		y.post()
-		seen.add(lib_name)
-		if getattr(y,'uselib_local',None):
-			for x in self.to_list(getattr(y,'uselib_local',[])):
-				obj=get(x)
-				obj.post()
-				if getattr(obj,'link_task',None):
-					if not isinstance(obj.link_task,stlink_task):
-						tmp.append(x)
-		if getattr(y,'link_task',None):
-			link_name=y.target[y.target.rfind(os.sep)+1:]
-			if isinstance(y.link_task,stlink_task):
-				env.append_value('STLIB',[link_name])
-			else:
-				env.append_value('LIB',[link_name])
-			self.link_task.set_run_after(y.link_task)
-			self.link_task.dep_nodes+=y.link_task.outputs
-			tmp_path=y.link_task.outputs[0].parent.bldpath()
-			if not tmp_path in env['LIBPATH']:
-				env.prepend_value('LIBPATH',[tmp_path])
-		for v in self.to_list(getattr(y,'uselib',[])):
-			if v not in seen_uselib:
-				seen_uselib.add(v)
-				if not env['STLIB_'+v]:
-					if not v in self.uselib:
-						self.uselib.insert(0,v)
-		if getattr(y,'export_includes',None):
-			self.includes.extend(y.to_incnodes(y.export_includes))
-@TaskGen.feature('cprogram','cxxprogram','cstlib','cxxstlib','cshlib','cxxshlib','dprogram','dstlib','dshlib')
-@TaskGen.after('apply_link')
-def apply_objdeps(self):
-	names=getattr(self,'add_objects',[])
-	if not names:
-		return
-	names=self.to_list(names)
-	get=self.bld.get_tgen_by_name
-	seen=[]
-	while names:
-		x=names[0]
-		if x in seen:
-			names=names[1:]
-			continue
-		y=get(x)
-		if getattr(y,'add_objects',None):
-			added=0
-			lst=y.to_list(y.add_objects)
-			lst.reverse()
-			for u in lst:
-				if u in seen:continue
-				added=1
-				names=[u]+names
-			if added:continue
-		y.post()
-		seen.append(x)
-		for t in getattr(y,'compiled_tasks',[]):
-			self.link_task.inputs.extend(t.outputs)
-@TaskGen.after('apply_link')
-def process_obj_files(self):
-	if not hasattr(self,'obj_files'):
-		return
-	for x in self.obj_files:
-		node=self.path.find_resource(x)
-		self.link_task.inputs.append(node)
-@TaskGen.taskgen_method
-def add_obj_file(self,file):
-	if not hasattr(self,'obj_files'):self.obj_files=[]
-	if not'process_obj_files'in self.meths:self.meths.append('process_obj_files')
-	self.obj_files.append(file)
-old_define=Configure.ConfigurationContext.__dict__['define']
-@Configure.conf
-def define(self,key,val,quote=True,comment=''):
-	old_define(self,key,val,quote,comment)
-	if key.startswith('HAVE_'):
-		self.env[key]=1
-old_undefine=Configure.ConfigurationContext.__dict__['undefine']
-@Configure.conf
-def undefine(self,key,comment=''):
-	old_undefine(self,key,comment)
-	if key.startswith('HAVE_'):
-		self.env[key]=0
-def set_incdirs(self,val):
-	Logs.warn('compat: change "export_incdirs" by "export_includes"')
-	self.export_includes=val
-TaskGen.task_gen.export_incdirs=property(None,set_incdirs)
-def install_dir(self,path):
-	if not path:
-		return[]
-	destpath=Utils.subst_vars(path,self.env)
-	if self.is_install>0:
-		Logs.info('* creating %s'%destpath)
-		Utils.check_dir(destpath)
-	elif self.is_install<0:
-		Logs.info('* removing %s'%destpath)
-		try:
-			os.remove(destpath)
-		except OSError:
-			pass
-Build.BuildContext.install_dir=install_dir
-repl={'apply_core':'process_source','apply_lib_vars':'process_source','apply_obj_vars':'propagate_uselib_vars','exec_rule':'process_rule'}
-def after(*k):
-	k=[repl.get(key,key)for key in k]
-	return TaskGen.after_method(*k)
-def before(*k):
-	k=[repl.get(key,key)for key in k]
-	return TaskGen.before_method(*k)
-TaskGen.before=before
diff --git a/waflib/fixpy2.py b/waflib/fixpy2.py
index 5e434d1..9aa8418 100644
--- a/waflib/fixpy2.py
+++ b/waflib/fixpy2.py
@@ -2,10 +2,10 @@
# encoding: utf-8
# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
+from __future__ import with_statement
import os
all_modifs={}
def fixdir(dir):
-	global all_modifs
	for k in all_modifs:
		for v in all_modifs[k]:
			modif(os.path.join(dir,'waflib'),k,v)
@@ -20,20 +20,13 @@ def modif(dir,name,fun):
			modif(dir,x,fun)
		return
	filename=os.path.join(dir,name)
-	f=open(filename,'r')
-	try:
+	with open(filename,'r')as f:
		txt=f.read()
-	finally:
-		f.close()
	txt=fun(txt)
-	f=open(filename,'w')
-	try:
+	with open(filename,'w')as f:
		f.write(txt)
-	finally:
-		f.close()
def subst(*k):
	def do_subst(fun):
-		global all_modifs
		for x in k:
			try:
				all_modifs[x].append(fun)
@@ -43,11 +36,12 @@ def subst(*k):
	return do_subst
@subst('*')
def r1(code):
-	code=code.replace(',e:',',e:')
-	code=code.replace("",'')
-	code=code.replace('','')
-	return code
+	code=code.replace('as e:',',e:')
+	code=code.replace(".decode(sys.stdout.encoding or'latin-1',errors='replace')",'')
+	return code.replace('.encode()','')
@subst('Runner.py')
def r4(code):
-	code=code.replace('next(self.biter)','self.biter.next()')
-	return code
+	return code.replace('next(self.biter)','self.biter.next()')
+@subst('Context.py')
+def r5(code):
+	return code.replace("('Execution failure: %s'%str(e),ex=e)","('Execution failure: %s'%str(e),ex=e),None,sys.exc_info()[2]")
diff --git a/waflib/processor.py b/waflib/processor.py
new file mode 100755
index 0000000..10f7c1b
--- /dev/null
+++ b/waflib/processor.py
@@ -0,0 +1,55 @@
+#! /usr/bin/env python
+# encoding: utf-8
+# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
+
+import os,sys,traceback,base64,signal
+try:
+	import cPickle
+except ImportError:
+	import pickle as cPickle
+try:
+	import subprocess32 as subprocess
+except ImportError:
+	import subprocess
+try:
+	TimeoutExpired=subprocess.TimeoutExpired
+except AttributeError:
+	class TimeoutExpired(Exception):
+		pass
+def run():
+	txt=sys.stdin.readline().strip()
+	if not txt:
+		sys.exit(1)
+	[cmd,kwargs,cargs]=cPickle.loads(base64.b64decode(txt))
+	cargs=cargs or{}
+	ret=1
+	out,err,ex,trace=(None,None,None,None)
+	try:
+		proc=subprocess.Popen(cmd,**kwargs)
+		try:
+			out,err=proc.communicate(**cargs)
+		except TimeoutExpired:
+			if kwargs.get('start_new_session')and hasattr(os,'killpg'):
+				os.killpg(proc.pid,signal.SIGKILL)
+			else:
+				proc.kill()
+			out,err=proc.communicate()
+			exc=TimeoutExpired(proc.args,timeout=cargs['timeout'],output=out)
+			exc.stderr=err
+			raise exc
+		ret=proc.returncode
+	except Exception as e:
+		exc_type,exc_value,tb=sys.exc_info()
+		exc_lines=traceback.format_exception(exc_type,exc_value,tb)
+		trace=str(cmd)+'\n'+''.join(exc_lines)
+		ex=e.__class__.__name__
+	tmp=[ret,out,err,ex,trace]
+	obj=base64.b64encode(cPickle.dumps(tmp))
+	sys.stdout.write(obj.decode())
+	sys.stdout.write('\n')
+	sys.stdout.flush()
+while 1:
+	try:
+		run()
+	except KeyboardInterrupt:
+		break
@@ -14,19 +14,10 @@ import sys
APPNAME = 'aubio'
-# source VERSION
-for l in open('VERSION').readlines(): exec (l.strip())
+from this_version import *
-VERSION = '.'.join ([str(x) for x in [
-    AUBIO_MAJOR_VERSION,
-    AUBIO_MINOR_VERSION,
-    AUBIO_PATCH_VERSION
-    ]]) + AUBIO_VERSION_STATUS
-
-LIB_VERSION = '.'.join ([str(x) for x in [
-    LIBAUBIO_LT_CUR,
-    LIBAUBIO_LT_REV,
-
LIBAUBIO_LT_AGE]]) +VERSION = get_aubio_version() +LIB_VERSION = get_libaubio_version() top = '.' out = 'build' @@ -47,12 +38,22 @@ def add_option_enable_disable(ctx, name, default = None, help = help_disable_str ) def options(ctx): + ctx.add_option('--build-type', action = 'store', + default = "release", + choices = ('debug', 'release'), + dest = 'build_type', + help = 'whether to compile with (--build-type=release)' \ + ' or without (--build-type=debug)' \ + ' compiler opimizations [default: release]') add_option_enable_disable(ctx, 'fftw3f', default = False, help_str = 'compile with fftw3f instead of ooura (recommended)', help_disable_str = 'do not compile with fftw3f') add_option_enable_disable(ctx, 'fftw3', default = False, help_str = 'compile with fftw3 instead of ooura', help_disable_str = 'do not compile with fftw3') + add_option_enable_disable(ctx, 'intelipp', default = False, + help_str = 'use Intel IPP libraries (auto)', + help_disable_str = 'do not use Intel IPP libraries') add_option_enable_disable(ctx, 'complex', default = False, help_str ='compile with C99 complex', help_disable_str = 'do not use C99 complex (default)' ) @@ -83,46 +84,107 @@ def options(ctx): add_option_enable_disable(ctx, 'apple-audio', default = None, help_str = 'use CoreFoundation (darwin only) (auto)', help_disable_str = 'do not use CoreFoundation framework') - add_option_enable_disable(ctx, 'atlas', default = None, - help_str = 'use Atlas library (auto)', - help_disable_str = 'do not use Atlas library') + add_option_enable_disable(ctx, 'blas', default = False, + help_str = 'use BLAS acceleration library (no)', + help_disable_str = 'do not use BLAS library') + add_option_enable_disable(ctx, 'atlas', default = False, + help_str = 'use ATLAS acceleration library (no)', + help_disable_str = 'do not use ATLAS library') + add_option_enable_disable(ctx, 'wavread', default = True, + help_str = 'compile with source_wavread (default)', + help_disable_str = 'do not compile source_wavread') 
+ add_option_enable_disable(ctx, 'wavwrite', default = True, + help_str = 'compile with source_wavwrite (default)', + help_disable_str = 'do not compile source_wavwrite') add_option_enable_disable(ctx, 'docs', default = None, help_str = 'build documentation (auto)', help_disable_str = 'do not build documentation') + add_option_enable_disable(ctx, 'tests', default = True, + help_str = 'build tests (true)', + help_disable_str = 'do not build or run tests') + + add_option_enable_disable(ctx, 'examples', default = True, + help_str = 'build examples (true)', + help_disable_str = 'do not build examples') + ctx.add_option('--with-target-platform', type='string', - help='set target platform for cross-compilation', dest='target_platform') + help='set target platform for cross-compilation', + dest='target_platform') ctx.load('compiler_c') ctx.load('waf_unit_test') ctx.load('gnu_dirs') + ctx.load('waf_gensyms', tooldir='.') def configure(ctx): + target_platform = sys.platform + if ctx.options.target_platform: + target_platform = ctx.options.target_platform + from waflib import Options - ctx.load('compiler_c') + + if target_platform=='emscripten': + ctx.load('c_emscripten') + else: + ctx.load('compiler_c') + ctx.load('waf_unit_test') ctx.load('gnu_dirs') + ctx.load('waf_gensyms', tooldir='.') # check for common headers ctx.check(header_name='stdlib.h') ctx.check(header_name='stdio.h') ctx.check(header_name='math.h') ctx.check(header_name='string.h') + ctx.check(header_name='errno.h') ctx.check(header_name='limits.h') + ctx.check(header_name='stdarg.h') ctx.check(header_name='getopt.h', mandatory = False) ctx.check(header_name='unistd.h', mandatory = False) - target_platform = sys.platform - if ctx.options.target_platform: - target_platform = ctx.options.target_platform ctx.env['DEST_OS'] = target_platform + if ctx.options.build_type == "debug": + ctx.define('DEBUG', 1) + else: + ctx.define('NDEBUG', 1) + if ctx.env.CC_NAME != 'msvc': - ctx.env.CFLAGS += ['-g', '-Wall', 
'-Wextra'] + if ctx.options.build_type == "debug": + # no optimization in debug mode + ctx.env.prepend_value('CFLAGS', ['-O0']) + else: + if target_platform == 'emscripten': + # -Oz for small js file generation + ctx.env.prepend_value('CFLAGS', ['-Oz']) + else: + # default to -O2 in release mode + ctx.env.prepend_value('CFLAGS', ['-O2']) + # enable debug symbols and configure warnings + ctx.env.prepend_value('CFLAGS', ['-g', '-Wall', '-Wextra']) else: - ctx.env.CFLAGS += ['/W4', '/MD'] - ctx.env.CFLAGS += ['/D_CRT_SECURE_NO_WARNINGS'] + # enable debug symbols + ctx.env.CFLAGS += ['/Z7'] + # /FS flag available in msvc >= 12 (2013) + if 'MSVC_VERSION' in ctx.env and ctx.env.MSVC_VERSION >= 12: + ctx.env.CFLAGS += ['/FS'] + ctx.env.LINKFLAGS += ['/DEBUG', '/INCREMENTAL:NO'] + # configure warnings + ctx.env.CFLAGS += ['/W4', '/D_CRT_SECURE_NO_WARNINGS'] + # ignore "possible loss of data" warnings + ctx.env.CFLAGS += ['/wd4305', '/wd4244', '/wd4245', '/wd4267'] + # ignore "unreferenced formal parameter" warnings + ctx.env.CFLAGS += ['/wd4100'] + # set optimization level and runtime libs + if (ctx.options.build_type == "release"): + ctx.env.CFLAGS += ['/Ox'] + ctx.env.CFLAGS += ['/MD'] + else: + assert(ctx.options.build_type == "debug") + ctx.env.CFLAGS += ['/MDd'] ctx.check_cc(lib='m', uselib_store='M', mandatory=False) @@ -144,9 +206,17 @@ def configure(ctx): ctx.env.FRAMEWORK += ['CoreFoundation', 'AudioToolbox'] ctx.define('HAVE_SOURCE_APPLE_AUDIO', 1) ctx.define('HAVE_SINK_APPLE_AUDIO', 1) + ctx.msg('Checking for AudioToolbox.framework', 'yes') + else: + ctx.msg('Checking for AudioToolbox.framework', 'no (disabled)', + color = 'YELLOW') if (ctx.options.enable_accelerate != False): ctx.define('HAVE_ACCELERATE', 1) ctx.env.FRAMEWORK += ['Accelerate'] + ctx.msg('Checking for Accelerate framework', 'yes') + else: + ctx.msg('Checking for Accelerate framework', 'no (disabled)', + color = 'YELLOW') if target_platform in [ 'ios', 'iosimulator' ]: MINSDKVER="6.1" @@ -181,12 
+251,38 @@ def configure(ctx): ctx.env.LINKFLAGS += [ '-isysroot' , SDKROOT] if target_platform == 'emscripten': - import os.path - ctx.env.CFLAGS += [ '-I' + os.path.join(os.environ['EMSCRIPTEN'], 'system', 'include') ] - ctx.env.CFLAGS += ['-Oz'] + if ctx.options.build_type == "debug": + ctx.env.cshlib_PATTERN = '%s.js' + ctx.env.LINKFLAGS += ['-s','ASSERTIONS=2'] + ctx.env.LINKFLAGS += ['-s','SAFE_HEAP=1'] + ctx.env.LINKFLAGS += ['-s','ALIASING_FUNCTION_POINTERS=0'] + ctx.env.LINKFLAGS += ['-O0'] + else: + ctx.env.LINKFLAGS += ['-Oz'] + ctx.env.cshlib_PATTERN = '%s.min.js' + + # doesnt ship file system support in lib + ctx.env.LINKFLAGS_cshlib += ['-s', 'NO_FILESYSTEM=1'] + # put memory file inside generated js files for easier portability + ctx.env.LINKFLAGS += ['--memory-init-file', '0'] ctx.env.cprogram_PATTERN = "%s.js" - if (ctx.options.enable_atlas != True): - ctx.options.enable_atlas = False + ctx.env.cstlib_PATTERN = '%s.a' + + # tell emscripten functions we want to expose + from python.lib.gen_external import get_c_declarations, \ + get_cpp_objects_from_c_declarations, \ + get_all_func_names_from_lib, \ + generate_lib_from_c_declarations + # emscripten can't use double + c_decls = get_c_declarations(usedouble=False) + objects = list(get_cpp_objects_from_c_declarations(c_decls)) + # ensure that aubio structs are exported + objects += ['fvec_t', 'cvec_t', 'fmat_t'] + lib = generate_lib_from_c_declarations(objects, c_decls) + exported_funcnames = get_all_func_names_from_lib(lib) + c_mangled_names = ['_' + s for s in exported_funcnames] + ctx.env.LINKFLAGS_cshlib += ['-s', + 'EXPORTED_FUNCTIONS=%s' % c_mangled_names] # check support for C99 __VA_ARGS__ macros check_c99_varargs = ''' @@ -214,12 +310,28 @@ def configure(ctx): else: ctx.msg('Checking if complex.h is enabled', 'no') + # check for Intel IPP + if (ctx.options.enable_intelipp != False): + has_ipp_headers = ctx.check(header_name=['ippcore.h', 'ippvm.h', + 'ipps.h'], mandatory = False) + 
has_ipp_libs = ctx.check(lib=['ippcore', 'ippvm', 'ipps'], + uselib_store='INTEL_IPP', mandatory = False) + if (has_ipp_headers and has_ipp_libs): + ctx.msg('Checking if Intel IPP is available', 'yes') + ctx.define('HAVE_INTEL_IPP', 1) + if ctx.env.CC_NAME == 'msvc': + # force linking multi-threaded static IPP libraries on Windows + # with msvc + ctx.define('_IPP_SEQUENTIAL_STATIC', 1) + else: + ctx.msg('Checking if Intel IPP is available', 'no') + # check for fftw3 if (ctx.options.enable_fftw3 != False or ctx.options.enable_fftw3f != False): # one of fftwf or fftw3f if (ctx.options.enable_fftw3f != False): - ctx.check_cfg(package = 'fftw3f', atleast_version = '3.0.0', - args = '--cflags --libs', + ctx.check_cfg(package = 'fftw3f', + args = '--cflags --libs fftw3f >= 3.0.0', mandatory = ctx.options.enable_fftw3f) if (ctx.options.enable_double == True): ctx.msg('Warning', @@ -228,35 +340,45 @@ def configure(ctx): # fftw3f disabled, take most sensible one according to # enable_double if (ctx.options.enable_double == True): - ctx.check_cfg(package = 'fftw3', atleast_version = '3.0.0', - args = '--cflags --libs', mandatory = - ctx.options.enable_fftw3) + ctx.check_cfg(package = 'fftw3', + args = '--cflags --libs fftw3 >= 3.0.0.', + mandatory = ctx.options.enable_fftw3) else: - ctx.check_cfg(package = 'fftw3f', atleast_version = '3.0.0', - args = '--cflags --libs', + ctx.check_cfg(package = 'fftw3f', + args = '--cflags --libs fftw3f >= 3.0.0', mandatory = ctx.options.enable_fftw3) ctx.define('HAVE_FFTW3', 1) - # fftw not enabled, use vDSP or ooura + # fftw not enabled, use vDSP, intelIPP or ooura if 'HAVE_FFTW3F' in ctx.env.define_key: ctx.msg('Checking for FFT implementation', 'fftw3f') elif 'HAVE_FFTW3' in ctx.env.define_key: ctx.msg('Checking for FFT implementation', 'fftw3') elif 'HAVE_ACCELERATE' in ctx.env.define_key: ctx.msg('Checking for FFT implementation', 'vDSP') + elif 'HAVE_INTEL_IPP' in ctx.env.define_key: + ctx.msg('Checking for FFT implementation', 
'Intel IPP') else: ctx.msg('Checking for FFT implementation', 'ooura') # check for libsndfile if (ctx.options.enable_sndfile != False): - ctx.check_cfg(package = 'sndfile', atleast_version = '1.0.4', - args = '--cflags --libs', + ctx.check_cfg(package = 'sndfile', + args = '--cflags --libs sndfile >= 1.0.4', mandatory = ctx.options.enable_sndfile) # check for libsamplerate + if (ctx.options.enable_double): + if (ctx.options.enable_samplerate): + ctx.fatal("Could not compile aubio in double precision mode' \ + ' with libsamplerate") + else: + ctx.options.enable_samplerate = False + ctx.msg('Checking if using samplerate', + 'no (disabled in double precision mode)', color = 'YELLOW') if (ctx.options.enable_samplerate != False): - ctx.check_cfg(package = 'samplerate', atleast_version = '0.0.15', - args = '--cflags --libs', + ctx.check_cfg(package = 'samplerate', + args = '--cflags --libs samplerate >= 0.0.15', mandatory = ctx.options.enable_samplerate) # check for jack @@ -267,33 +389,72 @@ def configure(ctx): # check for libav if (ctx.options.enable_avcodec != False): - ctx.check_cfg(package = 'libavcodec', atleast_version = '54.35.0', - args = '--cflags --libs', uselib_store = 'AVCODEC', - mandatory = ctx.options.enable_avcodec) - ctx.check_cfg(package = 'libavformat', atleast_version = '52.3.0', - args = '--cflags --libs', uselib_store = 'AVFORMAT', + ctx.check_cfg(package = 'libavcodec', + args = '--cflags --libs libavcodec >= 54.35.0', + uselib_store = 'AVCODEC', mandatory = ctx.options.enable_avcodec) - ctx.check_cfg(package = 'libavutil', atleast_version = '52.3.0', - args = '--cflags --libs', uselib_store = 'AVUTIL', + ctx.check_cfg(package = 'libavformat', + args = '--cflags --libs libavformat >= 52.3.0', + uselib_store = 'AVFORMAT', mandatory = ctx.options.enable_avcodec) - ctx.check_cfg(package = 'libavresample', atleast_version = '1.0.1', - args = '--cflags --libs', uselib_store = 'AVRESAMPLE', + ctx.check_cfg(package = 'libavutil', + args = '--cflags 
--libs libavutil >= 52.3.0', + uselib_store = 'AVUTIL', mandatory = ctx.options.enable_avcodec) - if all ( 'HAVE_' + i in ctx.env - for i in ['AVCODEC', 'AVFORMAT', 'AVUTIL', 'AVRESAMPLE'] ): - ctx.define('HAVE_LIBAV', 1) - ctx.msg('Checking for all libav libraries', 'yes') + ctx.check_cfg(package = 'libswresample', + args = '--cflags --libs libswresample >= 1.2.0', + uselib_store = 'SWRESAMPLE', + mandatory = False) + if 'HAVE_SWRESAMPLE' not in ctx.env: + ctx.check_cfg(package = 'libavresample', + args = '--cflags --libs libavresample >= 1.0.1', + uselib_store = 'AVRESAMPLE', + mandatory = False) + + msg_check = 'Checking for all libav libraries' + if 'HAVE_AVCODEC' not in ctx.env: + ctx.msg(msg_check, 'not found (missing avcodec)', color = 'YELLOW') + elif 'HAVE_AVFORMAT' not in ctx.env: + ctx.msg(msg_check, 'not found (missing avformat)', color = 'YELLOW') + elif 'HAVE_AVUTIL' not in ctx.env: + ctx.msg(msg_check, 'not found (missing avutil)', color = 'YELLOW') + elif 'HAVE_SWRESAMPLE' not in ctx.env \ + and 'HAVE_AVRESAMPLE' not in ctx.env: + resample_missing = 'not found (avresample or swresample required)' + ctx.msg(msg_check, resample_missing, color = 'YELLOW') else: - ctx.msg('Checking for all libav libraries', 'not found', color = 'YELLOW') - - ctx.define('HAVE_WAVREAD', 1) - ctx.define('HAVE_WAVWRITE', 1) + ctx.msg(msg_check, 'yes') + if 'HAVE_SWRESAMPLE' in ctx.env: + ctx.define('HAVE_SWRESAMPLE', 1) + elif 'HAVE_AVRESAMPLE' in ctx.env: + ctx.define('HAVE_AVRESAMPLE', 1) + ctx.define('HAVE_LIBAV', 1) - # use ATLAS - if (ctx.options.enable_atlas != False): - ctx.check(header_name = 'atlas/cblas.h', mandatory = ctx.options.enable_atlas) - #ctx.check(lib = 'lapack', uselib_store = 'LAPACK', mandatory = ctx.options.enable_atlas) - ctx.check(lib = 'cblas', uselib_store = 'BLAS', mandatory = ctx.options.enable_atlas) + if (ctx.options.enable_wavread != False): + ctx.define('HAVE_WAVREAD', 1) + ctx.msg('Checking if using source_wavread', + 
ctx.options.enable_wavread and 'yes' or 'no') + if (ctx.options.enable_wavwrite!= False): + ctx.define('HAVE_WAVWRITE', 1) + ctx.msg('Checking if using sink_wavwrite', + ctx.options.enable_wavwrite and 'yes' or 'no') + + # use BLAS/ATLAS + if (ctx.options.enable_blas != False): + ctx.check_cfg(package = 'blas', + args = '--cflags --libs', + uselib_store='BLAS', mandatory = ctx.options.enable_blas) + if 'LIB_BLAS' in ctx.env: + blas_header = None + if ctx.env['LIBPATH_BLAS']: + if 'atlas' in ctx.env['LIBPATH_BLAS'][0]: + blas_header = 'atlas/cblas.h' + elif 'openblas' in ctx.env['LIBPATH_BLAS'][0]: + blas_header = 'openblas/cblas.h' + else: + blas_header = 'cblas.h' + ctx.check(header_name = blas_header, mandatory = + ctx.options.enable_atlas) # use memcpy hacks if (ctx.options.enable_memcpy == True): @@ -304,6 +465,7 @@ def configure(ctx): # the following defines will be passed as arguments to the compiler # instead of being written to src/config.h + ctx.define('HAVE_CONFIG_H', 1) # add some defines used in examples ctx.define('AUBIO_PREFIX', ctx.env['PREFIX']) @@ -326,23 +488,47 @@ def configure(ctx): except ctx.errors.ConfigurationError: ctx.to_log('doxygen was not found (ignoring)') + # check if sphinx-build is installed, optional + try: + ctx.find_program('sphinx-build', var='SPHINX') + except ctx.errors.ConfigurationError: + ctx.to_log('sphinx-build was not found (ignoring)') + def build(bld): bld.env['VERSION'] = VERSION bld.env['LIB_VERSION'] = LIB_VERSION - # add sub directories + # main source bld.recurse('src') - if bld.env['DEST_OS'] not in ['ios', 'iosimulator', 'android']: - bld.recurse('examples') - bld.recurse('tests') + # add sub directories + if bld.env['DEST_OS'] not in ['ios', 'iosimulator', 'android']: + if bld.env['DEST_OS']=='emscripten' and not bld.options.testcmd: + bld.options.testcmd = 'node %s' + if bld.options.enable_examples: + bld.recurse('examples') + if bld.options.enable_tests: + bld.recurse('tests') + + # pkg-config template bld( 
source = 'aubio.pc.in' ) + # documentation + txt2man(bld) + doxygen(bld) + sphinx(bld) + + from waflib.Tools import waf_unit_test + bld.add_post_fun(waf_unit_test.summary) + bld.add_post_fun(waf_unit_test.set_exit_code) + +def txt2man(bld): # build manpages from txt files using txt2man if bld.env['TXT2MAN']: from waflib import TaskGen if 'MANDIR' not in bld.env: - bld.env['MANDIR'] = bld.env['PREFIX'] + '/share/man' + bld.env['MANDIR'] = bld.env['DATAROOTDIR'] + '/man' + bld.env.VERSION = VERSION rule_str = '${TXT2MAN} -t `basename ${TGT} | cut -f 1 -d . | tr a-z A-Z`' rule_str += ' -r ${PACKAGE}\\ ${VERSION} -P ${PACKAGE}' rule_str += ' -v ${PACKAGE}\\ User\\\'s\\ manual' @@ -357,34 +543,100 @@ def build(bld): ) bld( source = bld.path.ant_glob('doc/*.txt') ) +def doxygen(bld): # build documentation from source files using doxygen if bld.env['DOXYGEN']: - bld( name = 'doxygen', rule = 'doxygen ${SRC} > /dev/null', - source = 'doc/web.cfg', - cwd = 'doc') - bld.install_files( '${PREFIX}' + '/share/doc/libaubio-doc', - bld.path.ant_glob('doc/web/html/**'), - cwd = bld.path.find_dir ('doc/web'), - relative_trick = True) + bld.env.VERSION = VERSION + rule = '( cat ${SRC[0]} && echo PROJECT_NUMBER=${VERSION}' + rule += ' && echo OUTPUT_DIRECTORY=%s && echo HTML_OUTPUT=%s )' + rule += ' | doxygen - > /dev/null' + rule %= (os.path.abspath(out), 'api') + bld( name = 'doxygen', rule = rule, + source = ['doc/web.cfg'] + + bld.path.find_dir('src').ant_glob('**/*.h'), + target = bld.path.find_or_declare('api/index.html'), + cwd = bld.path.find_dir('doc')) + # evaluate nodes lazily to prevent build directory traversal warnings + bld.install_files('${DATAROOTDIR}/doc/libaubio-doc/api', + bld.path.find_or_declare('api').ant_glob('**/*', + generator=True), cwd=bld.path.find_or_declare('api'), + relative_trick=True) + +def sphinx(bld): + # build documentation from source files using sphinx-build + try: + import aubio + has_aubio = True + except ImportError: + from waflib import 
Logs + Logs.pprint('YELLOW', "Sphinx manual: install aubio first") + has_aubio = False + if bld.env['SPHINX'] and has_aubio: + bld.env.VERSION = VERSION + rule = '${SPHINX} -b html -D release=${VERSION}' \ + ' -D version=${VERSION} -W -a -q' \ + ' -d %s ' % os.path.join(os.path.abspath(out), 'doctrees') + rule += ' . %s' % os.path.join(os.path.abspath(out), 'manual') + bld( name = 'sphinx', rule = rule, + cwd = bld.path.find_dir('doc'), + source = bld.path.find_dir('doc').ant_glob('*.rst'), + target = bld.path.find_or_declare('manual/index.html')) + # evaluate nodes lazily to prevent build directory traversal warnings + bld.install_files('${DATAROOTDIR}/doc/libaubio-doc/manual', + bld.path.find_or_declare('manual').ant_glob('**/*', + generator=True), cwd=bld.path.find_or_declare('manual'), + relative_trick=True) + +# register the previous rules as build rules +from waflib.Build import BuildContext + +class build_txt2man(BuildContext): + cmd = 'txt2man' + fun = 'txt2man' + +class build_manpages(BuildContext): + cmd = 'manpages' + fun = 'txt2man' + +class build_sphinx(BuildContext): + cmd = 'sphinx' + fun = 'sphinx' + +class build_doxygen(BuildContext): + cmd = 'doxygen' + fun = 'doxygen' def shutdown(bld): from waflib import Logs if bld.options.target_platform in ['ios', 'iosimulator']: - msg ='building for %s, contact the author for a commercial license' % bld.options.target_platform + msg ='building for %s, contact the author for a commercial license' \ + % bld.options.target_platform Logs.pprint('RED', msg) msg =' Paul Brossier <piem@aubio.org>' Logs.pprint('RED', msg) def dist(ctx): - ctx.excl = ' **/.waf-1* **/*~ **/*.pyc **/*.swp **/.lock-w* **/.git*' + ctx.excl = ' **/.waf*' + ctx.excl += ' **/.git*' + ctx.excl += ' **/*~ **/*.pyc **/*.swp **/*.swo **/*.swn **/.lock-w*' ctx.excl += ' **/build/*' + ctx.excl += ' doc/_build' + ctx.excl += ' python/demos_*' ctx.excl += ' **/python/gen **/python/build **/python/dist' ctx.excl += ' **/python/ext/config.h' + 
ctx.excl += ' **/python/lib/aubio/_aubio.so' + ctx.excl += ' **.egg-info' + ctx.excl += ' **/.eggs' + ctx.excl += ' **/.pytest_cache' + ctx.excl += ' **/.cache' ctx.excl += ' **/**.zip **/**.tar.bz2' + ctx.excl += ' **.tar.bz2' ctx.excl += ' **/doc/full/* **/doc/web/*' + ctx.excl += ' **/doc/full.cfg' ctx.excl += ' **/python/*.db' ctx.excl += ' **/python.old/*' ctx.excl += ' **/python/*/*.old' + ctx.excl += ' **/python/lib/aubio/*.so' ctx.excl += ' **/python/tests/sounds' ctx.excl += ' **/**.asc' ctx.excl += ' **/dist*' @@ -392,3 +644,6 @@ def dist(ctx): ctx.excl += ' **/.travis.yml' ctx.excl += ' **/.landscape.yml' ctx.excl += ' **/.appveyor.yml' + ctx.excl += ' **/.circleci/*' + ctx.excl += ' **/azure-pipelines.yml' + ctx.excl += ' **/.coverage*' |
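The `waflib/processor.py` helper added in this commit talks to the parent waf process by exchanging base64-encoded pickles over stdin/stdout: the parent writes one line containing `[cmd, kwargs, cargs]`, and the child replies with one line containing `[ret, out, err, ex, trace]`. A minimal sketch of that round-trip encoding (the helper names here are illustrative, not part of waf's API; the sketch uses `pickle` where processor.py falls back between `cPickle` and `pickle`):

```python
import base64
import pickle
import subprocess

def encode_request(cmd, kwargs, cargs):
    # Serialize [cmd, kwargs, cargs] the way processor.py reads it
    # from a single stdin line before calling subprocess.Popen.
    return base64.b64encode(pickle.dumps([cmd, kwargs, cargs]))

def decode_reply(line):
    # Decode the [ret, out, err, ex, trace] list that processor.py
    # base64-encodes and writes back on a single stdout line.
    return pickle.loads(base64.b64decode(line.strip()))

# Round-trip check: what the parent encodes is what the child decodes.
req = encode_request(['echo', 'ok'], {'stdout': subprocess.PIPE}, {})
cmd, kwargs, cargs = pickle.loads(base64.b64decode(req))
```

Keeping each message on one line is what lets the child loop on `sys.stdin.readline()` and exit cleanly when the parent closes the pipe.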