Meson has a common pattern of using 'if len(foo) == 0:' or
'if len(foo) != 0:'; however, this is a common anti-pattern in Python.
Instead, tests for emptiness/non-emptiness should be done with a simple
'if foo:' or 'if not foo:'.
Consider the following:
>>> import timeit
>>> timeit.timeit('if len([]) == 0: pass')
0.10730923599840025
>>> timeit.timeit('if not []: pass')
0.030033907998586074
>>> timeit.timeit("if len(['a', 'b', 'c', 'd']) == 0: pass")
0.1154778649979562
>>> timeit.timeit("if not ['a', 'b', 'c', 'd']: pass")
0.08259823200205574
>>> timeit.timeit('if len("") == 0: pass')
0.089759664999292
>>> timeit.timeit('if not "": pass')
0.02340641999762738
>>> timeit.timeit('if len("foo") == 0: pass')
0.08848102600313723
>>> timeit.timeit('if not "foo": pass')
0.04032287199879647
And for the one additional case of 'if len(foo.strip()) == 0:', which
can be replaced with 'if foo.isspace():' (note that the two only differ
for the empty string, which is falsy but not whitespace):
>>> timeit.timeit('if len(" ".strip()) == 0: pass')
0.15294511600222904
>>> timeit.timeit('if " ".isspace(): pass')
0.09413968399894657
>>> timeit.timeit('if len(" abc".strip()) == 0: pass')
0.2023209120015963
>>> timeit.timeit('if " abc".isspace(): pass')
0.09571301700270851
In other words, it's always a win to avoid len() when you don't
actually want to check the length.
Sometimes you want to link to a C++ library that exports a C API, which
means the linker must link in the C++ stdlib, and we must use a C++
compiler for linking. The same also applies to ObjC/ObjC++, etc., so we
can keep using clike_langs for the priority order.
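For illustration, a minimal sketch of the scenario (target and file
names here are hypothetical):

# A C++ library exposing a C API, linked into a plain C executable;
# Meson must use the C++ compiler for the link step so that the C++
# stdlib gets linked in.
capi_lib = static_library('capi_wrapper', 'wrapper.cpp')
executable('app', 'main.c', link_with : capi_lib)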
Closes https://github.com/mesonbuild/meson/issues/1653
This detects and allows passing a generated file as vs_module_def. It
also adds a test case that exercises using configure_file() to generate
the .def file.
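A hedged sketch of how this can be used (file names and configuration
are only illustrative):

# Generate the .def file and pass it straight to vs_module_def
conf = configuration_data()
def_file = configure_file(input : 'mylib.def.in',
  output : 'mylib.def',
  configuration : conf)
shared_library('mylib', 'mylib.c',
  vs_module_def : def_file)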
You can now pass a list of strings to the install_dir: kwarg to
build_target and custom_target.
Custom Targets:
===============
Allows you to specify the installation directory for each
corresponding output. For example:
custom_target('different-install-dirs',
  output : ['first.file', 'second.file'],
  ...
  install : true,
  install_dir : ['somedir', 'otherdir'])
This would install first.file to somedir and second.file to otherdir.
If only one install_dir is provided, all outputs are installed there
(same behaviour as before).
To only install some outputs, pass `false` for the outputs that you
don't want installed. For example:
custom_target('only-install-second',
  output : ['first.file', 'second.file'],
  ...
  install : true,
  install_dir : [false, 'otherdir'])
This would install second.file to otherdir and not install first.file.
Build Targets:
==============
With build_target() (which includes executable(), library(), etc),
usually there is only one primary output. However some types of
targets have multiple outputs.
For example, while generating Vala libraries, valac also generates
a header and a .vapi file both of which often need to be installed.
This allows you to specify installation directories for those too.
# This will only install the library (same as before)
shared_library('somevalalib', 'somesource.vala',
  ...
  install : true)

# This will install the library, the header, and the vapi into the
# respective directories
shared_library('somevalalib', 'somesource.vala',
  ...
  install : true,
  install_dir : ['libdir', 'incdir', 'vapidir'])

# This will install the library into the default libdir and
# everything else into the specified directories
shared_library('somevalalib', 'somesource.vala',
  ...
  install : true,
  install_dir : [true, 'incdir', 'vapidir'])

# This will NOT install the library, and will install everything
# else into the specified directories
shared_library('somevalalib', 'somesource.vala',
  ...
  install : true,
  install_dir : [false, 'incdir', 'vapidir'])
true/false can also be used for secondary outputs in the same way.
Valac can also generate a GIR file for libraries when the `vala_gir:`
keyword argument is passed to library(). In that case, `install_dir:`
must be given a list with four elements, one for each output.
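For example, a sketch of that case (the output order is assumed to be
library, header, vapi, gir, and all names here are illustrative):

shared_library('somevalalib', 'somesource.vala',
  ...
  vala_gir : 'Some-1.0.gir',
  install : true,
  install_dir : ['libdir', 'incdir', 'vapidir', 'girdir'])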
Includes tests for all these.
Closes https://github.com/mesonbuild/meson/issues/705
Closes https://github.com/mesonbuild/meson/issues/891
Closes https://github.com/mesonbuild/meson/issues/892
Closes https://github.com/mesonbuild/meson/issues/1178
Closes https://github.com/mesonbuild/meson/issues/1193
Previously, two functionally identical builds could produce different
build.ninja files. The ordering of the rules themselves doesn't affect
behaviour, but unnecessary changes in command-line arguments can cause
spurious rebuilds, and if the ordering of the overall file is stable,
then it's easy to use `diff` to compare different build.ninja files and
spot the ordering differences that are triggering the unnecessary
rebuilds.
Now, as long as you have a C compiler available in the project, it will
be used to compile assembly even if the target also uses a C++ compiler
and even if the target contains only assembly and C++ sources.
Previously, the order in which sources appeared in a target would decide
which compiler would be used.
However, if the project only provides a C++ compiler, that will be
used for compiling assembly sources.
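For example, in a (hypothetical) target like this, foo.S is now
compiled with the project's C compiler if one is available, even though
the target itself only has assembly and C++ sources:

executable('mixed', 'foo.S', 'bar.cpp')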
If this breaks your use-case, please tell us.
Includes a test that ensures that all of the above is adhered to.
Use an ordered dict for the compiler dictionary and sort it according
to a priority order: fortran, c, c++, etc.
This also ensures that builds are reproducible, because previously it
was a toss-up whether a C or a C++ compiler would be used, depending on
the order in which compilers.items() returned items.
Closes https://github.com/mesonbuild/meson/issues/1370
We were adding them to the CompilerArgs instance in the order in which
they were specified, which is wrong because later dependencies would
override earlier ones. Add them in the reverse order instead.
Closes https://github.com/mesonbuild/meson/issues/1495
This changes how generated files are added to the VS project.
Previously, they were all added as a single CustomBuildStep with all
generator commands, inputs and outputs merged together.
Now, each input file is added separately to the project and is given a
CustomBuild command. This adds all generator input files to the files
list in the VS GUI and allows running only some of the generator
commands when only some of the input files have changed.
We check for the existence of PDB files in the install script, so we
don't need to do all this mucking about here. That's more robust too
because we don't need to parse build arguments in buildtype=plain
and decide if the PDB file would be generated.
This means replacing @PLAINNAME@ and @BASENAME@ in the outputs. This is
the same feature as generator().
This is only allowed when there is only one input file, for obvious
reasons. Also adds a failing test for this.
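A rough sketch of the feature (the copy program and file names are
hypothetical, and plain 'cp' makes this Unix-only):

copier = find_program('cp')
custom_target('strip-in-suffix',
  input : 'foo.c.in',
  # @BASENAME@ expands to 'foo.c' here; only valid with a single input
  output : '@BASENAME@',
  command : [copier, '@INPUT@', '@OUTPUT@'])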
Factor it out into a function in mesonlib.py. This will allow us to
reuse it for generators and for configure_file(). The latter doesn't
implement this at all right now.
Also includes unit tests.
Without this, files() in the arguments gives an error because it's a
list of mesonlib.File objects:
  Array as argument 1 contains a non-string.
It also breaks in nested lists. Includes a test for this.
At the same time, also fix the order in which compile arguments are
added. Detailed comments have been added concerning the priority and
order of the arguments.
Also adds a unit test and an integration test for the same.
With the 'install_mode' kwarg, you can now specify the file and
directory permissions and the owner and the group to be used while
installing. You can pass either:
* A single string specifying just the permissions
* A list where:
  - The first element is a string of permissions
  - The second element is a string specifying the owner, or an int
    specifying the uid
  - The third element is a string specifying the group, or an int
    specifying the gid
Specifying `false` for any of these elements skips setting that one.
The format of the permissions string is the same as the symbolic
notation used by ls -l, with the first character (which specifies 'd',
'-', 'c', etc. for the file type) omitted, since that is always obvious
from the context.
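For example, a hedged sketch (assuming install_data() is one of the
functions that accepts the kwarg; the file and values are
illustrative):

install_data('foo.conf',
  install_dir : get_option('sysconfdir'),
  install_mode : ['rw-r--r--', 'root', 'root'])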
Includes unit tests for the same. Sadly these only run on Linux right
now, but we want them to run on all platforms. We do set the mode in the
integration tests for all platforms but we don't check if they were
actually set correctly.
Set the rules for the symlinking on the target itself, then reuse that
information while generating aliases during the build, and finally pass
it to the install script too.
If you call declare_dependency(link_with : 'string'), an exception is
supposed to be raised, but instead of a proper message you get an
exception about a missing attribute.
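To illustrate (names are hypothetical):

lib = static_library('mylib', 'mylib.c')
# wrong: passing a string now raises a proper error
# declare_dependency(link_with : 'mylib')
# right: pass the library object itself
dep = declare_dependency(link_with : lib)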
Cache the absolute dir that the script is searched in and the name of
the script. These are the only two things that change.
Update the test to test for both #1235 and the case when a script of the
same name is in a different directory (which also covers the subproject
case).
Closes #1235
This is much more accurate since this is actually what determines what
file naming to use and whether there will be PDB debugging information
or not.
Closes #1169
This greatly improves the logic for determining the linker. Previously,
we would completely break if a target contained only extracted objects
and we were using more than one compiler in our project.
This also fixes determination of the linker if our target only contains
generated objc++ sources, and other funky combinations.
This avoids us having no compilers at all for targets that are composed
entirely of objects with no sources.
Now we will always have a compiler for a target even if it is composed
entirely of objects generated with custom targets unless it has
completely unknown sources.
Everywhere we use this object, we end up iterating over it and comparing
compiler.get_language() with something. Using a dict is the obvious
choice and simplifies a lot of code.
The error message is misleading (talks about external dependencies), and
doesn't tell you what you need to do (use the output of
declare_dependency, dependency, or find_library). At the same time
rename add_external_deps to add_deps since it adds internal deps too.
Plus many more error message improvements all over the place.
Not only does extract_all_objects() now work properly again,
extract_objects() also works if you specify a subset of sources all of
which have been compiled into a single unified object.
So, for instance, this allows you to extract all the objects
corresponding to the C sources compiled into a target consisting of
C and C++ sources.
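For instance, a hedged sketch of that use case (all names are
illustrative):

lib = static_library('mixed', 'a.c', 'b.c', 'helper.cpp')
# pull out only the objects built from the C sources
c_objects = lib.extract_objects('a.c', 'b.c')
executable('consumer', 'main.c', objects : c_objects)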
This is the first step in making Vala support have feature-parity with
C/C++ support. Vala and Vapi sources generated with Generators and
CustomTargets are no longer ignored. Dependencies are set up properly
and they are added to the command line.
get_filename() made no sense for CustomTarget since it can have multiple
outputs. Also use get_outputs() for GeneratedList since it has the same
meaning and remove unused set_generated().
As a side-effect, we now install all the outputs of a CustomTarget.
With C/C++, on Windows you don't need to pass any arguments for a static
library to be PIC. On UNIX platforms you need to pass -fPIC.
Other languages such as D have compiler-specific PIC arguments required
for PIC support in static libraries on UNIX platforms.
This kwarg allows people to specify which static libraries should be
built with PIC support. This is usually used for static libraries that
will be linked into shared libraries.
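Assuming this refers to the 'pic:' keyword argument on
static_library(), a minimal sketch (names are illustrative):

# built as PIC so it can be linked into the shared library below
helper = static_library('helper', 'helper.c', pic : true)
shared_library('plugin', 'plugin.c', link_with : helper)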
This is definitely more correct since it takes into account the
cross-compilation status. We also now do the Java and CSharp sanity
checks on the BuildTarget level instead of in the Ninja backend.
When the BuildTarget (executable, shared-library, static-library, etc)
is created, process the source list and assign compilers to this target.
This allows us to do compiler-specific file-naming without resorting to
ugly hacks.
This is one step towards consolidating all the 'what language does this
target use' checks and the 'this target should only have $lang files as
sources' checks we have all over the codebase. All those checks should
be done only when the target is created.
Add support for passing a description to configuration data
setter methods via a 'description' kwarg. The description
string will be used when meson generates the entire configure
file without a template, autoconf-style.
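A minimal sketch of the kwarg (names and values are illustrative):

conf = configuration_data()
conf.set('ENABLE_FOO', 1,
  description : 'Whether the foo subsystem is enabled')
# no input template: the header is generated autoconf-style,
# with the descriptions used as comments
configure_file(output : 'config.h', configuration : conf)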
For commands that always output to stdout and don't have a "-o" or
"--output" or some other similar option, this 'capture' setting allows
the build to capture the result and place it in the output file.
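Assuming this refers to the 'capture:' kwarg on custom_target(), a
hedged sketch (the generator script is hypothetical):

gen = find_program('gen_header.py')
custom_target('gen-header',
  output : 'generated.h',
  command : [gen],
  # the command writes to stdout; capture places that into the output
  capture : true)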
At the same time, this also adds a bunch of tests that document and keep
track of how we expect quoting to pass through via Ninja to the
compiler.
We need at least Ninja 1.6.0 for this.
This fixes https://github.com/mesonbuild/meson/issues/489
The first file might be a header file, in which case this test will
fail, so check all the files till a match is found instead.
Also remove duplicate and incorrect can_compile check. It just checks
the suffix and we already check that above.
This allows us to output either the relative or absolute path as
requested. Fixes usage of configure_file inside CustomTarget commands
with the VS backends.
This commit contains several changes to the naming and versioning of
shared and static libraries. The details are documented at:
https://github.com/mesonbuild/meson/pull/417
Here's a brief summary:
* The results of binary and compiler detection via environment functions
are now cached so that they can be called repeatedly without
performance penalty. This is necessary because every
build.SharedLibrary object has to know whether the compiler is MSVC or
not (output filenames depend on that), and so the compiler detection
has to be called for each object instantiation.
* Linux shared libraries don't always have a library version. Sometimes
only soversions are specified (and vice-versa), so support both.
* Don't use versioned filenames when generating DLLs; DLLs are never
versioned using the suffix in the way that .so libraries are. Hence,
they don't use "aliases". Only Linux shared libraries use those.
* OS X dylibs do not use filename aliases at all. They only use the
soversion in the dylib name (libfoo.X.dylib), and that's it. If
there's no soversion specified, the dylib is called libfoo.dylib.
Further versioning in dylibs is supposed to be done with the
-current_version argument to clang, but this is TBD.
https://developer.apple.com/library/mac/documentation/DeveloperTools/Conceptual/DynamicLibraries/100-Articles/DynamicLibraryDesignGuidelines.html#//apple_ref/doc/uid/TP40002013-SW23
* Install DLLs into bindir and import libraries into libdir
* Static libraries are now always called libfoo.a, even with MSVC
* .lib import libraries are always generated when building with MSVC
* .dll.a import libraries are always generated when building with
MinGW/GCC or MinGW/clang
* TODO: Use dlltool if available to generate .dll.a when .lib is
generated and vice-versa.
* Library and executable prefixes/suffixes are now always correctly
overridden by the values of the 'name_prefix' and 'name_suffix' keyword
arguments.
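For instance, the prefix/suffix overrides look like this (a sketch;
names are illustrative):

# produces 'foo.bundle' instead of the platform default (e.g. libfoo.so)
shared_library('foo', 'foo.c',
  name_prefix : '',
  name_suffix : 'bundle')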
On MSVC, shared libraries only export symbols that have been
explicitly exported, either as part of the symbol prototype or via a
module definitions file. On compilers other than MSVC, all symbols are
exported in the shared library by default, and the format for the list
of symbols to export is different, so this is only used with the
Visual Studio compiler.
The module defs file path can either be relative to the current source directory
or an absolute path using meson.source_root() + '/some/path'
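For example (a sketch; the paths are only illustrative):

# relative to the current source directory
shared_library('alib', 'alib.c', vs_module_def : 'alib.def')
# or an absolute path
shared_library('blib', 'blib.c',
  vs_module_def : meson.source_root() + '/some/path/blib.def')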