September 10, 2012
Am Sun, 09 Sep 2012 12:55:19 -0700
schrieb Brad Roberts <braddr@puremagic.com>:

> On 9/9/2012 1:15 AM, Johannes Pfau wrote:
> > Am Sat, 08 Sep 2012 16:25:49 +0100
> > schrieb Russel Winder <russel@winder.org.uk>:
> > 
> >> On Sat, 2012-09-08 at 07:20 -0700, Ellery Newcomer wrote: […]
> >>> Okay, here: https://bitbucket.org/ariovistus/deimos-elfutils/overview
> >>>
> >>> I have some code with a working makefile and a nonworking SConstruct file.
> >>>
> >>> I believe the issue is the header files have pragma(lib, X) in them, and a single call to dmd links the appropriate lib in, but scons' link step loses that information.
> >>>
> >>> Do you have any intention of supporting pragma(lib) in scons?
> >>
> >> If that is valid Dv2 and SCons doesn't deal with it then the SCons D tools are broken and need fixing.
> >>
> >> Is there a tiny project replicating this that I can turn into a unit/system test? The red so caused will necessitate action :-)
> >>
> > 
> > Please note that pragma(lib) is an evil feature. For example it will
> > never work in gdc.
> > 
> 
> It's not impossible, and "never" is rather defeatist.  Using the frontend as is and grabbing the json output, part of which includes the pragmas, would be easy.  Then invoking gdc with the appropriate options to get the library linked in.  rdmd is a good example of this sort of process.
> 
> 

Is there a special flag to enable pragmas for the json output? It does not work with gdc right now, but it should be possible to make it work.

Sorry, I should have said 'It'll _probably_ never be supported in gdc'. There are some possible solutions but:

* It must be good enough to get approved when gdc is merged into gcc.
  (remember it must be portable and gpl and you can't use
  stdout/stdin...)
* Someone would have to implement the solution. I guess Iain had his
  reasons not to implement it so somebody else would have to do that.

Of course you can always try to make it work with external build tools. But a solution _in_ gdc seems not very likely.
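For what it's worth, here is the kind of external-tool hack I have in mind. I don't know whether dmd's json output actually carries the pragma information (hence my question above), so this sketch just scans the source text for pragma(lib, "...") and prints GNU-style -l flags for a build script to pass on to gdc. The regex and the -l mapping are only assumptions; library names that aren't plain sonames would need more care.

import std.file : readText;
import std.regex : ctRegex, matchAll;
import std.stdio : writefln;

// Collect pragma(lib, "name") from the D sources named on the command
// line and print one -l flag per library found.
void main(string[] args)
{
    auto pragmaLib = ctRegex!(`pragma\s*\(\s*lib\s*,\s*"([^"]+)"\s*\)`);
    foreach (path; args[1 .. $])
        foreach (m; readText(path).matchAll(pragmaLib))
            writefln("-l%s", m[1]);
}

A build script could then do something like gdc app.d $(./pragma2flags app.d) (assuming the sketch above was compiled as pragma2flags), which is roughly what an rdmd-style wrapper would automate.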


I don't want to badmouth the pragma(lib) feature; in some cases it's
nice to have (mainly when building simple script-like programs with few
source files). But for bigger projects, pragma(lib) makes things
difficult:

* it complicates incremental compilation
* build tools usually check whether a library is available before
  trying to link against it so they can print a nice warning;
  pragma(lib) in dmd subverts that check
* you can't specify a linker search path with pragma(lib)
* you can't choose between static and dynamic linking
* ...

The C/C++ architecture splits compilation and linking. Conflating those
concepts, as pragma(lib) does, might even be a good idea (other
languages have done it for some time now). But as long as we have to
deal with tools that were designed for C/C++ (linkers, gcc), we'll
always hit some issues with pragma(lib).

September 10, 2012
Am Mon, 10 Sep 2012 14:48:30 +0200
schrieb Johannes Pfau <nospam@example.com>:

> Sorry, I should have said 'It'll _probably_ never be supported in gdc'. There are some possible solutions but:
> 
> * It must be good enough to get approved when gdc is merged into gcc.
>   (remember it must be portable and gpl and you can't use
>   stdout/stdin...)
> * Someone would have to implement the solution. I guess Iain had his
>   reasons not to implement it so somebody else would have to do that.
> 
> Of course you can always try to make it work with external build tools. But a solution _in_ gdc seems not very likely.

For reference: Here's the gdc bug report for pragma(lib): http://d.puremagic.com/issues/show_bug.cgi?id=1690

Filed 2007, closed 2012 as RESOLVED/WONTFIX.
September 10, 2012
On 09/05/2012 07:10 PM, bearophile wrote:
>
> NumPy arrays <==> D arrays
>

I've been thinking about this one a bit more, and I am not sure it belongs in pyd.

First, the conversion is not symmetric. One can convert a numpy.ndarray to a D array like so:

PyObject* ndarray;
double[][] matrix = d_type!(double[][])(ndarray);

however, going back

PyObject* res = _py(matrix);

It is not at all clear that the user wants res to be a numpy.ndarray. The problem is partly that D arrays would be overloaded to a few too many things (list, str, array, any iterable, any buffer). That last one is a doozy. d_type never actually touches ndarray's type, so _py can hardly know what to use to convert matrix. (What if ndarray is actually a foo.BizBar matrix?)

I could just specialize _py for numpy.ndarrays, defaulting to lists of lists (which is what we do already), but I kinda want a specialized type for numpy.ndarrays.

Also, all these conversions imply data copying; is this reasonable for numpy arrays?

It is easy enough to get a void* and shape information out of the ndarray, but building a decent matrix type out of them is not trivial. Is there a good matrix library for D that would be suitable for this?
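To make that concrete, the bare minimum I could roll myself would be something like the following. This is purely a sketch, not anything in pyd; it assumes a C-contiguous (row-major) buffer and does nothing about ownership, strides or arithmetic.

// Hypothetical, not part of pyd: a copy-free 2-D view over the
// ndarray's raw buffer, given its data pointer and shape.
struct MatrixView(T)
{
    T* data;
    size_t rows, cols;

    // element access, assuming row-major layout
    ref T opIndex(size_t i, size_t j)
    {
        assert(i < rows && j < cols);
        return data[i * cols + j];
    }

    // one row as a D slice over the underlying buffer (still no copy)
    T[] row(size_t i)
    {
        assert(i < rows);
        return data[i * cols .. (i + 1) * cols];
    }
}

And that is exactly the part I'd rather get from an existing matrix library than maintain myself.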

Oh yeah, also: rectangular matrices. For static arrays, the conversion is a single memcpy. For dynamic arrays, it's lots of memcpys. I suppose I could abuse slicing a bit.
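For reference, the dynamic-array case looks roughly like this (a hypothetical helper, not pyd code; in pyd the copying happens inside d_type):

// Copying a C-contiguous buffer into double[][] costs one slice copy
// per row; a static double[cols][rows] would be a single memcpy.
double[][] toJagged(const(double)* src, size_t rows, size_t cols)
{
    auto dst = new double[][](rows, cols);
    foreach (i; 0 .. rows)
        dst[i][] = src[i * cols .. (i + 1) * cols];   // per-row copy
    return dst;
}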
September 10, 2012
Ellery Newcomer:

> I've been thinking about this one a bit more, and I am not sure it belongs in pyd.

I understand. The point of Pyd is to interface D and Python, while NumPy is something external. So if you find difficulties just keep it out. Adding it later is possible.

Bye,
bearophile
September 10, 2012
On 09/10/2012 12:11 PM, bearophile wrote:
>
> I understand. The point of Pyd is to interface D and Python, while NumPy
> is something external. So if you find difficulties just keep it out.
> Adding it later is possible.
>

Thing is, pyd will convert an ndarray to a D array already; it just won't do it as quickly as it could if it made use of the underlying C array, and

_py(d_type!(double[][])(ndarray))

will result in a list of lists.

So it's really a question of whether I should add more oddness to an already odd situation.


<OT> Bugger, I'm going to have to go through pyd and replace all usages of str with unicode. </OT>
September 11, 2012
On Mon, 2012-09-10 at 15:54 -0700, Ellery Newcomer wrote: […]
> <OT> Bugger, I'm going to have to go through pyd and replace all usages of str with unicode. </OT>

Python 2 and Python 3 are totally different in this regard. I don't have an obvious proposal to make to avoid having PyD for Python 2 and a different PyD for Python 3, but the six package might have some hints as it is intended to support creating Python codebases guaranteed to run under Python 2 and Python 3.

It is a pity the world doesn't just spontaneously switch to Python 3 so that Python 2 is just a "we used to use that" technology.

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder


September 11, 2012
On 09/10/2012 10:50 PM, Russel Winder wrote:
>
> Python 2 and Python 3 are totally different in this regard. I don't have
> an obvious proposal to make to avoid having PyD for Python 2 and a
> different PyD for Python 3, but the six package might have some hints as
> it is intended to support creating Python codebases guaranteed to run
> under Python 2 and Python 3.

Pyd doesn't really have a Python codebase; I was talking mostly about PyString_AsString -> PyUnicode_Whatever, since even for Python 2, unicode is much more appropriate for anything interfacing with D.

For getting pyd to support Python 3, it's mostly a matter of choosing the right C API functions, and anyway I have version identifiers I can rely on if there is divergence.
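Something like this is all I mean (a sketch only: the version identifier name Python_3 is made up for illustration, the PyObject and C API declarations are assumed to come from the bindings pyd already uses, and refcounting and error handling are omitted):

import core.stdc.string : strlen;

// Turn a Python unicode object into a D string on either major version.
string toDString(PyObject* u)
{
    // works on both 2.x and 3.x: encode to a UTF-8 bytes/str object
    PyObject* bytes = PyUnicode_AsUTF8String(u);
    const(char)* p;
    version (Python_3)
        p = PyBytes_AsString(bytes);    // 3.x: a bytes object
    else
        p = PyString_AsString(bytes);   // 2.x: the old str type
    return p[0 .. strlen(p)].idup;
}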

Wait, CeleriD is Python. I might need that six package after all. Thanks for the tip.
September 12, 2012
On 09/05/2012 07:10 PM, bearophile wrote:
> Ellery Newcomer:
>
>> Yep.
>
> Oh, good.
>
>
>> Have any suggestions for supported conversion out of the box?
>
> There are several important cases, like:
>
> Some D lazy ranges <==> Python lazy iterators/generators
>
> array.array <==> D arrays
>
> NumPy arrays <==> D arrays
>

Welp. I started on NumPy arrays <== D arrays, and it turned out to be pretty easy. It's in its own function; maybe I'll put it in pyd.extras or something. But now I have just about all of the above cases working.

Bearophile: would you be interested in contributing some code showcasing what we can do with numpy? Just, say, a D function operating on D arrays that does something that maybe numpy doesn't have built in.
