August 06, 2017

On 06.08.2017 18:31, Johnson Jones wrote:
> (this is a bit long of a post so take your time(when you have time, I'm in no rush) and get some coffee ;)
> 
> 
> What about locals not seeming to show up? Not sure if my question got answered or not?

I think there is some misunderstanding about the interaction between the debugger, the semantic engine and the compiler generated JSON information.

The short version: there is no interaction at all.

The slightly longer version:

- the debugger is a component completely different from the editor. It operates on the debug info built into the executable. This represents the information from the last successful build, but this is hardly accessible to the editor. It can get obsolete and wrong if you start editing.

- the JSON information is similar to the debug information in that it represents the information from the last successful build, but more accessible to the editor. It gets obsolete and wrong if you start editing, too. In addition it doesn't include any information about local variables, so that you have to analyze the code with other means to make sense of any identifier.

- that's the job of the semantic analyzer engine. It updates whenever you change any of the source code. It figures out what type an identifier is by following the appropriate lookup rules, and it can also list its members. So matching it to information also available elsewhere does not really help, because that's not the difficult part.
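To make the lookup step concrete, here is a minimal, language-agnostic sketch in Python of the scope-chain resolution a semantic engine performs (all names hypothetical; the real engine also handles imports, templates, mixins, and overloads):

```python
# Hypothetical sketch of scoped identifier resolution, the core of what a
# semantic engine redoes on every edit. Illustration only.

class Scope:
    def __init__(self, parent=None):
        self.parent = parent   # enclosing scope, or None at module level
        self.symbols = {}      # identifier -> type name

    def declare(self, name, type_name):
        self.symbols[name] = type_name

    def resolve(self, name):
        # Walk outward through enclosing scopes, innermost first.
        scope = self
        while scope is not None:
            if name in scope.symbols:
                return scope.symbols[name]
            scope = scope.parent
        return None            # unresolved: no completion available

module_scope = Scope()
module_scope.declare("writeln", "function")

func_scope = Scope(parent=module_scope)
func_scope.declare("count", "int")    # a local, absent from any JSON output

print(func_scope.resolve("count"))    # found in the function scope: int
print(func_scope.resolve("writeln"))  # found in the module scope: function
print(func_scope.resolve("missing"))  # None
```

Because locals like `count` never appear in the compiler-generated JSON, this resolution step cannot simply be replaced by pregenerated data.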


> ------------------------------Part 2----------------------------
> I just tried downloading the source code and get >
> "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\\Common7\Tools\vsvars32.bat"
> 
> when building build.

I suspect Visual D doesn't build out of the box with VS2017. I still use VS2013 to avoid dependencies in the resulting binaries on newer versions.

See appveyor.xml for how to obtain external dependencies and how to override environment settings.

You might also want to check the logs of the AppVeyor builds with VS2013 and VS2015: https://ci.appveyor.com/project/rainers/visuald/build/1.0.138
August 06, 2017
On Sunday, 6 August 2017 at 19:27:26 UTC, Rainer Schuetze wrote:
>
>
> On 06.08.2017 18:31, Johnson Jones wrote:
>> (this is a bit long of a post so take your time(when you have time, I'm in no rush) and get some coffee ;)
>> 
>> 
>> What about locals not seeming to show up? Not sure if my question got answered or not?
>
> I think there is some misunderstanding about the interaction between the debugger, the semantic engine and the compiler generated JSON information.
>
> The short version: there is no interaction at all.
>
> The slightly longer version:
>
> - the debugger is a component completely different from the editor. It operates on the debug info built into the executable. This represents the information from the last successful build, but this is hardly accessible to the editor. It can get obsolete and wrong if you start editing.
>
> - the JSON information is similar to the debug information in that it represents the information from the last successful build, but more accessible to the editor. It gets obsolete and wrong if you start editing, too. In addition it doesn't include any information about local variables, so that you have to analyze the code with other means to make sense of any identifier.
>
> - that's the job of the semantic analyzer engine. It updates whenever you change any of the source code. It figures out what type an identifier is by following the appropriate lookup rules, and it can also list its members. So matching it to information also available elsewhere does not really help, because that's not the difficult part.
>
>

Yeah, but these locals are variables that haven't changed. They're part of a library that never changes, so the information should always be up to date and everything should be consistent with respect to those elements.

If it's the semantic analyzer's job to report a local's type and all the subtype info, then it's not doing its job. It should provide coverage such that information about things that haven't been modified since the last build (like "external" libraries) is OK to report, because it's safe to assume it's correct.

Editing files is an extremely local thing and rarely breaks the semantics of much of the program (it can in extreme cases, but that's rare and becomes less likely the larger the program).

If it's still a problem, how about eventually adding an option where we can specify certain JSON paths as being semantically correct, and the semantic engine uses that data as if it were, regardless? It is up to the programmer to make it so.

Hence, for something like Phobos or GTK or other libraries we won't be modifying, generate the JSON, stick it in that dir... and the semantic engine uses it regardless of correctness. It matches the locals' types up with it, and if they match, it presents that info.

What's the worst that could happen? We get some invalid elements in IntelliSense? I'm OK with that... as long as they are only wrong if I actually modified the libraries and produced an inconsistent state, which I won't do since I don't modify Phobos.
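The member-listing half of this proposal could be sketched roughly like this (a sketch only; the JSON shape below is a simplified stand-in for dmd's actual `-X` output, and `gtk.window`/`setTitle` are made-up names):

```python
import json

# Hypothetical, simplified shape of the JSON dmd emits with -X; the real
# output differs in detail, and the module/member names here are made up.
library_json = json.loads("""
[
  {"name": "gtk.window",
   "members": [
     {"name": "Window", "kind": "class",
      "members": [{"name": "setTitle", "kind": "function"},
                  {"name": "show",     "kind": "function"}]}
   ]}
]
""")

def members_of(type_name, modules):
    """List members of a type from trusted, pregenerated JSON.

    This is the cheap half: once the editor knows a local's type,
    member listing is a plain lookup in data that never changes."""
    for mod in modules:
        for decl in mod.get("members", []):
            if decl.get("name") == type_name:
                return [m["name"] for m in decl.get("members", [])]
    return []

print(members_of("Window", library_json))   # ['setTitle', 'show']
```

The hard part, as noted above, is the step this sketch takes for granted: knowing that a given local is a `Window` in the first place.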



>> ------------------------------Part 2----------------------------
>> I just tried downloading the source code and get >
>> "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\\Common7\Tools\vsvars32.bat"
>> 
>> when building build.
>
> I suspect Visual D doesn't build out of the box with VS2017. I still use VS2013 to avoid dependencies in the resulting binaries on newer versions.
>
> See the appveyor.xml how to obtain external dependencies and how to override environment settings.
>
> You might also want it to check the logs on Appveyor that builds with VS2013 and VS2015: https://ci.appveyor.com/project/rainers/visuald/build/1.0.138

ok, I'll see what I can do with it.


August 07, 2017

On 06.08.2017 22:18, Johnson Jones wrote:
> On Sunday, 6 August 2017 at 19:27:26 UTC, Rainer Schuetze wrote:
>>
>>
>> On 06.08.2017 18:31, Johnson Jones wrote:
>>> (this is a bit long of a post so take your time(when you have time, I'm in no rush) and get some coffee ;)
>>>
>>>
>>> What about locals not seeming to show up? Not sure if my question got answered or not?
>>
>> I think there is some misunderstanding about the interaction between the debugger, the semantic engine and the compiler generated JSON information.
>>
>> The short version: there is no interaction at all.
>>
>> The slightly longer version:
>>
>> - the debugger is a component completely different from the editor. It operates on the debug info built into the executable. This represents the information from the last successful build, but this is hardly accessible to the editor. It can get obsolete and wrong if you start editing.
>>
>> - the JSON information is similar to the debug information in that it represents the information from the last successful build, but more accessible to the editor. It gets obsolete and wrong if you start editing, too. In addition it doesn't include any information about local variables, so that you have to analyze the code with other means to make sense of any identifier.
>>
>> - that's the job of the semantic analyzer engine. It updates whenever you change any of the source code. It figures out what type an identifier is by following the appropriate lookup rules, and it can also list its members. So matching it to information also available elsewhere does not really help, because that's not the difficult part.
>>
>>
> 
> Yeah, but these locals are variables that haven't changed. It's part of a library that never changes so the information should always be up to date and everything should be consistent with respect to those elements.
> 
> If it's the semantic analyzer's job to report a local's type and all the subtype info, then it's not doing its job. It should provide coverage such that information about things that haven't been modified since the last build (like "external" libraries) is OK to report, because it's safe to assume it's correct.
> 
> Editing files is an extremely local thing and rarely breaks the semantics of much of the program(it can in extreme cases but usually very rare and less likely the larger the program).
> 
> If it's still a problem, how about eventually adding an option where we can specify certain json paths as being semantically correct and the semantic engine uses that data as if it were, regardless. It is up to the programmer to make it so.
> 
> Hence, for something like phobos or gtk or other libraries we won't be modifying, generate the json, stick it in that dir... and the semantic engine uses it regardless of correctness. It matches the locals types up with it and if they match it presents that info.
> 
> What's the worst that could happen? We get some invalid elements in IntelliSense? I'm OK with that... as long as they are only wrong if I actually modified the libraries and produced an inconsistent state, which I won't do since I don't modify Phobos.

Yeah, getting some additional information from JSON files could work, but it's not so easy to make the connection. The JSON file does not list any locals, so you still have to make sense of an identifier. If you find its type (not available in JSON), listing its members is not a big deal.

I think time is better invested in improving the semantic engine, though in the long run the dmd compiler is supposed to be usable as a library (but getting it to a point where it can be integrated with an IDE is still a long way off, IMO).

So, if you can provide full source files instead of single-line snippets of things that don't work, it will likely be a lot easier to reproduce and fix issues in the semantic engine. Also, you might want to file them as reports at https://issues.dlang.org/ for component visuald so they don't get lost.
August 07, 2017
On Monday, 7 August 2017 at 18:06:37 UTC, Rainer Schuetze wrote:
>
>
> On 06.08.2017 22:18, Johnson Jones wrote:
>> On Sunday, 6 August 2017 at 19:27:26 UTC, Rainer Schuetze wrote:
>>>
>>>
>>> On 06.08.2017 18:31, Johnson Jones wrote:
>>>> (this is a bit long of a post so take your time(when you have time, I'm in no rush) and get some coffee ;)
>>>>
>>>>
>>>> What about locals not seeming to show up? Not sure if my question got answered or not?
>>>
>>> I think there is some misunderstanding about the interaction between the debugger, the semantic engine and the compiler generated JSON information.
>>>
>>> The short version: there is no interaction at all.
>>>
>>> The slightly longer version:
>>>
>>> - the debugger is a component completely different from the editor. It operates on the debug info built into the executable. This represents the information from the last successful build, but this is hardly accessible to the editor. It can get obsolete and wrong if you start editing.
>>>
>>> - the JSON information is similar to the debug information in that it represents the information from the last successful build, but more accessible to the editor. It gets obsolete and wrong if you start editing, too. In addition it doesn't include any information about local variables, so that you have to analyze the code with other means to make sense of any identifier.
>>>
>>> - that's the job of the semantic analyzer engine. It updates whenever you change any of the source code. It figures out what type an identifier is by following the appropriate lookup rules, and it can also list its members. So matching it to information also available elsewhere does not really help, because that's not the difficult part.
>>>
>>>
>> 
>> Yeah, but these locals are variables that haven't changed. It's part of a library that never changes so the information should always be up to date and everything should be consistent with respect to those elements.
>> 
>> If it's the semantic analyzer's job to report a local's type and all the subtype info, then it's not doing its job. It should provide coverage such that information about things that haven't been modified since the last build (like "external" libraries) is OK to report, because it's safe to assume it's correct.
>> 
>> Editing files is an extremely local thing and rarely breaks the semantics of much of the program(it can in extreme cases but usually very rare and less likely the larger the program).
>> 
>> If it's still a problem, how about eventually adding an option where we can specify certain json paths as being semantically correct and the semantic engine uses that data as if it were, regardless. It is up to the programmer to make it so.
>> 
>> Hence, for something like phobos or gtk or other libraries we won't be modifying, generate the json, stick it in that dir... and the semantic engine uses it regardless of correctness. It matches the locals types up with it and if they match it presents that info.
>> 
>> What's the worst that could happen? We get some invalid elements in IntelliSense? I'm OK with that... as long as they are only wrong if I actually modified the libraries and produced an inconsistent state, which I won't do since I don't modify Phobos.
>
> Yeah, getting some additional information from JSON files could work, but it's not so easy to make the connection. The JSON file does not list any locals, so you still have to make sense of an identifier. If you find its type (not available in JSON), listing its members is not a big deal.
>
> I think time is better invested at improving the semantic engine, though in the long run, the dmd compiler is supposed to be usable as a library (but getting it to a point where it can be integrated with an IDE is way further ahead IMO).
>
> So, if you can provide full source files instead of single line snippets of things that don't work, it will likely be a lot easier to reproduce and fix the semantic engine. Also, you might want to add them as reports to https://issues.dlang.org/ for component visuald so they don't get lost.

Could Dscanner not be used?

https://github.com/dlang-community/D-Scanner

The "--ast" or "--xml" options will dump the complete abstract syntax tree of the given source file to standard output in XML format.

Simply match the source line number up with the AST and extract the type. This type is then used as a lookup in the JSON (it should be there somewhere; if not, update dmd to add type information so cross-referencing can be used).
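As a sketch, that matching step might look like this (the XML below is a hypothetical, simplified stand-in for D-Scanner's real `--ast` schema, which is considerably richer):

```python
import xml.etree.ElementTree as ET

# Sketch of matching a source line to a declared type via an AST dump.
# Hypothetical, simplified XML; the real D-Scanner schema differs.
ast_xml = """
<module>
  <variableDeclaration line="12">
    <type>Window</type><name>win</name>
  </variableDeclaration>
  <variableDeclaration line="14">
    <type>int</type><name>count</name>
  </variableDeclaration>
</module>
"""

def type_at_line(xml_text, line):
    """Return the declared type of the variable declared at `line`."""
    root = ET.fromstring(xml_text)
    for decl in root.iter("variableDeclaration"):
        if int(decl.get("line")) == line:
            return decl.findtext("type").strip()
    return None

# The resulting type name would then be the key for a lookup in the
# compiler-generated JSON to list its members.
print(type_at_line(ast_xml, 12))   # Window
```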

IMO, DMD should be able to generate basically every cross-relation that one needs to do proper debugging. When it evaluates a mixin, it can generate whatever info is required and even allow debugging mixins, since dmd effectively invokes itself again to generate the mixin output; insert a debugger in between.

It seems that the main problem is that dmd itself is not very well designed to support these types of features. Maybe that is the place to start?

Basically, every line in the source code should correspond to a line in the binary (although it's not a one-to-one function, and not even a function at all, one can still encode such a mapping to find out what is where), and vice versa. All type info, mixin expansions, etc. should be easily understood. All this has to work, else dmd couldn't compile them.

The only problem seems to be dmd extracting and presenting the information in such a way that Visual D can understand it?


In a sense, I don't see why any semantic analysis has to be done. Everything is already done by DMD, and it will do it perfectly (because it has to).

It would probably be better to add the proper modifications to DMD itself, so that everything stays in sync with changes to DMD or the grammar. This prevents having to update a separate utility every time dmd changes something.

I don't know if dmd as a library will really add anything new unless it already has the functionality to do what is required... but if that's the case, it should also be able to output all the required info to a file that Visual D could use, so it's just a performance difference.

As far as source code is concerned, it happens on any project, so any project should exhibit such problems. Of course, mixin issues will only happen when mixins are used.

I think any typical use of Visual D tends to demonstrate some issue that needs to be resolved. These are not issues that rarely pop up; they're pretty much an everyday thing. IMO, there is something fundamentally broken with Visual D or DMD as regards debugging capabilities. I know some of it is with dmd, and the powers that be don't care because they don't use modern debugging techniques... so it doesn't affect them. They are probably experienced enough programmers, and don't write any real apps in D either (excluding utilities or libraries that don't really use a wide variety of things), to have these types of issues... and when they do, they either know how to fix them, use their old-school debugging methods, or just work through them... none of which is acceptable for me or the average user of D. (And I seriously doubt they have even used Visual D, much less for any serious project.)

As a case in point: mixin debugging. This is necessary. It is no different from normal debugging. Where would modern programming be without debuggers? Well, that is where we are at with mixins. We have no real way to debug them. I use them all the time and have to be really careful with what I'm doing and use stupid tricks to debug them. Mainly string mixins, which is far more time consuming than getting an error at some line number. I've learned now that I should write them as normal runtime functions first and, once they work, make them CTFE.

But mixin debugging should be easy. After all, it's just a D program inside a D program. dmd compiles it internally, but if dmd had an "internal" debugger, then we could debug it and get better results.

In a sense, it's just backtracing anyway.

dmd -> mixin -> dmd -> mixin output
   <=                               <=

If the internal dmd kept track of what's going on, it could map the output line numbers to the input, and we could debug the output (which is effectively inserted into the main source code directly, in a "temp" file; dmd may obscure that process, but that is effectively what's going on)... Visual D could link errors to the original mixin through the line mapping and even open the generated output in a view for us to see what the mixin produced.
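The line mapping being proposed could be sketched like this (all names hypothetical; conceptually it is similar to the `#line` directives C compilers emit for preprocessed output):

```python
# Sketch of the proposed back-mapping from expanded mixin output to the
# originating source line. Illustration only; dmd keeps no such map today.

def expand_mixin(source_file, mixin_line, generated_code):
    """Record, for every line of generated code, which source
    location it came from."""
    line_map = {}
    for offset, _text in enumerate(generated_code.splitlines()):
        # Line N of the "temp" expansion maps back to the mixin statement.
        line_map[offset + 1] = (source_file, mixin_line)
    return line_map

# Suppose a string mixin at app.d:42 expands to two lines of code.
mapping = expand_mixin("app.d", 42, "int x = 1;\nint y = 2;")

# An error the compiler reports at generated line 2 can now be shown to
# the user as originating at app.d:42, and the generated text displayed.
print(mapping[2])
```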


Of course, all this is not so simple, but it's not so hard... just work (which I guess is the hard part).

My guess is that because dmd was designed a long time ago, it didn't take into account what might be, and hence we have what we have... which is great on one hand and sorry on the other.

I would think that one of the most important things the D Foundation would be working on, besides bugs in DMD and language design issues, is a proper IDE and debugger... but that doesn't seem to be the case.

Anyway, I've ranted long enough... sorry...

August 10, 2017

On 07.08.2017 21:56, Johnson wrote:
>> So, if you can provide full source files instead of single line snippets of things that don't work, it will likely be a lot easier to reproduce and fix the semantic engine. Also, you might want to add them as reports to https://issues.dlang.org/ for component visuald so they don't get lost.
> 
> Could Dscanner not be used?
> 
> https://github.com/dlang-community/D-Scanner
> 
> The "--ast" or "--xml" options will dump the complete abstract syntax tree of the given source file to standard output in XML format.

Dscanner is more of a lint-like tool, not usually used for IntelliSense. That's rather done by DCD (https://github.com/dlang-community/DCD).

Last time I checked, DCD was well behind D_Parser (a fork of https://github.com/aBothe/D_Parser) used by Visual D: no mixins, no CTFE, no UFCS, and limited template support.


> 
> Simply match the source line number up with the ast, extract the type. This type then is used as a look up in the JSON(it should be there somewhere, if not, update dmd to add type information so cross referencing can be used).
> 
> IMO, DMD should be able to generate basically every cross-relation that one needs to do proper debugging. When it evaluates a mixin it can generate whatever info is required and even allow debugging mixins since dmd invokes dmd again to generate the mixin output, insert a debugger in between.
> 
> It seems that the main problem is that dmd itself is not very well designed to support these types of features. Maybe that is the place to start?
> 
> Basically, every line in the source code should correspond to a line in the binary (although it's not a one-to-one function, and not even a function at all, one can still encode such a mapping to find out what is where), and vice versa. All type info, mixin expansions, etc. should be easily understood. All this has to work, else dmd couldn't compile them.
> 
> The only problem seems to be dmd extracting and presenting the information in such a way that Visual D can understand it?
> 
> 
> In a sense, I don't see why any semantic analysis has to be done. Everything is already done by DMD and it will do it perfectly(because it has to).
> 
> It would probably be better to add the proper modifications to DMD so that it can self-regulate with changes in DMD or the grammar.  This prevents having to update any separate utility every time dmd changes something.
> 
> I don't know if dmd as a library will really add any thing new unless it already has the functionality to do what is required... but if that's the case it should also be able, then to output all the required info to a file that Visual D could use, so it's just a performance difference.

Again, precompiled information doesn't help while you edit code. Long term, the DMD frontend can be used, but one major problem with using it right now is that it bails out if it cannot parse the code, so it won't even try to give information about identifiers at the current edit location. There is work being done on that, but currently it's directed more towards tools like D-Scanner.

> 
> As far as source code is concerned, it happens on any project, so any project should exhibit such problems. Of course, mixin issues will only happen when mixins are used.
> 
> I think any typical use of Visual D tends to demonstrate some issue that needs to be resolved. These are not issues that rarely pop up; they're pretty much an everyday thing. IMO, there is something fundamentally broken with Visual D or DMD as regards debugging capabilities. I know some of it is with dmd, and the powers that be don't care because they don't use modern debugging techniques... so it doesn't affect them. They are probably experienced enough programmers, and don't write any real apps in D either (excluding utilities or libraries that don't really use a wide variety of things), to have these types of issues... and when they do, they either know how to fix them, use their old-school debugging methods, or just work through them... none of which is acceptable for me or the average user of D. (And I seriously doubt they have even used Visual D, much less for any serious project.)
> 
> As a case in point: mixin debugging. This is necessary. It is no different from normal debugging. Where would modern programming be without debuggers? Well, that is where we are at with mixins. We have no real way to debug them. I use them all the time and have to be really careful with what I'm doing and use stupid tricks to debug them. Mainly string mixins, which is far more time consuming than getting an error at some line number. I've learned now that I should write them as normal runtime functions first and, once they work, make them CTFE.
> 
> But mixin debugging should be easy. After all, it's just a d program inside a d program. dmd compiles it internally, but if dmd had an "internal" debugger then we could debug it and get better results.
> 
> In a sense, it's just backtracing anyway.
> 
> dmd -> mixin -> dmd -> mixin output
>     <=                               <=
> 
> If the internal dmd kept track of what's going on, it could map the output line numbers to the input, and we could debug the output (which is effectively inserted into the main source code directly, in a "temp" file; dmd may obscure that process, but that is effectively what's going on)... Visual D could link errors to the original mixin through the line mapping and even open the generated output in a view for us to see what the mixin produced.
> 
> 
> Of course, all this is not so simple but it's not so hard... just work(which I guess is the hard part).

There have been several proposals to help debugging mixins, here's one of them: https://issues.dlang.org/show_bug.cgi?id=5051

Visual D even highlights *.mixin files as D code...


> 
> My guess is that because dmd was designed a long time ago, it didn't take into account what might be, and hence we have what we have... which is great on one hand and sorry on the other.
> 
> I would think that one of the most important things that the D foundation would be working on besides bugs in DMD and language design issues is a proper IDE and debugger... but that doesn't seem to be the case.
> 
> Anyways, I've ranted long enough... sorry...
August 10, 2017

On 06.08.2017 21:27, Rainer Schuetze wrote:
> 
>> ------------------------------Part 2----------------------------
>> I just tried downloading the source code and get >
>> "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\\Common7\Tools\vsvars32.bat"
>>
>> when building build.
> 
> I suspect Visual D doesn't build out of the box with VS2017. I still use VS2013 to avoid dependencies in the resulting binaries on newer versions.
> 

I have updated the build scripts and the C to D conversion to also work with VS2017 and Windows SDK 10.0.15063.0.
August 10, 2017
On Thursday, 10 August 2017 at 07:05:26 UTC, Rainer Schuetze wrote:
>
>
> On 06.08.2017 21:27, Rainer Schuetze wrote:
>> 
>>> ------------------------------Part 2----------------------------
>>> I just tried downloading the source code and get >
>>> "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\\Common7\Tools\vsvars32.bat"
>>>
>>> when building build.
>> 
>> I suspect Visual D doesn't build out of the box with VS2017. I still use VS2013 to avoid dependencies in the resulting binaries on newer versions.
>> 
>
> I have updated the build scripts and the C to D conversion to also work with VS2017 and Windows SDK 10.0.15063.0.

Awesome! I think I got it to build!

I had a few issues. I still had to fix a few hard-coded paths I listed above (I think it was mainly tools/build.bat):

c:\d\dmd2\windows\bin\dmd.exe -g tlb2idl.d oleaut32.lib uuid.lib snn.lib kernel32.lib
if errorlevel 1 goto xit
"C:\Program Files (x86)\VisualD\cv2pdb\cv2pdb.exe" -D1 tlb2idl.exe tlb2idl_pdb.exe
:xit

which had hard-coded paths for dmd2 and cv2pdb.

A new issue was that I had to fix up vsct, or add the proper VSSDK path, in visuald.visualdproj:

C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VSSDK\VisualStudioIntegration\Tools\Bin\vsct.exe


here: (my modified version)

<File path="Resources\pkgcmd.vsct" customcmd="set VSSDKInstall=%VSSDK140Install%
if &quot;%VSSDKInstall%&quot; == &quot;&quot; set VSSDKInstall=%VSSDK120Install%
if &quot;%VSSDKInstall%&quot; == &quot;&quot; set VSSDKInstall=%VSSDK110Install%
if &quot;%VSSDKInstall%&quot; == &quot;&quot; set VSSDKInstall=%VSSDK100Install%
if &quot;%VSSDKInstall%&quot; == &quot;&quot; set VSSDKInstall=%VSSDK90Install%
if &quot;%VSSDKInstall%&quot; == &quot;&quot; set VSSDKInstall=C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VSSDK\ <- this line
set VSCT=%VSSDKInstall%\VisualStudioIntegration\Tools\Bin\VSCT.exe
set VSCT=C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VSSDK\VisualStudioIntegration\Tools\Bin\vsct.exe
if not exist &quot;%VSCT%&quot; goto no_VSCT
&quot;%VSCT%&quot; $(InputPath) $(InputDir)\$(InputName).cto -I&quot;%VSSDKInstall%\VisualStudioIntegration\Common\Inc&quot;
goto reportError


which was generating pkgcmd.cto.build like this (pre-modification):

set PATH=C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VC\Tools\MSVC\14.10.25017\bin\HostX86\x86;C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE;C:\Program Files (x86)\Windows Kits\8.1\bin\x86;C:\D\dmd2\windows\bin;%PATH%
set VSSDKInstall=%VSSDK140Install%
if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK120Install%
if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK110Install%
if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK100Install%
if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK90Install%
if "%VSSDKInstall%" == "" set VSSDKInstall=C:\l\vssdk
set VSCT=%VSSDKInstall%\VisualStudioIntegration\Tools\Bin\VSCT.exe

So I guess VSSDKx0Install isn't set on my system, because it was always looking for vsct in C:\l\vssdk.

I fixed that and everything built!

Now the question is, what do I do with this? Is there any automated way to update Visual D after a build, or some way to simultaneously debug Visual D when something happens in another project?

Basically, how do I make the debug-app/fix-Visual-D cycle seamless? Or do you just do all the work manually? I figure copying Visual D over to the install dir requires stopping VS, since visuald.dll and others will probably be in use. It would be nice, say, to be able to break into Visual D's source when debugging to get at specific parts of Visual D, like how it displays watches, so I can more quickly fix things rather than having to learn how Visual D is put together before I actually start investigating issues or enhancements.

Thanks again! Maybe I can contribute a little to Visual D in the future ;)

I saw all your other replies, thanks! I got them. I might reply to a few later if I get some time.


I'll spend a little time trying to figure out what's going on, how Visual D is set up, and where the main bulk of the code for the Visual Studio integration lives, and hopefully it will make enough sense to get somewhere.

Thanks!
August 11, 2017

On 10.08.2017 15:47, Johnson wrote:
> On Thursday, 10 August 2017 at 07:05:26 UTC, Rainer Schuetze wrote:
>>
>>
>> On 06.08.2017 21:27, Rainer Schuetze wrote:
>>>
>>>> ------------------------------Part 2----------------------------
>>>> I just tried downloading the source code and get >
>>>> "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\\Common7\Tools\vsvars32.bat"
>>>>
>>>> when building build.
>>>
>>> I suspect Visual D doesn't build out of the box with VS2017. I still use VS2013 to avoid dependencies in the resulting binaries on newer versions.
>>>
>>
>> I have updated the build scripts and the C to D conversion to also work with VS2017 and Windows SDK 10.0.15063.0.
> 
> Awesome! I think I got it to build!
> 
> I had a few issues. I still had to fix a few hard coded paths I listed above(I think it was mainly the tools/build.bat:
> 
> c:\d\dmd2\windows\bin\dmd.exe -g tlb2idl.d oleaut32.lib uuid.lib snn.lib kernel32.lib
> if errorlevel 1 goto xit
> "C:\Program Files (x86)\VisualD\cv2pdb\cv2pdb.exe" -D1 tlb2idl.exe tlb2idl_pdb.exe
> :xit
> 
> which had hard coded paths for dmd2 and cv2pdb.

Sorry, this is an outdated build batch file. tlb2idl is also built by the build project in the solution. I'd better delete this batch file.

> 
> A new issue was with
> 
> Must fix up vsct or add proper vssdk path to visuald.visualdproj:
> 
> C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VSSDK\VisualStudioIntegration\Tools\Bin\vsct.exe
[...]
> set VSSDKInstall=%VSSDK140Install%
> if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK120Install%
> if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK110Install%
> if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK100Install%
> if "%VSSDKInstall%" == "" set VSSDKInstall=%VSSDK90Install%
> if "%VSSDKInstall%" == "" set VSSDKInstall=C:\l\vssdk
> set VSCT=%VSSDKInstall%\VisualStudioIntegration\Tools\Bin\VSCT.exe
>
> So, I guess VSSDKx0Install isn't set on my system because it was always looking for vsct in C:\l\vssdk.
> 
> Fix that and everything built!

This is missing the check for VSSDK150Install which is set by the vcvars32.bat of VS2017. I've added that.

> 
> Now the question is, what do I do with this? Is there any automated way to update VisualD after a build or someway to simultaneously debug visualD when something happens in another project?

I usually debug Visual D as described at http://rainers.github.io/visuald/visuald/BuildFromSource.html under "Deployment", but that doesn't work for VS2017 anymore due to the private registry of VS. AFAICT the "official" way is to have a package definition in a user extension folder, but that never worked for me (or I haven't tried enough because the other way worked, too).

If you replace the standard installed DLL with a debug version, VS can get quite slow. This is mainly because the debug version writes a log file, visuald.log, into the folder where you started devenv from.

You can also attach a debugger to a running devenv process. If you are using a release build of visuald with optimizations, debug information can be a lot more confusing than without, though.

> Basically, how to make it all seamless debug app/fix visualD cycle? Or do you just manually do all the work? I figure copying visualD over to the install dir requires stopping VS since visualD.dll and others will probably be in use. It would be nice, say, to be able to break in to visualD's source when debugging to to get at specific parts of visual D like how it displays watches so I can more quickly fix things rather than having to learn how visual D is put together before I actually start investigating issues or enhancements.

I guess you might have noticed, but to make sure you don't get lost in the dependencies:

- the semantic engine is a local COM server written in C#: https://github.com/rainers/D_Parser/tree/visuald

I usually debug this by starting DParserCOMServer.exe from within the solution, then killing all other running instances. When you edit code in Visual D, it will reconnect to the debug instance.

- the debug engine is written in C++: https://github.com/rainers/mago

Debugging the Concord extension (MagoNatCC built in configuration "Debug StaticDE") is easiest: just copy the DLL bin\Win32\Debug\MagoNatCC.dll to <VSInstall>\Common7\Packages\Debugger, start debugging in another devenv process and attach the debugger to this other process.

- CodeView to PDB conversion for builds with -m32 is done by cv2pdb: https://github.com/rainers/cv2pdb

> 
> Thanks again! Maybe I can contribute a little to Visual D in the future ;)

Looking forward to your pull requests!


> 
> I saw all your other replies, thanks! I got them. I might reply to a few later if I get some time.
> 
> 
> I'll spend a little time trying to figure out what's going on and how visual D is setup and where the main bulk of the code for visual studio exists and hopefully it will make enough sense to get somewhere.
> 
> Thanks!