June 26, 2013 Re: Having a bit if fun on stackoverflow
Posted in reply to H. S. Teoh

On Wednesday, 26 June 2013 at 21:28:08 UTC, H. S. Teoh wrote:

> On Wed, Jun 26, 2013 at 10:23:21PM +0200, Idan Arye wrote:
>> I guess that depends whether or not F5 is your build process (http://www.codinghorror.com/blog/2007/10/the-f5-key-is-not-a-build-process.html).
>
> What's F5?

In Eclipse, F5 is the key to compile your project and run it in debug mode. The link I gave is to a blog entry by Jeff Atwood, where he explains why it's so bad to use F5 as your "build process" - that is, to rely on the IDE to build your project. Of course, nothing is wrong with the F5 key itself. In my Vim settings I've mapped F5 to launch a proper build system.

> So we use makefiles... which are a royal PITA, but at least they give you a semblance of reproducibility (fresh version control checkout, run ./configure && make, and it produces a usable product at the end). I have a lot of gripes about makefiles and would never use such broken decrepit technology in my own projects, but they are nevertheless better on the reproducibility front than some IDE "build process" that nobody knows how to replicate after the key developer leaves the company.

Add that to the long, long list of crappy tools that became the standard...

> That's why I said, auto-generated files should NOT be included in version control. Unfortunately it's still being done here at my work, and every now and then we have to deal with silly spurious merge conflicts in addition to subtle ABI inconsistency bugs like I described above.

My point is that bad practices lead to more bad practices: letting your IDE automatically handle the details of the build process is bad. As your project becomes more complicated, and you need to use third-party libraries and auto-generated files, the bad practice of using F5 as your build process forces you into other bad practices - downloading those third-party libraries and running those generation tools manually.
Even if you document what you did - which is far better than *not* documenting it - it's still a bad practice, since configuring a build system to do those things is a bit easier than explaining in English what needs to be done, and invoking the build system is much easier than following the instructions manually.

If you need to use SCM, things get even worse. Other people will need to build the project, so they will need those libraries and auto-generated files. If you had used a build system and a dependency manager, that would be easy - but you didn't, so now the other guys need to follow your documented instructions manually (assuming there are documented instructions) - and it becomes very cumbersome. And it gets worse - if someone changes something in that textual "build system" - that is, does something and writes it in the documentation - everyone else needs to do it too. But people don't reread the how-to-configure document every time they do a checkout - so now you need to email everybody about the change.

Now, what if someone called in sick for a couple of weeks? Now he has to scan through the mailing list to collect all the mails about changes to the configuration process. Alternatively, he could scan the configuration instruction document and compare it to what he has already done - assuming he remembers that he ran tool A but not tool B, and that he ran tool C but with different flags than what's specified in the up-to-date instructions. Another option is to use the SCM's diff - but that's still a pain, and frankly, I don't think people who can't use a build system are smart enough to use diff...

So, the best thing to do is to check in those auto-generated files and those external libraries, and let the SCM keep everyone synced. That's a crappy solution - but if you don't use a build system, it's your best solution. And when bad practice becomes your best solution - you know you have a problem.
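The point above - that a build script encodes the instructions better than an English document can - can be illustrated with a minimal sketch. This is a hypothetical single-entry script (all file names invented), not any real project's build:

```shell
#!/bin/sh
# build.sh - hypothetical single-entry build script: every step a fresh
# checkout needs lives here instead of in a how-to document.
set -e                              # abort on the first failing step

# Step 1: regenerate derived files (a stand-in for real generator tools).
printf 'generated table\n' > tables.gen

# Step 2: "compile" - here just assembling the pieces, so the sketch
# runs without a real toolchain installed.
cat tables.gen > app.out

echo "build ok"
```

A new teammate then runs `sh build.sh` instead of replaying a document, and a change to the process is just a commit to this file - no "email everybody" step.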
June 26, 2013 Re: Having a bit if fun on stackoverflow
Posted in reply to Idan Arye

On Thu, Jun 27, 2013 at 01:07:19AM +0200, Idan Arye wrote:

> On Wednesday, 26 June 2013 at 21:28:08 UTC, H. S. Teoh wrote:
[...]
> > So we use makefiles... which are a royal PITA, but at least they give you a semblance of reproducibility (fresh version control checkout, run ./configure && make, and it produces a usable product at the end). I have a lot of gripes about makefiles and would never use such broken decrepit technology in my own projects, but they are nevertheless better on the reproducibility front than some IDE "build process" that nobody knows how to replicate after the key developer leaves the company.
>
> Add that to the long, long list of crappy tools that became the standard...

Tell me about it. Makefiles have so many dark nasty corners that any non-trivial application will have an unreadable, unmaintainable makefile. Not to mention the reliance on timestamps (very unreliable), the inability to parallel-build without specifically crafting the makefile to support that, etc.

[...]

> My point is that bad practices lead to more bad practices:
>
> Letting your IDE automatically handle the details of the building process is bad.

+1.

> As your project becomes more complicated, and you need to use third-party libraries and auto-generated files, the bad practice of using F5 as your build process forces you into other bad practices - downloading those third-party libraries and using those generation tools manually.

Stop right there. As soon as "manual" enters the picture, you no longer have a build process. You may have a *caricature* of a build process, but it's no build process at all. I don't care if it's hitting F5 or running make: if I cannot (check out the code from version control / download and unpack the source tarball) and *automatically* recreate the entire distribution binary by running a (script / makefile / whatever), then it's not a build process.
To me, a build process means it's possible to write a script that, given just the pure source tree, can recreate, without any human intervention, the entire binary blob that you give your customers. Anything short of that does not qualify as a build process. A hand-written document that explains the 50+1 gcc/dmd/whatever commands you must type at the command prompt to build the software does not qualify as a build system.

> Now, what if someone called in sick for a couple of weeks? Now he has to scan through the mailing list to collect all the mails about changes to the configuration process. Alternatively, he could scan the configuration instruction document and compare it to what he has already done - assuming he remembers that he ran tool A but not tool B, and that he ran tool C but with different flags than what's specified in the up-to-date instructions. Another option is to use the SCM's diff - but that's still a pain, and frankly, I don't think people who can't use a build system are smart enough to use diff...

If you have to manually type anything more than "gcc -o prog prog.c" to build a project (and that includes adding compile flags), that project has already failed.

> So, the best thing to do is to check in those auto-generated files and those external libraries, and let the SCM keep everyone synced. That's a crappy solution - but if you don't use a build system, it's your best solution. And when bad practice becomes your best solution - you know you have a problem.

If you don't have a build system, your project is already doomed. Never mind auto-generated files, external libraries, or SCMs - those are just nails in the coffin. Any project that spans more than a single source file (and I don't mean just code - that includes data, auto-generated files, whatever inputs are required to create the final product) *needs* a build system.

T

-- 
The day Microsoft makes something that doesn't suck is probably the day they start making vacuum cleaners... -- Slashdotter
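The "reliance on timestamps" gripe about make in the post above is easy to demonstrate in isolation: make's freshness check is essentially the shell's `-nt` (newer-than) test. A small sketch with invented file names:

```shell
# make's core decision rule, reimplemented in shell: rebuild the target
# if any prerequisite has a newer modification time than it does.
touch lexer.l                  # pretend this is the source
sleep 1
touch lexer.yy.c               # pretend the generator already ran

if [ lexer.l -nt lexer.yy.c ]; then
  echo "stale: regenerate lexer.yy.c"
else
  echo "up to date"
fi
# The weakness: restore files with timestamps out of order, or let a
# clock skew by a second, and this test silently gives the wrong answer.
```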
June 27, 2013 Re: Having a bit if fun on stackoverflow
Posted in reply to H. S. Teoh

On Wednesday, 26 June 2013 at 23:40:52 UTC, H. S. Teoh wrote:

> Stop right there. As soon as "manual" enters the picture, you no longer have a build process. You may have a *caricature* of a build process, but it's no build process at all. I don't care if it's hitting F5 or running make, if I cannot (check out the code from version control / download and unpack the source tarball) and *automatically* recreate the entire distribution binary by running a (script / makefile / whatever), then it's not a build process.

Whether it needs to be automated to be called a build process is a matter of definitions. The important thing is to agree that it's bad.

> A hand-written document that explains the 50+1 gcc/dmd/whatever commands you must type at the command prompt to build the software does not qualify as a build system.
>
> ...
>
> If you have to manually type anything more than "gcc -o prog prog.c" to build a project (and that includes adding compile flags), that project has already failed.

I don't consider having to type 50 commands each time you want to build the software that harmful - not because it's good, but because it's so bad that no developer will agree to live with it. And luckily for most developers, that's a problem they don't have to live with, because IDEs can handle it pretty well.

The real problem is with commands that you only have to type now and then. For example, let's assume you have a .lex file somewhere in your project. Visual Studio does not know how to handle it (I think - it has been years since I last touched VS, and I didn't do any advanced stuff with it). But VS knows pretty well how to handle everything else, and you don't want to start learning a build system just for that single .lex file - after all, it's just one command, and you don't really need to run it every time - you rarely touch the file, and the auto-generated .yy.c file stays in the file system for the next build.
So, you use the shell to call `flex`, and then compile your project with VS, and continue coding happily without thinking about that .lex file.

A few weeks pass, and you have to change something in the .lex file. So you change it, and compile the code, and run the project - and nothing changes, because you forgot to call `flex`. So you check your code, but everything seems OK. So you do a rebuild - because that usually helps in such situations - and VS deletes the .exe and all the .obj files, but it doesn't delete the .yy.c file, because it's a C source file, so VS assumes it's part of the source code. Then VS compiles everything from scratch - and again nothing changes!

So, you do what any sane programmer would do - you throw the computer out of the window. When your new computer arrives, you check out the code from the repository and try to compile, and this time you get a compile error - because you don't have the .yy.c file. Now you finally understand that you forgot to call `flex`! Well, you learned from your mistake so you won't repeat it, so you say to yourself that there is still no point in introducing a build system just to handle a single .lex file...

That's why I'm not worried about problems that you can't live with. If people can't live with a problem, they will find and implement a solution. It's the problems you *can* live with that make me worry - because there will always be people who prefer to live with the problem than to be bothered with the solution...

> If you don't have a build system, your project is already doomed. Nevermind auto-generated files, external libraries, or SCMs, those are just nails in the coffin. Any project that spans more than a single source file (and I don't mean just code -- that includes data, autogenerated files, whatever inputs are required to create the final product) *needs* a build system.
With that I don't agree - simple projects that only have source files can get away with IDE building, even if they have multiple source files. I'm talking about zero-configuration projects - no auto-generated files, no third-party libraries; all you have to do is create a default project in the IDE, import all the source files, and hit F5 (or the equivalent shortcut). The moment you have to change a single compiler switch, you need a build system.

I myself always use a build system, because I use Vim so I don't have IDE building. The only exception is single source files of interpreted languages, where I can use shebangs.
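The forgotten-`flex` trap in the story above can be reproduced without flex installed at all. In this sketch `gen` stands in for flex and `build` for the compiler (all file and function names are made up):

```shell
# gen() plays the role of flex, build() the role of the compiler.
gen()   { cp spec.lex spec.yy.c; }
build() { cp spec.yy.c app.out; }

printf 'v1\n' > spec.lex
gen; build                  # first build: output matches the source

printf 'v2\n' > spec.lex    # edit the .lex file ...
build                       # ... but forget to re-run gen

cat app.out                 # still "v1" - the stale .yy.c won
```

A build system that records the spec.lex -> spec.yy.c dependency would notice the newer spec.lex and re-run `gen` before the second `build`, which is exactly the step the human forgets.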
June 27, 2013 Re: Having a bit if fun on stackoverflow
Posted in reply to Idan Arye

On Thu, Jun 27, 2013 at 03:00:05AM +0200, Idan Arye wrote:

> On Wednesday, 26 June 2013 at 23:40:52 UTC, H. S. Teoh wrote:
>
> > Stop right there. As soon as "manual" enters the picture, you no longer have a build process. You may have a *caricature* of a build process, but it's no build process at all. I don't care if it's hitting F5 or running make, if I cannot (check out the code from version control / download and unpack the source tarball) and *automatically* recreate the entire distribution binary by running a (script / makefile / whatever), then it's not a build process.
>
> Whether it needs to be automated to be called a build process is a matter of definitions. The important thing is to agree that it's bad.

No, what I meant was that going from clean source (i.e., only fresh source files - no auto-generated files, no intermediate files, no cached object files; clean, pristine source) to the fully-built binary should be possible *without* manually typing any commands other than invoking the build script / IDE build function / whatever. IOW, builds must be reproducible. They should not rely on arbitrary undocumented commands that the original author typed at arbitrary points in time, which produced intermediate files that are required later. Every single command necessary to go from pristine, unprocessed source code to a fully-functional binary must be encapsulated in the build script / build command / whatever you call it, so that, in principle, pushing a single button will produce the final, releasable binary. You should be able to ship the pristine, unprocessed source code to somebody and they should be able to get a binary out of it by "pushing the same button", so to speak.

[...]

> I don't consider having to write 50 commands each time you want to build the software that harmful - not because it's good, but because it's so bad that no developer will agree to live with it.
> And luckily for most developers - that's a problem they don't have to live with, because IDEs can handle it pretty well.
>
> The real problem is with commands that you only have to type now and then.

That's what I mean. When you have commands that you "only have to type now and then", they MUST be part of the automated build process, be it your build script, IDE project file, or whatever it is you use to build your program. Otherwise, it's not possible for you to just ship the pristine source code to somebody else and have them able to build it just by hitting the "build" button.

> For example, let's assume you have a .lex file somewhere in your project. Visual Studio does not know how to handle it (I think - it has been years since I last touched VS, and I didn't do any advanced stuff with it). But VS knows pretty well how to handle everything else, and you don't want to start learning a build system just for that single .lex file - after all, it's just one command, and you don't really need to do it every time - after all, you rarely touch it, and the auto-generated .yy.c file stays in the file system for the next build.

That's the formula for disaster. Consider:

1) You write some code;

2) You decide you need flex, but VS doesn't support calling flex, so you run it by hand;

3) Programmer B wants to try out your code, so you ship him the source files. It fails miserably 'cos he doesn't have flex installed.

4) Solution? Just include the .yy.c the next time you send him the code. Now it compiles. Everything's OK now, right? Wrong.

5) You make some changes to the code, but forget to rerun flex. Now the .yy.c is out of sync with the .lex, but it just happens to still compile, so you ship the new code to programmer B.

6) Programmer B compiles everything and ships the product to the customer.

7) In the meantime, you suddenly remember you didn't re-run flex, so you do that and recompile everything.

8) The customer comes back and complains there are bugs in the code.
You can't reproduce them, 'cos your .yy.c is up-to-date now.

9) Another customer complains that the previous release of the code has a critical bug. You check out the old code from version control, but .yy.c wasn't in version control, so the old code doesn't even compile.

10) After hours of hair-pulling, the old code finally compiles. Of course, you've done all sorts of things to try to make it compile, but the dynamic libraries are not the same, the new version of VS has a different default setting, etc., so of course you can't reproduce the customer's problem.

11) You give up, and check out the new code to continue working on something else. But the .yy.c is again out of sync with the .lex, 'cos you touched it while trying to make the old version compile. The code compiles, but has subtle bugs caused by the out-of-sync file.

12) After you finally remember to run flex again, programmer B checks out the code, and now his build fails, 'cos the .yy.c is out of sync and causes a compile error.

13) You decide that since the .yy.c keeps causing problems, you should check it into the VCS. Now everything works fine. Or does it?

14) Programmer B checks out the code, and modifies the .lex, but doesn't re-run flex. He checks in the changes. You check out the changes, and now your code doesn't work anymore, 'cos the .yy.c is out of date.

See how this is a vicious cycle of endless frustration and wasted time? The correct way of doing things is to include EVERYTHING you need to go from raw source files to final binary in a single build script / project file / whatever. You have to guarantee that, given the pristine source code (i.e. without any externally-generated products), a single button (or script, or makefile, etc.) will be able to regenerate the binaries you shipped. This has to work for EVERY RELEASED VERSION of your program.
You should be able to check out any prior version of your code, and be assured that after you hit the "compile" button, the executable you get at the end is IDENTICAL to the executable you shipped to the customer 12 months ago. Anything else is just the formula for endless frustration, untraceable bugs, and project failure. If your IDE's build function doesn't support full end-to-end reproducible builds, it's worthless and should be thrown out.

> So, you use the shell to call `flex`, and then compile your project with VS, and continue coding happily without thinking about that .lex file.
>
> A few weeks pass, and you have to change something in the .lex file. So you change it, and compile the code, and run the project, and nothing changes - because you forgot to call `flex`. So you check your code - but everything seems OK. So you do a rebuild - because that usually helps in such situations - and VS deletes the .exe and all the .obj files, but it doesn't delete the .yy.c file - because it's a C source file, so VS assumes it's part of the source code - and then VS compiles everything from scratch - and again nothing changes!
>
> So, you do what any sane programmer would do - you throw the computer out of the window.

Or rather, you throw the IDE out the window, 'cos its build function is defective. :-P

> When your new computer arrives, you check out the code from the repository and try to compile, and this time you get a compile error - because you don't have the .yy.c file. Now you finally understand that you forgot to call `flex`!

This is a sign of a defective IDE build function.

> Well, you learned from your mistake so you won't repeat it again, so you say to yourself that there is still no point in introducing a build system just to handle a single .lex file...
>
> That's why I'm not worried about problems that you can't live with. If people can't live with a problem - they will find and implement a solution.
> It's the problems you *can* live with that make me worry - because there will always be people who prefer to live with the problem than to be bothered with the solution...

Then they only have themselves to blame when they face an endless stream of build problems, heisenbugs that appear/disappear depending on what extra commands you type at the command prompt, inability to track down customer-reported bugs in old versions, and all sorts of neat and handy things like that.

> > If you don't have a build system, your project is already doomed. Nevermind auto-generated files, external libraries, or SCMs, those are just nails in the coffin. Any project that spans more than a single source file (and I don't mean just code -- that includes data, autogenerated files, whatever inputs are required to create the final product) *needs* a build system.
>
> With that I don't agree - simple projects that only have source files can get away with IDE building, even if they have multiple source files. I'm talking about zero-configuration projects - no auto-generated files, no third-party libraries - all you have to do is create a default project in the IDE, import all the source files, and hit F5 (or the equivalent shortcut). The moment you have to change a single compiler switch - you need a build system.

I'd argue that you need a build system from the get-go. Ideally, the IDE's project file SHOULD support such things as building external products. If it doesn't, it's essentially worthless and you should use a real build system instead. But even if this is supported, there's still the problem of compile switches inserted by the IDE that you may not know about. Consider: the IDE has a configuration window where you can select compile switches. You twiddle with some of those settings and later forget about them completely. Then you ship your files to developer B, and he hits the build button and gets a different executable, 'cos his IDE settings don't match yours.
This is just the same sad story rehashed. For any serious software project, reproducible builds are a must. There's simply no way around it. Shipping executables that depend on arbitrary IDE settings that vary depending on which developer did it is a very bad business model. Shipping executables that you cannot reproduce by checking out a previous version of the code from the VCS is a very bad business model. Even *developing* software for which you can't make reproducible executables is a bad business model - it hurts programmer productivity. Countless hours are wasted trying to track down bugs and other strange problems that ultimately come from non-reproducible builds.

It also hurts morale: nobody dares check out the latest code from the VCS 'cos it has a reputation of introducing random build failures, which wastes time (you have to make clean; make, every single time, and if you're dealing with C/C++ where the build times are measured in hours, that just kills productivity instantly). As a result, you get endless merge conflicts when everybody tries to check in code that has been out of sync for weeks, and everybody blames each other for the conflicts ("argh, why did you touch this file in *my* subdirectory?!"). Not having 100% reproducible builds is simply not workable.

> I myself always use a build system, because I use Vim so I don't have IDE building. The only exception is single-source files of interpreted languages, where I can use shebangs.

Single source files are OK without a build system, though sometimes I still use one, just so I get the compile flags right. For shebangs it's a different story, 'cos you can just put the compile flags into the shebang line. But anything beyond that requires a *reproducible* build system (even if it's the IDE's build command). Otherwise you're just setting yourself up for needless frustration and failure.

T

-- 
Designer clothes: how to cover less by paying more.
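The "compile flags in the shebang line" trick mentioned above looks like this for a plain shell script (file name invented). Note that most kernels pass everything after the interpreter path as a single argument, so the flags must fit in one token:

```shell
# Create a single-file script whose "build configuration" (the -eu
# flags) travels in its own shebang line, then run it.
cat > hello.sh <<'EOF'
#!/bin/sh -eu
echo "hello from a self-configured script"
EOF
chmod +x hello.sh
./hello.sh
```

The same idea is what makes single-file scripts exempt from needing a build system: the one piece of configuration they have is carried inside the file itself.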
June 27, 2013 Re: Having a bit if fun on stackoverflow
Posted in reply to H. S. Teoh

On Thursday, 27 June 2013 at 04:15:27 UTC, H. S. Teoh wrote:

> Anything else is just the formula for endless frustration, untraceable bugs, and project failure. If your IDE's build function doesn't support full end-to-end reproducible builds, it's worthless and should be thrown out.

The IDE's build function is not defective - it's just incomplete. It does what it does well; the problem is that what it does is not enough. Most IDEs I know rely on plugins to do advanced stuff. So if you insist on using the IDE's build function, you'll want to get a flex plugin for your IDE (hopefully there is one...), and that plugin will enhance the IDE's build function to auto-generate the .yy.c file. If it's a good plugin, it will also enhance the IDE's clean function to delete that file, and/or its SCM interface to ignore that file.

It's a shame, really - IDEs could do so much more. A few years ago I programmed in C#, and I was using Vim. I used MSBuild as my build system - it's the build system Visual Studio runs behind the scenes, and its buildfiles are the .csproj files. Those .csproj files are basically XML. VS makes them very messy, but after you clean them up and understand the format, they look pretty much like Ant's build.xml files, and you can use them like a proper build system. I used those .csproj files to automate the build process, the testing, and the deployment. But I could only do it because I broke away from Visual Studio! I doubt it would accept those .csproj files after I cleaned away all the metadata it put there...

So people who use VS's build function are actually using a decent build system - but they can't utilize it to its fullest! VS has menus that allow you to change some paths and switches, but you can't do things like define one target that performs multiple tasks sequentially. So, Visual Studio uses a build system that could automate our .lex file - it just forgets to give you access to that functionality...
> Or rather, you throw the IDE out the window, 'cos its build function is defective. :-P

The IDE is software - you can't *physically* throw it out the window.

> Then they only have themselves to blame when they face an endless stream of build problems, heisenbugs that appear/disappear depending on what extra commands you type at the command prompt, inability to track down customer reported bugs in old versions, and all sorts of neat and handy things like that.

If they work alone on the project, it's their problem. If you need to join that project - now it's your problem as well. Good luck with introducing a build system to an existing project and making everyone use it...
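For readers who have never opened one, a hand-cleaned .csproj of the kind described above is plain MSBuild XML. This fragment is a from-memory sketch (the target name, commands, and paths are invented, and it is not a verified Visual Studio-compatible file); it shows the kind of sequential multi-task target the VS property pages expose no UI for:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- One target that runs several tasks in sequence: build, then
       test, then deploy - the thing the VS menus cannot express. -->
  <Target Name="Ship" DependsOnTargets="Build">
    <Exec Command="nunit-console Tests.dll" />
    <Copy SourceFiles="bin\App.exe" DestinationFolder="\\server\deploy" />
  </Target>
</Project>
```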
June 27, 2013 Re: Having a bit if fun on stackoverflow
Posted in reply to Idan Arye

On Thu, Jun 27, 2013 at 10:20:59PM +0200, Idan Arye wrote:

> On Thursday, 27 June 2013 at 04:15:27 UTC, H. S. Teoh wrote:
>
> > Anything else is just the formula for endless frustration, untraceable bugs, and project failure. If your IDE's build function doesn't support full end-to-end reproducible builds, it's worthless and should be thrown out.
>
> The IDE's build function is not defective - it's just incomplete.

Incomplete == defective. :)

> It does what it does well - the problem is that what it does is not enough. Most IDEs I know rely on plugins to do advanced stuff. So if you insist on using the IDE's build function, you'll want to get a flex plugin for your IDE (hopefully there is one...) and that plugin will enhance the IDE's build function to auto-generate the .yy.c file, and if it's a good plugin it'll also enhance the IDE's clean function to delete that file and/or its SCM interface to ignore that file.

That's something I never really understood about the Windows / GUI world. The backend functionality is already all there, yet for some strange reason the application refuses to provide the means to access that functionality, requiring instead that you install "plugins". To me, a "plugin" should *enhance* functionality by adding what wasn't there before, but in this case, it seems to be more about removing artificial barriers to reveal what has been there all along. Same thing goes for the iPhone emoji apps, and many other such examples. As a CLI-only person, I find this really hard to grok.

> It's a shame, really - IDEs could do so much more. A few years ago I programmed in C#, and I was using Vim. I used MSBuild as my build system - it's the build system Visual Studio runs behind the scenes, and its buildfiles are the .csproj files. Those .csproj files are basically XML.
> VS makes them very messy, but after you clean them up and understand the format they look pretty much like Ant's build.xml files, and you can use them like a proper build system.
>
> I used those .csproj files to automate the build process, the testing, and the deployment. But I could only do it because I broke away from Visual Studio! I doubt it would accept those .csproj files after I cleaned away all the metadata it put there...
>
> So people who use VS's build function are actually using a decent build system - but they can't utilize it to its fullest! VS has menus that allow you to change some paths and switches, but you can't do things like one target that does multiple tasks sequentially. So, Visual Studio uses a build system that could automate our .lex file - it just forgets to give you access to that functionality...

Yeah, this is something I just don't understand about GUI-centric apps. It annoys me a lot, actually, that the necessary functionality is already there, yet there's no way for you to access it without opening the hood. And too often, the hood is welded shut, esp. when you're talking about the Windows world. It reinforces my opinion that GUIs are crippled point-n-grunt caricatures of a *real* UI, which is to use *language* that can convey what you want in much more expressive ways.

> > Or rather, you throw the IDE out the window, 'cos its build function is defective. :-P
>
> The IDE is a software - you can't *physically* throw it out the window.

It was a proverbial window, not a physical one. :) Well, either that, or kick it off the GUI window... :-P

> > Then they only have themselves to blame when they face an endless stream of build problems, heisenbugs that appear/disappear depending on what extra commands you type at the command prompt, inability to track down customer reported bugs in old versions, and all sorts of neat and handy things like that.
>
> If they work alone on the project, it's their problem.
> If you need to join that project - now it's your problem as well.

Which is why I would not touch such projects with a 10-foot pole. The world is big enough to have more pleasant projects to work with.

> Good luck with introducing a build system to an existing project and making everyone use it...

Heh, yeah. I've been complaining about the nasty mess that is the Makefile-based build system at my work for a long time, and so far nobody has listened except for my ex-supervisor (who has since left the company, sigh). In fact, it's been going downhill. We *used* to support parallel building - or at least, some semblance of parallel building, as long as you made sure certain components were built separately in single-threaded mode. It saved many, many hours of idle waiting.

Ever since the PTBs decided to merge in another major project, though (which involved recursively copying all files from said other project on top of the existing source tree, and then cleaning up the resulting mess), parallel building has been completely out of the question. Worse yet, that other project's makefiles were (and still are) so nasty that you can't simply re-run make after making some changes; you have to make clean, and then wait 2.5 hours for the miserable thing to build from scratch. If you don't make clean, the build will die halfway with obscure linker errors or errors about missing files, etc. Gah.

What I would give to convince people to move to a saner build system... But old habits die hard, and people dislike change. What can you do. *shrug*

T

-- 
Let's call it an accidental feature. -- Larry Wall
June 27, 2013 Re: Having a bit if fun on stackoverflow | ||||
Posted in reply to H. S. Teoh

On Thursday, 27 June 2013 at 20:43:47 UTC, H. S. Teoh wrote:
clip
>
> That's something I never really understood about the Windows / GUI
> world. The backend functionality is already all there, yet for some
> strange reason the application refuses to have the means to access that
> functionality, requiring instead for you to install "plugins". To me, a
> "plugin" should *enhance* functionality by adding what wasn't there
> before, but in this case, it seems to be more about removing artificial
> barriers to reveal what has already been there all along. Same thing
> goes with the iPhone emoji apps, and many other such examples.
>
> As a CLI-only person, I find this really hard to grok.
>
>
This isn't directly related to CLI, but one thing I really like about text-file-based build/configuration systems, and dislike about IDEs, is that you can easily add comments to your build scripts/config files to explain why you did something a certain way. This is helpful to you and to anyone else who might have to tweak it later. I also like the sort of inline help that some configuration files provide through the use of comments.

I generally find a good, text-based system easier to understand and work with than a GUI-based system. For example, with Visual Studio I remember writing down, in a separate document, the list of steps to perform in order to link to particular libraries (installed in non-standard locations) on my system; it involved lots of clicking.

By comparison, using QMake (Qt projects) I just find a .pro file that links to the correct libraries and copy over the relevant lines. Qt has the Qt Creator tool that edits .pro files for you, but most of the time I just edit the .pro files by hand if I want to make changes. Qt Creator seems to be able to deal with this.

I am sure Visual Studio has a way of dealing with the problem I've described, but you can't beat copying a few lines from a text file for 'ease of use'.
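[Editorial aside: the kind of lines being copied between .pro files looks something like this - a hedged sketch with hypothetical paths and library names:]

```qmake
# Hypothetical example: link against a library installed in a
# non-standard location. Copying these two lines into another
# project's .pro file reuses the whole setup.
INCLUDEPATH += /opt/mylib/include
LIBS += -L/opt/mylib/lib -lmylib
```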
June 27, 2013 Re: Having a bit if fun on stackoverflow | ||||
Posted in reply to H. S. Teoh

On Thursday, 27 June 2013 at 20:43:47 UTC, H. S. Teoh wrote:
> That's something I never really understood about the Windows / GUI
> world. The backend functionality is already all there, yet for some
> strange reason the application refuses to have the means to access that
> functionality, requiring instead for you to install "plugins". To me, a
> "plugin" should *enhance* functionality by adding what wasn't there
> before, but in this case, it seems to be more about removing artificial
> barriers to reveal what has already been there all along. Same thing
> goes with the iPhone emoji apps, and many other such examples.
>
> As a CLI-only person, I find this really hard to grok.
With the popularity of XML build systems, it shouldn't be that hard for an IDE to provide a GUI for editing the targets and composing complex build processes. I would have expected big IDEs like Eclipse to have that feature...
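[Editorial aside: the sort of thing such a GUI would need to round-trip is not very complicated. A hedged Ant sketch - target and file names are made up - showing one target running multiple tasks in sequence, which is exactly what the VS property pages discussed earlier can't express:]

```xml
<!-- Hypothetical Ant target: several tasks run in order. -->
<target name="release" depends="compile">
    <jar destfile="dist/app.jar" basedir="build/classes"/>
    <copy file="README.txt" todir="dist"/>
    <zip destfile="dist/app-release.zip" basedir="dist"/>
</target>
```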
June 27, 2013 Re: Having a bit if fun on stackoverflow | ||||
Posted in reply to Idan Arye

On Thu, Jun 27, 2013 at 11:48:15PM +0200, Idan Arye wrote:
> On Thursday, 27 June 2013 at 20:43:47 UTC, H. S. Teoh wrote:
> > That's something I never really understood about the Windows / GUI
> > world. The backend functionality is already all there, yet for some
> > strange reason the application refuses to have the means to access
> > that functionality, requiring instead for you to install "plugins".
> > To me, a "plugin" should *enhance* functionality by adding what
> > wasn't there before, but in this case, it seems to be more about
> > removing artificial barriers to reveal what has already been there
> > all along. Same thing goes with the iPhone emoji apps, and many
> > other such examples.
> >
> > As a CLI-only person, I find this really hard to grok.
>
> With the popularity of XML build systems, that shouldn't be that hard
> for an IDE to provide you with a GUI to edit the targets and make
> complex build processes. I would have expected big IDEs like Eclipse
> to have that feature...

Yeah, what with all the fancy code-editing features, you'd think having a built-in XML editor would be easy...

XML is a pain to edit by hand, though, if your editor doesn't understand XML. It's sorta like Java and IDEs; in theory, you *can* write Java with nothing but pico, but in practice, it's so verbose that it's only tolerable if you use an IDE with autocompletion.

T

--
It said to install Windows 2000 or better, so I installed Linux instead.
June 27, 2013 Re: Having a bit if fun on stackoverflow | ||||
Posted in reply to H. S. Teoh

On Thursday, 27 June 2013 at 21:56:00 UTC, H. S. Teoh wrote:
> On Thu, Jun 27, 2013 at 11:48:15PM +0200, Idan Arye wrote:
>> On Thursday, 27 June 2013 at 20:43:47 UTC, H. S. Teoh wrote:
>> >That's something I never really understood about the Windows / GUI
>> >world. The backend functionality is already all there, yet for some
>> >strange reason the application refuses to have the means to access
>> >that functionality, requiring instead for you to install "plugins".
>> >To me, a "plugin" should *enhance* functionality by adding what
>> >wasn't there before, but in this case, it seems to be more about
>> >removing artificial barriers to reveal what has already been there
>> >all along. Same thing goes with the iPhone emoji apps, and many other
>> >such examples.
>> >
>> >As a CLI-only person, I find this really hard to grok.
>>
>> With the popularity of XML build systems, that shouldn't be that
>> hard for an IDE to provide you with a GUI to edit the targets and
>> make complex build processes. I would have expected big IDEs like
>> Eclipse to have that feature...
>
> Yeah, what with all the fancy code-editing features, you'd think having
> a built-in XML editor would be easy...
>
> XML is a pain to edit by hand, though, if your editor doesn't understand
> XML. It's sorta like Java and IDEs; in theory, you *can* write Java with
> nothing but pico, but in practice, it's so verbose that it's only
> tolerable if you use an IDE with autocompletion.
>
>
> T
I'm not talking about a GUI XML editor here - I'm talking about a buildfile editor: one that is familiar with the common tasks and lets you edit them through a GUI menu. Of course, it should also be expandable with plugins to deal with third-party tasks, and have a way to handle generic tasks, but having a GUI for the common tasks is the key.
Eclipse already has a textual XML editor, and it has autocompletion for Ant's build.xml, but it's much easier to set up an Eclipse run configuration through a configuration menu than to edit the Ant target with a text editor, so most users will prefer to use the IDE's build function.
It's also important that the IDE use this build system by default. When you open a new project, it should automatically create a buildfile, and only store in it data that needs to be shared between developers. If the IDE needs to store other data that is only relevant locally, that should go in a separate, unversioned file - otherwise this data will create redundant, hard-to-solve merge conflicts.
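[Editorial aside: with Git, for instance, that separation is just a matter of ignoring the local-only files. A hedged sketch - the file names are hypothetical, not tied to any particular IDE:]

```gitignore
# Versioned: the shared buildfile (e.g. build.xml) and sources.
# Ignored: per-developer IDE state that would only cause merge conflicts.
*.user
local.properties
.settings/
```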
The IDE needs to have both features:
If the IDE does not use the proper build system by default, most developers will use the default build function, and when they reach its limits they'll have a hard time switching to the proper build system - and many might choose not to switch, and will use bad solutions that introduce technical debt.
If the IDE does not have a GUI configuration tool, people will simply not use the IDE. IDE users don't like having to edit the build configuration with a text editor - not when most IDEs have a nice GUI for it. People who prefer the powerful edit-by-text build system over the crippled edit-by-GUI one usually prefer text editors over IDEs anyway.
Copyright © 1999-2021 by the D Language Foundation