
What do you think will be the next programming paradigm?

walfin

Democrazy
Location
/dev/null
For a long time, there was pure imperative programming. There was assembly, and then there was BASIC. Nothing existed except for the tetragrammaton of GOTO, sacred to the flying spaghetti (code) monster.

Then came functions, and with them came structured, procedural programming. C and Pascal bloomed.

Along the way a whole host of other paradigms came and went; they never caught on. Then one day we saw the light. Verbs are not all that make up the world. Nouns took preeminence and objects came to the fore.

And the explosion of C++ development led the way, blazing a trail like no OOP language before and opening the way for other OOP languages to match and even surpass it in some ways.

OOP spawned a number of other progeny, like Context Oriented Programming and Aspect Oriented Programming. Alas, they were stillborn.

And so. Here we are, 30 years after C++ was christened as such, and the OOP model still reigns supreme, regardless of your programming tongue or runtime environment (in contrast, C++ became C++ only 11 years after the invention of C). What do you think the next big paradigm will be? What ways are there of thinking about this world, except as a collection of objects? What can make our code faster, better, stronger?
 

Architect

Professional INTP
Not to sound grandiose, but I have a set of ideas I've been working on for six years that I think might be this step. Unfortunately I'm not going to publish it here. It transcends (and incorporates) all the specific techniques you mention above.
 

Cognisant

Prolific Member
It's simple enough to explain software conceptually. I've never made a rendering engine, but I enjoy reading about how they work, physics engines too. So I suppose the ideal would be a program that can understand a conceptual description of a program and turn it into working code. And while we're in fantasy world, let's say this hypothetical program can understand speech, even respond with questions and explanations, so that it can collaborate with the coder to construct a conceptual definition of the program to be built via natural conversation.

Oh wait, we have human code monkeys for that.
 

h0bby1

Active Member
Now it's all about vectorisation and parallelisation, multi-CPU, multi-threading, etc.

Physics is mostly vector math, and soon it will be the era of real-time raytracing, so lots of parallelisation and vector math =)

Maybe C++ can also learn a few things from AS3, to have more flexible typing, and to be more oriented toward asynchronous calls and methods.

But I think what could also greatly improve any software would be to simplify operating systems a lot =)
 

h0bby1

Active Member
Some system like COM could be integrated to allow for loose typing. Say you have a class Window, and you want it to call an 'onclick' member of a class of unknown type: it would be nice to be able to define class members on the fly and call members of objects of unknown type, with an object type whose members can be added at runtime. Or to have something like COM that doesn't require any class definition at all in the client application. Some kind of nice smart pointer could be useful as well, but some languages already do that, so it's not really a great paradigm shift.
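The "members on the fly" idea can be sketched in ordinary C++ with a map of named callbacks. This is only a toy illustration of dispatch by name, in the spirit of AS3 dynamic classes or COM's IDispatch; all the names in it (DynamicObject, set, call) are made up:

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// A "loose" object whose members are looked up by name at runtime.
// The caller needs no class definition, only the member's name.
struct DynamicObject {
    std::map<std::string, std::function<int(int)>> members;

    // Add (or replace) a member on the fly.
    void set(const std::string& name, std::function<int(int)> fn) {
        members[name] = std::move(fn);
    }
    // Call a member by name; unknown names return a sentinel value.
    int call(const std::string& name, int arg) {
        auto it = members.find(name);
        return it == members.end() ? -1 : it->second(arg);
    }
};
```

Real COM adds interface negotiation and reference counting on top of this; the sketch only shows the name-based dispatch part.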

Anything from the Intel Math Kernel Library integrated into the standard C math lib would be nice too: having all kinds of DCT, FFT, quantization, advanced trig, quaternions and vector types recognised as a full part of the standard C library would be great, as would having the features of OpenMP integrated into the standard language.

And a way to handle tree-shaped structures natively, some kind of tree system that could go along with XML, to easily map a class structure to an arbitrary XML file. AS3 and other languages can already do some of this, but it's the kind of thing that quickly becomes bothersome in C++. There is always a way to use templates to avoid some of these problems, but the syntax can easily become obscure.
 

Brontosaurie

Banned
lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists
 

h0bby1

Active Member
lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists of lists

a tree ? :)
 

Brontosaurie

Banned
Are you explaining the idea of a system that can react to any possibility? Hence all the lists? :confused:


no i was just being stupid and flaunting my aristocratic lack of affinity with down-and-dirty engineering chores.

walfin's OP was really very good and i have nothing to add because i don't have a clue.
 

TheScornedReflex

(Per) Version of a truth.
Oh. I knew that..!
 

walfin

Democrazy
Location
/dev/null
Unfortunately I'm not going to publish it here.

Darn. General idea? Some vague hints?

a program that can understand a conceptual description of a program and turn it into working code

Mm, I understand what you mean. That is very difficult. I thought Lego Mindstorms/Alice/Kodu Game Lab were a step in this direction but I can honestly say it's so much harder and more tiresome to use those than any of your run-of-the-mill languages.

Now it's all about vectorisation and parallelisation, multi-CPU, multi-threading, etc.

Yeah, quite right. Proactor/reactor patterns perhaps?

I had actually felt once that the next programming language might be task-based (i.e. annotate parallel parts), with the underlying multithreading being abstracted away.

http://www.drdobbs.com/parallel/when-tasks-replace-objects/232700402

C++ was originally meant to do this sort of stuff (in fact it was the subject of Stroustrup's PhD), but the current thread library is still limited, CC++ did not catch on, Java still uses the threads-in-a-library model, and Node.js doesn't seem to have quite taken off. So I don't know if this hype means anything.

Maybe C++ can also learn a few things from AS3, to have more flexible typing, and to be more oriented toward asynchronous calls and methods.
The former: void* :D
The latter: std::future
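As a concrete sketch of that asynchronous style: std::async from C++11 runs a callable as a separate task and hands back a std::future for the result. A minimal example (parallel_sum is a made-up name for illustration):

```cpp
#include <future>
#include <numeric>
#include <vector>

// Sum a large vector in two halves, the lower half on its own task.
// std::launch::async asks the runtime to run it on another thread.
long long parallel_sum(const std::vector<int>& v) {
    auto mid = v.begin() + v.size() / 2;
    std::future<long long> lower = std::async(std::launch::async,
        [&] { return std::accumulate(v.begin(), mid, 0LL); });
    long long upper = std::accumulate(mid, v.end(), 0LL);
    return lower.get() + upper;  // get() blocks until the task finishes
}
```

The caller never touches a thread object or a lock; the future is the whole synchronisation surface.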
 

h0bby1

Active Member
Yeah, with void*: actually even in COM and DirectX, at the end of the day a lot of stuff gets passed as void**, but it's pretty miserable lol, and you still have to have the .h definition of the class at both ends. COM had good ideas, but this hardcore typing is also what makes MFC so messy and bothersome. ATL is OK, though also sort of complicated; if you look into DirectShow and ATL, the whole thing becomes very messy quickly, because of the hardcore typing and the void** business to get a pointer to a class. It's not that great.

And even the void** thing still does a sort of cast to a pre-declared class; it's not like you can pull a method out of a class passed as a void pointer if you don't have its definition, much like what happens when importing a function from a DLL module, or with the COM system.

You can do parallelisation and the rest in C++ using system libraries, but it's not very abstract. There should be much more convenience for asynchronous tasks with callbacks embedded into the language, instead of having to deal with system-specific functions.

I mean, you can do pretty much anything in assembly language too; it's more a matter of having more things implemented directly as part of the language, to avoid recoding them over and over. Any kind of program nowadays will have to recode part of a vector library and deal with the system's multithreading functions, and making a good portable handling of asynchronous transfers can quickly become bothersome.

Linux mostly stayed at the C level for most things, so it doesn't bother as much with type casting and typing sometimes =) everything is done with streams, read/write/ioctl, but it's not that performant either, and it doesn't allow as much abstraction as OLE does on Windows.


As for the tree system: since I've been developing my own OS, I integrated a tree system at the very base layer, just above the memory allocation system, and I've found it incredibly useful. Most resources can be accessed through the tree; no need for enumerators, or lists passed as double pointers, or anything like that. Just a pointer to a tree structure can pass complex information. Buses, the file system, even the windowing system all use the same base tree structure, and it's useful everywhere. It's a very flexible way to represent a data structure, by adding typed, named nodes to a flexible tree, and I really wonder why this sort of thing isn't used much more in base languages.

The tree system can even be used to represent a class object, and to dynamically add or remove members and access them by name afterwards. AS3 does this sort of thing, and it can be very useful; it replaces a void** anytime, is much more flexible, and still allows some form of typing.
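A toy version of such a tree of named nodes, looked up by slash-separated path, is easy to sketch in C++. This is only an illustration of the idea (Node and its members are invented names), not h0bby1's actual OS code:

```cpp
#include <map>
#include <memory>
#include <string>

// A generic tree of named nodes carrying a string value, in the spirit
// of exposing buses, files and windows through one shared structure.
struct Node {
    std::string value;
    std::map<std::string, std::unique_ptr<Node>> children;

    // Get a child by name, creating it on demand (members "on the fly").
    Node& child(const std::string& name) {
        auto& p = children[name];
        if (!p) p = std::make_unique<Node>();
        return *p;
    }
    // Resolve an "a/b/c" style path; returns nullptr when absent.
    Node* find(const std::string& path) {
        Node* n = this;
        std::size_t start = 0;
        while (n && start <= path.size()) {
            std::size_t end = path.find('/', start);
            if (end == std::string::npos) end = path.size();
            auto it = n->children.find(path.substr(start, end - start));
            n = (it == n->children.end()) ? nullptr : it->second.get();
            start = end + 1;
        }
        return n;
    }
};
```

The same shape maps naturally onto XML: each child name becomes an element name and each value the element's text.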

It allows for a sort of meta-definition of an object. You get a bit of that in COM with the IDL files, but it's not as efficient as having it all together.


And smart pointers: the idea would be to allow easy resource relocation, maybe associated with the virtual paging system, or to have resource pointers that don't use a direct memory pointer, which would make it easy to relocate things, or to access structures remotely somehow.
 

walfin

Democrazy
Location
/dev/null
As for the tree system: since I've been developing my own OS, I integrated a tree system at the very base layer, just above the memory allocation system, and I've found it incredibly useful. Most resources can be accessed through the tree, and it's a very flexible way to represent a data structure, by adding typed, named nodes to a flexible tree.
I wager that std::map is one of the most overused containers in C++.

Why aren't trees implemented in hardware?
 

Architect

Professional INTP
Darn. General idea? Some vague hints?

Elevating writing software out of writing code, to design, architecture and gamification of that.
 

h0bby1

Active Member
I wager that std::map is one of the most overused containers in C++.

Why aren't trees implemented in hardware?

I'm not sure it's necessary to implement this in hardware, but at least to have a standard abstraction of the principle. The interest of a tree system also has to do with being able to use higher-level types, so it doesn't necessarily have to be in hardware, but at least in the libc or libstdc++.

Like FPU math: it depends on the hardware too, as do memory allocation and the stream system, which depend on system- and/or hardware-specific things. It still helps to have them in a standard language, as a standard form of abstraction that can be compiled on any system and hardware through the libc/libstdc++.

They should do this for vector math and multithreading. Consider SSE4 on a quad-core with hyperthreading: for a vector operation over a large array, it can theoretically cut execution time by a factor of about 32 (x4 for the quad core, x2 for hyperthreading with 2 threads per core, x4 for the vector operation applied to 4 scalars at a time). Most hardware supports that nowadays, so it would be nice to have this kind of thing available in the libc.

As for the typing, with trees and flexible meta-classes, it's more about simplifying the whole mess of the MFC/ATL/COM/DirectX/OLE kinds of things.
 

walfin

Democrazy
Location
/dev/null
Elevating writing software out of writing code, to design, architecture and gamification of that.

Mmm. I believe you would have a prototype piece of software too?

They should do this for vector math and multithreading. Consider SSE4 on a quad-core with hyperthreading: for a vector operation over a large array, it can theoretically cut execution time by a factor of about 32 (x4 for the quad core, x2 for hyperthreading, x4 for the vector operation). Most hardware supports that nowadays, so it would be nice to have this kind of thing available in the libc.
I think it's probably possible to write an STL implementation that specialises vector<float> and vector<double> to use both SSE4 & multithreading. Not sure if any already are.
 

h0bby1

Active Member
Yes, you can do that even without the STL at all, but it's not explicit. Modern compilers can somehow detect it if you enable the SSE instruction set, but it's a bit hit-and-miss; the goal is for the compiler to recognise algorithms that can be vectorised, like adding, blending, normalising or multiplying vectors. Since the libc includes the float and double types and the FPU type and instructions, I don't see why it wouldn't include the functions and types that match SSE. And to make full use of SSE, the vectors must be aligned in memory, which can sometimes be bothersome too, but the compiler could manage it; it's already half the case with the intrinsics header, but it would be better with a native C/C++ type.

If you declare an STL vector of floats and then add them together, I'm not sure the compiler will compile that using the SSE instructions =)

like vector<float> v, v1, v2;

v = v1 + v2;

or to compute dot products, normalisation, length, interpolation, etc., to manipulate either 3D vectors or RGBA pixels.

In a loop doing operations that can be vectorised like that, using multithreading can boost performance a lot.

There is a way to do it with the intrinsics functions, but the syntax is a bit weird; it would be better to have the types recognised as native C/C++ types, at least for vectors up to 4D that can make full use of SSE, a bit like a shader language actually.

There are many libs that do this, but having code that is fully cross-compiler can become messy easily.

Having something like that with native C/C++ types:

http://fhtr.blogspot.fr/2010/02/4x4-float-matrix-multiplication-using.html

It could go as far as having something easily transferred even to the GPU.

It's not that hard to do in itself, but it needs either macros to align stack vectors, or the intrinsics, and a whole library coded somewhat manually each time to handle all the vec3/vec4 and matrix operations, which could be integrated into the libc and understood as native types by the compiler.
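For reference, the intrinsics route being described looks roughly like this; a minimal sketch assuming an x86 target with SSE (vec4_add is a made-up helper name, and the unaligned load/store variants sidestep the 16-byte alignment requirement mentioned above):

```cpp
#include <xmmintrin.h>  // SSE intrinsics (x86 only)

// Add two 4-float vectors with a single SIMD instruction (addps)
// instead of four scalar adds.
void vec4_add(const float* a, const float* b, float* out) {
    __m128 va = _mm_loadu_ps(a);   // loadu/storeu tolerate unaligned pointers;
    __m128 vb = _mm_loadu_ps(b);   // _mm_load_ps would need 16-byte alignment
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}
```

A library-level vec4 type would wrap this in operator+ so that v = v1 + v2 compiles down to exactly this.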
 

h0bby1

Active Member
What made C and C++ popular is also modular design: the ability to import/export symbols from dynamically linked libraries, to load executable modules and symbols mapped in memory at runtime, and also the C/C++ runtime, which provides an abstract layer over system functions that the compiler can understand as part of the language. This is also what made C so easy to use.

But if you look into the C runtime library, especially the gcc runtime, you get tons of #ifdefs and preprocessor directives to handle different types on different platforms, gluing the language's abstractions to the system and to the CPU/memory hardware, so that code can be as cross-platform as possible. Because memory allocation, math functions, streams, exceptions, I/O ports, etc. are managed by the libc, you can write C code that doesn't depend on the CPU, memory or system.

And today, if SSE isn't used well, it's like getting no advantage from anything after the Pentium II. The Pentium III is basically a Pentium II plus SSE, and even later CPUs mostly improved the instruction set and added cores. As CPU frequency is mostly hitting a ceiling, improvement now comes from parallelisation, pipelining and vectorisation: doing more per clock cycle with vectorisation and pipelining, and running several cores at a time to multiply what can be done in a cycle.

And SSE is not really well handled. The vector aspect is not only about making arrays of things, but about executing the same operation on all elements of the vector at the same time, and C doesn't really let you express this. It can be done, but it's not defined at an abstract level: you either have to trust compiler optimisation, or use assembler or the intrinsics header.

Nowadays any application that deals with multimedia, whether imaging, video, audio or 3D, will have to use vector and matrix types, and CPUs have been made to handle these pretty efficiently with SSE, yet there is no real standard way to use the SIMD features. On Mac they made the AltiVec thing to handle vector math, but now it should be integrated directly into the libc: vec2/vec3/vec4 types in integer, float and double, and all the operations on them that SSE supports. Even integrating DCT/FFT routines wouldn't hurt; most of the time it's much more efficient at every level than the standard FPU, even for simple scalars.

And regarding how far they pushed the dynamic linking of objects, it wouldn't hurt either to have a more flexible data type with meta-class data to share information between modules, and to be able to represent complex types flexibly at runtime. Dynamic linking has many flaws and is very light on typing: a DLL export doesn't say much about the type of a function, its calling convention, or the types and number of its parameters. There is mostly just the name of the symbol, and the client software needs the correct declaration to make sense of the imported symbols. That's why they did COM, to add a bit more flexibility to this, but it's not all that great. Similar things could be integrated as native functionality; I'm not sure they had MFC/ATL/DirectX/OLE in mind when they developed C++ =)
 

h0bby1

Active Member
Also, not sure it counts as a paradigm shift, but if you take C and C++ in the context they were made in, they are mostly designed for developing single-threaded console applications. If you want to develop a console application, you have everything you need wrapped in the libc and the C language (file system, memory allocation, stdin/stdout, string manipulation and basic math functions), so you can code a console application in a completely cross-platform manner.

And C is pretty much the only language that can be used to develop an operating system from the lowest level without using any asm or hardware/CPU-specific thing. That was the goal of C originally: to provide a standard language for coding operating systems and applications without depending on anything CPU- or architecture-specific.

Bring that up to today: most applications are windowed, multithreading is becoming more and more common, and there is nothing in the libc to handle that, so you have to deal with system-specific things, which is not really in the original spirit of what C was for.

It would be totally in line with the original thinking of the C language to have some kind of window and graphics management in it, like stdin/stdout and printf. Any C application is supposed to run in the context of some kind of console with an input and an output, and that could be transposed to a windowed environment: an equivalent of the stdio made to handle windows, so that C programs are natively made to run inside a window, manage events, and draw output inside a windowed environment.

And the importance of the libc is not to be neglected; it's what makes C so powerful and able to bypass assembly/system-level programming. The STL is not equivalent, because the compiler doesn't understand the STL objects and functions. For example, if you write "c = cos(x);" you don't say 'call the cos function with parameter x' but 'do a cosine', so if you have "c = cos(x); s = sin(x);" the compiler can use the FPU instruction 'fsincos' to compute both, because it knows what the function is about, and it can do global-scale optimisation on a whole routine. Moreover, the compiler will often be able to replace an algorithm with a call to a libc function: if you write a loop to set all elements of an array, it can replace it with a memset. That's not the case with the STL. These are trivial examples, but they're still an important dimension of how the C language works, and C++ as well: having a base toolbox of basic operations a program should use to avoid CPU- or system-specific code.

Say you had a calling convention like __async, in the same manner as stdcall or cdecl. The compiler could automatically insert calls to libc functions to create threads, manage concurrent access to the variables inside a loop more or less automatically, have a compiler-level understanding of the asynchronous dimension of the program, handle all the locking and parallelisation at a global program level, and organise the memory sharing and interlocking itself from a language specification. That could also include vectorisation with a native C type if the CPU can handle it. It wouldn't even be a paradigm shift, just applying the original C paradigm to modern hardware and system requirements: being able to develop applications in a multitasking, multithreaded, windowed environment with the same ease C was supposed to provide for console apps.
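Something close to this already exists in OpenMP, where a pragma asks the compiler to parallelise a loop and manage the threads and locking itself. A minimal sketch (sum_squares is a made-up example name; the reduction clause handles the concurrent access to sum):

```cpp
// Sum of squares 1..n. The pragma splits the loop across cores, and
// reduction(+ : sum) gives each thread a private sum merged at the end.
// Compile with -fopenmp to enable the parallelism; without it the pragma
// is simply ignored and the loop runs serially with the same result.
long long sum_squares(int n) {
    long long sum = 0;
#pragma omp parallel for reduction(+ : sum)
    for (int i = 1; i <= n; ++i)
        sum += (long long)i * i;
    return sum;
}
```

The annotation style is exactly the "compiler understands the asynchronous dimension" idea: the source states what is parallel, and the compiler and runtime decide how.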

Nowadays it would be great to have an equivalent of the stdio, like win_in and win_out, with the basics of graphics rendering embedded in the libc or libstdc++. Even if they didn't map the whole possibility of each system and weren't the most performant, it would be really cool to have some sort of standard for manipulating images, dialogs, basic rendering and windowing. It is a bit different from a console environment: in a console environment the program is run from the console, whereas in a windowed environment the program creates its own windows by itself rather than running 'from' one. It would need the user to create windows and then launch applications into them, but it could be arranged to have a handle to a root node like the desktop, to create windows and manipulate rendering contexts from within C/C++ with a standard interface, at least for basic operations of imaging, blitting and simple dialogs, like the stdio is for console applications.
 

Architect

Professional INTP
Mmm. I believe you would have a prototype piece of software too?

Yes. I'm close to using the tool to develop the tool.
 

bartoli

Member
Location
France
Elevating writing software out of writing code, to design, architecture and gamification of that.

I have had the thought for a few years of something that would work by the idea 'Design is Implementation': if you can model all the behaviours you want in a visual way, then implement 'targets' for multiple architectures, you can skip the programming part of a project and obtain more robust programs (the design mistakes will still be there, but the mistakes of implementing the design in a programming language will be almost nil), with 'perfect' documentation (the design is the documentation, for someone who can read it).
I have the hope that, with time, I can somehow put together all the function libraries and algorithms I use and create, to grow such a tool.
 

h0bby1

Active Member
Stuff like that already exists, like Pure Data or vvvv.

But I don't believe much in that kind of thing, because at the end of the day it still requires developer skill to understand how the different modules work and can interact with each other. If you have large arrays of data to manipulate, that kind of tool is likely to be rather slow. All together they give software that is never very user-friendly or very performant; even if they are slightly less technical to use, they still require too much skill to be really accessible to neophyte users, and still require developers to operate them. And I'm not sure there is much of a net time gain versus scripting or direct programming, except when the thing is designed for a particular scope, like specific tools; it can never efficiently replace programming languages in terms of the ratio of performance to the scope of possibilities they offer.
 

John_Mann

Active Member
Location
Brazil
To achieve strong AI, what's most important: hardware or programming? Hardware is almost reaching the speculated brain capacity in calculations per second, but is software following the evolution of hardware at the same rate?

And if strong AI is possible (in organic carbon it's a reality, at least), how will it emerge in silicon? By intentional programming or by chance?
 

walfin

Democrazy
Location
/dev/null
To achieve strong AI, what's most important: hardware or programming? Hardware is almost reaching the speculated brain capacity in calculations per second, but is software following the evolution of hardware at the same rate?

And if strong AI is possible (in organic carbon it's a reality, at least), how will it emerge in silicon? By intentional programming or by chance?

Hardware wise the computer is too different from the brain. The brain is massively, massively parallel.

I think the first step is to increase concurrency and efficient task distribution in programming.
 
Location
California
I think the next big paradigm in programming will come out of bioengineering. There are teams working on programming genes and I think this world of chemical reactions, complex systems, and emergence will lead to new programming paradigms.

Then we all die because there was a bug.
 

scorpiomover

The little professor
Elevating writing software out of writing code, to design, architecture and gamification of that.
FYI, I applied for a job with a proprietary system that the interviewer said did all that, about 15 years ago. Pretty nifty system, by the sounds of it.
 

The Introvert

Goose! (Duck, Duck)
Location
L'eau
This may not be relevant but I think it would be really cool if you could physically manipulate the program.

Like if you had a special pair of gloves that allowed you to interact with the technical world. Assign clipping to whatever and when you come in contact it would stop your progress.

Or a self-assembling system, based on previous code (like its own genetic code). Is that already a thing?
 

scorpiomover

The little professor
This may not be relevant but I think it would be really cool if you could physically manipulate the program.

Like if you had a special pair of gloves that allowed you to interact with the technical world. Assign clipping to whatever and when you come in contact it would stop your progress.
Sounds cool. A friend was showing me pictures of his holiday on his iPhone. I could just use 2 fingers to expand the picture. I'd seen it done before. But it still reminds me of how Tom Cruise's character manipulated images in Minority Report. Be really cool if you could manipulate everything like that.

Or a self-assembling system, based on previous code (like its own genetic code). Is that already a thing?
Yes. But they're going away. I used to work on IBM's AS/400s. You could put in a basic program in RPG/400 and just reference, at the top, which tables you were using and how you were using them: read only, write only, etc. The compiler would produce the finished program, with a compilation report. The report showed that the AS/400's compiler would add in all the fields automatically, and would generate the read and write functions automatically. Same for screens.

The nifty thing about this, was that the code would optimise based on what you'd already written. If you didn't use a field in your program, or you added 10 tables but only used 9 in your program, then the code would not bother to add the extra unnecessary code. So you could write your code, and the compiler would self-assemble, based on previous code.

If you needed to move some fields to a different table, you just did, and then recompiled the program. The compiler would automatically re-write the code for you.

It would also warn you of any potential errors that you forgot to check for, like if you forgot to add in a variable to hold the end-of-file indicator for 2 files, it would not compile, and the report would list both errors. That way, it forced you to add the variable to catch the end-of-file situation, which made you add in the code to cover the possibility. So it encouraged you to design code better.

You could also use "include" statements, like in PHP. Only with RPG, the includes would be automatically added by the compiler, and compiled for performance.

Performance was important for an AS/400, because one AS/400 would run 600 users. Recompiling the program while the system was in live operation would normally take up enough CPU that the workers could not do their jobs, and the compilation took ages as well.

SQL was available as well. But coding with RPG was just so easy, and so flexible, that it wasn't worth it, if you could write it in RPG.

Then IBM started pushing SQL, and moving away from RPG. I was not happy.

Microsoft's template wizards do this as well. But with Microsoft's template wizards, you follow the wizard, the basic template is created, and then you have to change it to optimise it. Every time you have to change the code, you either do it manually, rewrite the code, or find some other way to get the wizard to not delete your existing work. Having used both, I prefer the AS/400 way.
 

Architect

Professional INTP
FYI, applied for a job with a proprietary system that the interview said did all that, about 15 years ago. Pretty nifty system, by the sounds of it.

The idea's been around for a while, but nobody has pulled it off yet.
 