Development on &UNIX;
Bernd Pol
Some Historical Remarks
From the beginning, &UNIX; has maintained two very different development paradigms. One is the world of system and application programming languages, in which source code is translated into machine code by a translation program, usually a compiler, or executed by an interpreter. The programming language C is an example. &UNIX; was the first operating system kernel to be written in such a high-level language instead of the tightly machine-oriented assembler that had been common before. (In fact, the C language was originally invented to write the &UNIX; kernel and associated programs on a DEC PDP-11 computer.)
The other paradigm is the world of scripting languages. This world evolved with the invention of the &UNIX; shell, which was the user's interface to the operating system and at the same time a very high-level programming language. A shell script is built from a set of small utility programs like grep, sed, or find. Each such utility is designed for one tightly defined job. The trick is that any such utility can be connected to another one via a simple transport mechanism, called a pipe, which feeds the output of one utility into the input of the next. This makes for a very powerful and highly flexible programming tool.
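As a minimal illustration of this mechanism (the file names and the search pattern are made up for this example), the following pipeline lists all header files below the current directory that mention a given identifier, sorted alphabetically:

    # list all header files below the current directory that mention QWidget
    find . -name '*.h' | xargs grep -l 'QWidget' | sort

Each of the three utilities does one small job; the pipes combine them into a tool that none of them provides on its own.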
As time has gone by, both worlds have evolved. While C is still used mainly as a system programming language, C++, a variant of C enriched by object-oriented and generic extensions, found its place for the development of complex applications in the 1990s. There are numerous other programming languages, and even older ones keep their place; FORTRAN77 and Ada, ⪚, still have their stronghold in numerical applications.
Contemporary Scripting Languages
In the scripting area, there has been a shift away from the shell, which suffers from portability concerns, to languages which unify all commonly needed functionality in their standard libraries, while still being able to interface to the outside through pipes when necessary.
What all these scripting languages have in common is that they are highly portable between &UNIX; variants, Microsoft &Windows;, &MacOS;, and even VMS. Also, they all have implementations that are freely distributable.
&perl;
&perl; has become popular as a text processing and system administration language. In the early days of the World Wide Web, CGI scripts written in &perl; were a widely used method of creating dynamic web pages from databases. Today, this method has mostly been replaced by the mod_perl plugin for the &apache; web server. Among &perl;'s strengths are its built-in support for advanced regular expression matching and its rich archive of freely distributed modules.
For more information see the Comprehensive Perl Archive Network (CPAN) website.
Python
&python; shines by the elegance of its class system and the ease and flexibility with which external libraries can be wrapped in a way that they appear like standard &python; classes and functions. In contrast to &perl;, &python; has a clear and concise embedding &API;, which makes it the language of choice for making C and C++ programs scriptable.
PHP
&php; was invented as a language directly embeddable into &HTML; pages and consequently has its main uses in delivering dynamic content on the web.
Higher-level Scripting
Higher-level &UNIX; applications usually miss out on the speed and flexibility of the traditional character-oriented shell scripting mechanisms. This is especially true in the world of graphical user interfaces (&GUI;) such as ⪚ &kde;.
There have been attempts to provide similar mechanisms which will work on a higher application level, most notably CORBA and, in the &kde; environment, &DCOP;.
The CORBA Protocol
CORBA (Common Object Request Broker Architecture) is an attempt to let computer applications work together over networks. It was devised by the private, vendor-independent OMG (Object Management Group) standards committee.
CORBA-based programs use the IIOP standard protocol to communicate. Implementations based on IIOP are available on a wide variety of operating systems, programming languages, and networks and are thus highly portable.
The main drawback of CORBA is its rather low speed. While this may be tolerable in networks, it is a real hindrance for inter-application communications in a non-networked environment such as &kde; running on a single computer.
The &DCOP; Interface
Another evolution of &UNIX;-like scripting is the &DCOP; protocol, which was devised for communication between &kde; applications to overcome the limitations of CORBA.
&DCOP; stands for Desktop Communication Protocol and is implemented as a simple IPC/RPC mechanism built to operate over sockets. In effect this provides facilities similar to the traditional &UNIX; pipe mechanism.
Traditional shell scripting is based on fairly small tool programs which were designed to work on a strictly textual basis. &DCOP; allows elaborate graphical programs to communicate with each other in a quite similar way. This enables ⪚ a &kde; program to send messages to another &kde; program, or receive data from it for its own purposes.
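As a rough sketch of what this looks like in practice, the dcop command line tool can be used to explore and call such interfaces interactively or from shell scripts. The application and interface names below are only examples; the actual names depend on which &kde; applications are running and on the &kde; version:

    dcop                                     # list all applications currently registered with &DCOP;
    dcop kdesktop                            # list the objects the desktop application exposes
    dcop kdesktop KDesktopIface              # list the functions of one of these objects
    dcop kdesktop KDesktopIface popupExecuteCommand   # call a function: open the "Run Command" dialog

The same calls can be made from within a program through the &DCOP; libraries.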
There are drawbacks, however. To use &DCOP; a program must be designed to contain a special &DCOP; interface, and the &DCOP; communication process runs somewhat slowly (although a lot faster than CORBA). In return, it brings much of the power and flexibility of &UNIX; scripting to high-level programs which are based on a graphical user interface.
For more information, see the DCOP: Desktop COmmunications Protocol paper or The &DCOP; Desktop Communication Protocol library &API; reference of the &kde; dcop library.
Build Systems
Except in very simple cases, a programming project will consist of a lot of building blocks of source code, each put into a separate file for easier maintenance. To get this running, one has to translate all of them into a few machine language units in a suitable format which allows the operating system to load and execute the program.
To accomplish this, the basic tools needed are the following (a short command-line sketch of these steps is shown after the list):
a text editor to write the source code files,
a translating program, usually a compiler, to turn the source code into object files,
a librarian which collects object files into libraries to reuse them easily without the need to recompile,
a linker which binds several object files and libraries together into one executable,
a make system which provides some means to manage all these files, and, not to forget,
a debugger to (hopefully) find all errors in the program and possibly some other diagnostic tools to get everything running smoothly.
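To give a rough idea of how these tools play together, here is a command-line sketch using the GNU toolchain; the file and library names are made up purely for illustration:

    g++ -c foo.cpp                    # compile one source file into an object file (foo.o)
    g++ -c main.cpp                   # compile another source file (main.o)
    ar rcs libfoo.a foo.o             # collect object files into a (static) library
    g++ main.o -L. -lfoo -o myprog    # link object files and libraries into one executable
    gdb ./myprog                      # examine the result in the debugger

The make system described below automates exactly this kind of sequence.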
When you have a large project consisting of possibly hundreds of source code files, the process of compiling may become quite laborious. You do not want to recompile all files each time you have changed only some of them. Instead, you only want to compile those files which are affected by the changes. However, it is not always obvious which files have to be recompiled.
When you ⪚ change a function prototype in a header file, you need to compile every file which includes this header file. If your project contains many such files, you may easily miss one or two of them if you do the job manually. Thus some means of automation is necessary.
The Make Process
A tool which takes care of recompilations is make. It keeps track of all work using a set of rules which describe what to do in case some piece of information (usually a source or object code file) was changed. All rules belonging to a certain project are stored in a so-called Makefile which is processed by make any time you want to update your work.
Each rule consists of several building blocks, namely
a target, &ie; the file to be built
a set of dependencies, basically the names of those files the target depends on (⪚ the name of a source file, in which case the target will be the name of the object file to be built), and
the commands which are to be executed to make the target (&ie; to compile it or to link other object files together to build an executable program file).
Basically the make command will read the rules one after another, check each file in the dependency list of a given target and make this target anew if any one of these files has changed, using the commands listed in that rule.
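A minimal Makefile built from such rules might look as follows. The file names and compiler calls are made up for illustration only; real &kde; projects use a far more elaborate, automatically generated build system:

    # link two object files into the final program
    myprog: main.o foo.o
            g++ main.o foo.o -o myprog

    # each object file depends on its source file and on a shared header
    # (note: the command lines in a real Makefile must be indented with a tab character)
    main.o: main.cpp foo.h
            g++ -c main.cpp

    foo.o: foo.cpp foo.h
            g++ -c foo.cpp

With such a Makefile, touching foo.h causes make to rebuild both object files and relink myprog, while changing only main.cpp rebuilds just main.o.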
There are several additional possibilities to control such a make process, and a Makefile can thus grow very complex. We cannot go into the details here. However, we recommend that you make yourself accustomed to the syntax of make. Even if you do not normally use it directly, an understanding of the fundamentals of the build system can be useful. See the GNU Make Manual for more information.
For more &kdevelop;-specific detail, see the Building and Project Management chapter of this manual.
There are several tutorials available; see the references in the Building and Project Management chapter.
&GUI; Development
Application developers are further encumbered by having not only to create program libraries and logic, but also to provide an easy-to-use, custom-built user interface that is both intuitive and functional. Most programmers receive little to no training in &GUI; development, and as a result user interfaces are often poorly designed.
Over the years some common design principles have evolved. It is strongly advised to adhere to them: this way your user interfaces will retain a common look and feel that the users of your application will gratefully appreciate.
For &kde; &GUI; development there is a style guide available. It is found in the &kde; User Interface Guidelines on the &kde; Developer's Corner page.
A short introduction to common &GUI; design principles can be found here.
Integrating Concepts and Tools – the IDE
There are separate tools available for almost any step in the programming process: planning, editing, managing files and compilation processes, debugging, documentation and the like. But once a project grows, the development process will most likely become quite cumbersome.
Much repetitive work has to be done when designing, compiling, and debugging a program. A lot of this work can be saved through the use of templates and scripts, and a lot more by keeping these tools easily available and able to communicate with each other under a common &GUI;.
For example, would it not be convenient if the debugger were able to open the source file in question in an editor and place the cursor directly at the position of the bug just found?
To more easily accomplish such a scheme, Integrated Development Environments (&IDE;s) were devised. Such an &IDE; integrates all templates, tools, and scripts which are commonly needed in the development process into one single environment.
For the &kde; platform &kdevelop; is such an &IDE;. It provides a wide range of tools which ease program development and maintenance, even for different programming languages and across platforms.
Basic Features of &kdevelop; &kdevrelease;
Manages all development tools needed for C++ programming, such as compiler, linker, debugger and build system.
Provides an &appwizard; which generates complete, ready-to-go sample applications.
Allows the user to select an integrated editor based on the &kde; programmer's editor &kwrite;, Trolltech's QEditor, or others.
A class generator for creating new classes and integrating them into the current project.
File management for sources, headers, documentation &etc; to be included in the project.
Assistance in creating application user manuals written with &kde; tools.
Automatic &HTML;-based &API; documentation for a project's classes with cross-references to the used libraries.
Internationalization support, allowing translators to add their target language to a project easily, including support for &kbabel;.
Support for managing a project via one of several versioning systems (⪚ &CVS;) by providing an easy-to-use frontend for the most needed functions.
An integrated debugger frontend.
An integrated shell console emulator.
Syntax highlighting in source texts.
An auto-code completion facility for class variables, class methods, function arguments and more.
Templates for creating various projects (&kcontrol; modules, &kicker; (panel) applets, KIOSlaves, &konqueror; plugins and desktop styles).
Four navigation tree views for easily switching between source files, header files, classes and documentation, obviating the need for an external file manager.
Cross-compiling support, with the ability to specify different compilers, compiler flags, target architecture, &etc;
Support for Qt/Embedded projects (such as the Zaurus and iPAQ).
Inclusion of any other program you need for development by adding it to the Tools menu according to your individual needs.