    dmc { { @responsefile } { file } { switches } }

where the { } means "repeated 0 or more times". DMC with no arguments prints a short help file to stdout.
    ;Comments are lines where the first non-blank character
    ; is a ';'
    [Environment]
    INCLUDE=c:\dm\bin
    LIB=c:\dm\lib
    CFLAGS=-v -w
    ;Note that %PATH% gets replaced by the previous value of
    ; the environment variable PATH.
    PATH=c:\dm\bin;%PATH%

The special environment variable @P gets replaced with the path to where the sc.ini file resides. For instance, the above can be replaced with (if sc.ini is in c:\dm\bin):
    [Environment]
    INCLUDE=%@P%..\BIN
    LIB=%@P%..\LIB
    CFLAGS=-v -w
    PATH=%@P%..\BIN;%PATH%

which makes the settings in sc.ini independent of where the DMC directory tree is installed.
If sc.ini is not there, no error results. This feature is useful for avoiding cluttering up AUTOEXEC.BAT with environment variable settings. Not only that, it will make running DMC independent of any existing environment variables set for other tools.
The environment settings in sc.ini do not prefix, augment, or append any existing settings in the environment. They replace the environment settings for the duration of running the IDDE or the compiler. If you wish, for example, to use sc.ini to append an INCLUDE path to the existing INCLUDE path, it can be written as:
    [Environment]
    INCLUDE=%INCLUDE%;c:\dm\include
    .c .cpp .cxx .cc .c++ .asm .s .rc

The extension of the file determines what is done with the file.
Extension | Action |
---|---|
no extension, any other extension, .c | Run C compiler |
.cpp .cxx .cc .c++ | Run C++ compiler |
.asm .s | Run assembler |
.lib .a | Include file as a library to the linker |
.obj .o | Include file as an object file to the linker |
.exe .com .sys .dll | Include file as the output file to the linker |
.def | Include file as the module definition file to the linker |
.rc | Run resource compiler |
.res | Include file as the resource file to the linker |
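For example, a hypothetical command like dmc main.cpp helpers.obj util.lib app.def app.res app.exe runs the C++ compiler on main.cpp, passes helpers.obj and util.lib to the linker as an object file and a library, uses app.def as the module definition file and app.res as the resource file, and names the linker output app.exe.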
Alignment can also be controlled within a source file by using #pragma pack().
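For instance, a minimal sketch (the struct and member names are invented) that packs one structure on byte boundaries and then restores the ambient alignment:

    #pragma pack(1)             /* align the following members on 1-byte boundaries */
    struct PackedRecord {
        char tag;
        long value;             /* no padding inserted before this member */
    };
    #pragma pack()              /* revert to the alignment otherwise in effect */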
Effects on C and C++:
Effects on C:
Effects on C++:
An initialization such as:

    char abc[4] = "1234";

where there is no room for the trailing 0, is invalid.
Because code generated for exception handling (EH) is not quite as good ('this' pointers are not placed in registers, tables are generated), the default is that EH is not enabled. To enable EH, compile with the -Ae switch. Compiling for ANSI C++ (-A) will also enable exception handling support.
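As a small, hedged example (the file and function names are invented), a module that throws and catches exceptions would be built with -Ae:

    // eh_demo.cpp - build with exception handling enabled, e.g. dmc -Ae eh_demo.cpp
    #include <stdio.h>

    static void risky(int fail)
    {
        if (fail)
            throw "failure";            // throwing requires EH support in this module
    }

    int main()
    {
        try {
            risky(1);
        } catch (const char *msg) {     // handled within -Ae compiled code
            printf("caught: %s\n", msg);
        }
        return 0;
    }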
To compile C code that is to be mixed into C++ code that handles exceptions, compile the C code with -Ae also.
Objects compiled with -Ae may be linked with objects compiled without -Ae, provided that no exceptions are thrown in the resulting program that are not handled entirely within objects compiled with -Ae.
The tables are generated in the same segment using the same rules as the virtual function pointer tables. Using -NV will place the tables in far segments (for 16 bit memory models).
For link compatibility with Microsoft C++ compiled DLLs while still using RTTI, compile with -ER.
Virtual function call compatibility is not affected by RTTI.
It is best to avoid using RTTI on classes that are pulled in from a .LIB, .OBJ or .DLL where they were compiled with a different compiler.
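For illustration (the type names are invented), a dynamic_cast relies on these generated RTTI tables; per the note above, build with RTTI enabled and keep all the classes involved compiled by the same compiler:

    #include <stdio.h>

    struct Base    { virtual ~Base() { } };   // polymorphic, so RTTI tables are emitted
    struct Derived : Base { };

    int main()
    {
        Base *b = new Derived;
        // dynamic_cast consults the RTTI tables generated for these classes.
        if (Derived *d = dynamic_cast<Derived *>(b))
            printf("b points to a Derived (%p)\n", (void *)d);
        delete b;
        return 0;
    }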
#undef strcmp /* use runtime library version of strcmp */
    file.obj : $(file.dep) file.c
        dmc -c file -d
#define macro "this is a string"
Win32 implementations have an emulator for the numeric coprocessor, so -f is the default for the -mn memory model.
No Pentium Pro or Pentium MMX chips have this bug.
-ff will also cause the folding of floating point constants even if they would generate an exception. For example, 0.0/0.0 will be folded at compile time instead of at runtime.
Use -ff -fd in combination to get fast inline code with the FDIV bug workaround.
The default, -g, only generates debug info for types of variables that are actually used by a module. This greatly cuts down on the size of the object file. In addition, since C++ class hierarchies tend to be very complex, -g generates class debugging information only for classes for which a corresponding virtual function table is generated.
-g is the right choice for programs that do not reference libraries or DLLs for which type info is needed.
Sometimes, though, if part of a class implementation lies inside a DLL, the debug info for that class may never get generated. Thus, the -gf flag exists to cause full debug info to be generated for each class that is referenced by generated code.
Even -gf can be a bit problematic, since it can cause large quantities of debug info to be generated. It also never generates debug info for classes that may only be referenced from a linked-in library or DLL that was compiled separately without debug info.
-gh causes debug info to be generated for all global structs and typedefs, regardless of whether they are referenced. One could just use -gh for all compilations, and it will work, but it will be a bit slow.
A better solution is to use -g for all the modules in the program. Then, create a special file TOTAL.C, which has a #include statement in it for each header that references a library or DLL. Compile TOTAL.C with -gh (and the other memory model, etc. switches), and TOTAL.OBJ will contain all the debug info you need for those DLLs and libraries. This will minimize compile/link times, which is important for fast development turnaround.
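A minimal sketch of this approach; the header names below are placeholders for whatever library and DLL headers the program actually uses:

    /* TOTAL.C - compiled once with -gh (plus the program's usual memory model
       and other switches); every other module uses plain -g. The resulting
       TOTAL.OBJ is added to the link and carries the full debug info for the
       types declared in these headers. */
    #include <windows.h>       /* OS / DLL interface types */
    #include "thirdparty.h"    /* placeholder: library built without debug info */
    #include "mydll.h"         /* placeholder: interface to a separately built DLL */

A command along the lines of dmc -c -gh TOTAL.C then produces the TOTAL.OBJ to add to the link.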
OBJ2ASM can be run on any .OBJ file, and it will format and pretty-print any debug info found in the file. This is handy if you're curious as to what the compiler is doing.
If a C++ inline function is outlined, it still remains static, even with the -gg option enabled. This is because name collisions in this case are difficult to prevent. A C++ inline function is outlined if:
__ptrchk must preserve all registers, and must pop its parameter prior to returning (pascal calling sequence). These measures minimize the code expansion resulting from using -gp. Thus, __ptrchk must be written in assembler or inline assembler.
__ptrchk is intended for use by a memory debugger. The idea is that __ptrchk validates the pointer. If the pointer is not valid, it notifies the user at the source of the problem.
There is a default version of __ptrchk in the runtime library.
    void __far _trace_pro_n(void)   // Prolog for near functions
    void __far _trace_pro_f(void)   // Prolog for far functions
    void __far _trace_epi_n(void)   // Epilog for near functions
    void __far _trace_epi_f(void)   // Epilog for far functions

You must provide the code for these functions; typically, they would be used to generate debugging information to support profiling. The functions must preserve all registers and, therefore, must be in assembly language.
The prolog function is called after the stack frame is set up, and the epilog is called just before the stack frame is destroyed.
The runtime library provides versions of these functions that implement dynamic execution profiling.
The -GTnnnn switch is ignored for near data models (t,s,m) and for all 32 bit memory models.
If using -GT, use the same value for all modules. For .ASM modules, be careful about whether referenced global data is in DGROUP or in a far data segment. The symptoms of getting this wrong are frame errors from the linker.
Note that if -GT is used, arrays declared with an empty [], like:

    extern int array[];

are assumed to be far. If they actually are near, declare them using the dimension, or mark them explicitly as __near.
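For example (the array names are invented):

    extern int table[];         /* empty []: assumed to be far when -GT is used */
    extern int counts[16];      /* dimension given: near if it fits under the
                                   -GT threshold */
    extern int __near flags[];  /* explicitly marked near */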
Always be sure to use the same nnnn threshold value for all modules compiled with -GT.
-H also implies -HO (include files only once).
The -HC switch does not apply to -HX precompiled headers, which are always cached in memory for IDDE compiles. The -HC switch is ignored for the command line compiler, which always reads precompiled headers from disk.
The filename.obj file that is also generated should be linked in, as it contains code and data generated when building the precompiled header. If -g is specified, the .obj file will contain all the debug info for the header file(s).
This option is especially useful for including a precompiled header at the beginning of all the source files.
An alternative is to include the directive:

    #pragma once

in each include file that will never need to be parsed more than once.
The scph.sym file is placed in the current directory by default. The -HD switch sets an alternate directory for it.
Switch | Search |
---|---|
-Iabc | Subdirectory abc |
-Id: | Default directory in drive d: |
-Iabc;d: | Both of above directories |
-Ig:\cbx\include\ | Directory g:\cbx\include |
-I. | Current directory |
-I\ | Root directory |
Switch | Language |
---|---|
-j, -j0 | Japanese |
-j1 | Taiwanese and Chinese |
-j2 | Korean |
-j- | no Asian language characters |
When the compiler is run under Win32, the switches correspond to the following operating system locale code pages:
Switch | Code page |
---|---|
-j0 | .932 |
-j1 | .936 |
-j2 | .949 |
These locales determine which bytes are the initial bytes of a multibyte sequence, and control how strings are converted to Unicode strings. If those locales are not supported by the operating system installed on your machine, or the DOS native version of the compiler is running, the lead byte of a multibyte character sequence is:
Switch | Language | Lead bytes |
---|---|---|
-j0 | Japanese | 0x81..0x9F and 0xE0..0xFC |
-j1 | Taiwanese and Chinese | 0x81..0xFC |
-j2 | Korean | 0x81..0xFD |
and the conversion to Unicode is done simply by zero-extending a regular character and stuffing a multibyte sequence into 16 bits.
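As a rough sketch of that fallback behavior (this is not the compiler's actual code, and the byte order of the packed 16-bit value is an assumption):

    /* Fallback conversion: an ordinary character is zero-extended to 16 bits;
       a two-byte multibyte sequence is stuffed into a single 16-bit value
       (here with the lead byte in the high 8 bits - an assumption). */
    unsigned short mb_to_wide_fallback(const unsigned char *s, int is_lead_byte)
    {
        if (is_lead_byte)
            return (unsigned short)((s[0] << 8) | s[1]);
        return (unsigned short)s[0];   /* zero-extend */
    }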
The -j switch is obsolete. The #pragma setlocale() directive is a superior solution.
-Jb disable empty base class optimization

The "empty base class optimization" causes base classes with no members to add no size to the derived class. Compilers 8.27 and earlier would add 1 to 4 bytes of size per empty base class (depending on alignment). Eliminating this can result in significant performance gains with STL. For link compatibility with code compiled with compilers 8.27 and earlier that used empty base classes, the -Jb flag will help. Alternatively, add a "char reserved;" data member to the empty base class definition, which will cause it to have the same storage layout as earlier compilers. (A short example appears below, after the -L switches.)

-Jm relaxed type checking
-Jm- strict type checking (default)

Use for legacy C code which relied on the loose type checking of older C compilers. Do not use -Jm for new code development. -Jm is ignored for C++ compilations, since C++ is heavily dependent on strong type checking.

-Ju char and unsigned char are the same type (unsigned char)
-Ju- char and unsigned char are different types (default)

This is useful for compiling some old legacy C code where the author arbitrarily mixed up the char and unsigned char types. It is equivalent to:

    #define char unsigned char

-Ju does not affect the behavior of runtime library functions like strcmp(), which always behave as if chars are signed. Do not use -Ju for new code development. Using -Ju causes all sorts of problems in C++ code, primarily by goofing up overloading based on char types. Using -Ju for C++ is not recommended.

-J char promotions are unsigned
-J- char promotions are signed (default)

This affects whether objects of type char are sign-extended or zero-extended when being promoted from char to a larger type. It does not affect the signed char or unsigned char types. char is still a type distinct from unsigned char and signed char. If code is sensitive to -J or -J-, then it is non-portable and should be carefully checked to replace the sensitive types with signed char or unsigned char as needed. For instance:

    char c;
    int i = c;

should be replaced with (if it matters):

    signed char c;
    int i = c;

or:

    unsigned char c;
    int i = c;

or:

    char c;
    int i = (signed char) c;
    int i = (unsigned char) c;
    int i = c & 0xFF;

-J does not affect the behavior of runtime library functions like strcmp(), which always behave as if chars are signed. -J causes the macro

    #define _CHAR_UNSIGNED 1

to be defined.

-l[listfile] Generate source listing file

Generates a source listing file with the name listfile. listfile defaults to the name of the input file with the extension .lst added. To show the effects of the preprocessor in this file, specify the -e option also. The compiler then inserts error messages and line numbers into the listing file. There is no space between -l and listfile.

-L Use a non-Digital Mars linker

Digital Mars linkers have some extra features in them (like support for long command lines); -L means a non-Digital Mars linker will be run, so those features cannot be used.

-Llinker Use linker linker instead of the default

This is useful if a special linker is being used, or if the linker is not on the PATH or in the directory where DMC resides. For example, -L\path\mylink tells the compiler to run the MYLINK linker in the \path directory.

-L/switch Pass /switch to the linker

For example, this command:

    sc test -L/packcode -L\test\prog

adds the option /packcode and the library search path \test\prog to the linker's command line.
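As a hedged illustration of the empty base class optimization described under -Jb above (the type names are invented for this sketch):

    #include <stdio.h>

    struct Empty { };                     // base class with no members
    struct Derived : Empty { char c; };   // derives from the empty base

    int main()
    {
        // With the empty base class optimization, Empty contributes no storage,
        // so sizeof(Derived) is typically 1. With -Jb (or with compilers 8.27
        // and earlier) the empty base gets its own 1 to 4 bytes, making Derived
        // correspondingly larger.
        printf("sizeof(Derived) = %u\n", (unsigned)sizeof(Derived));
        return 0;
    }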
-Masm specify assembler to use

MASM is normally the assembler called by DMC when an assembler source file needs to be assembled. Use -M to specify an alternate assembler asm. For example, -M\path\masm tells the compiler to run MASM in the \path directory.

-M/switch Pass /switch to assembler

-m[tsmclvfnrpxz][do][w][u] Select memory model

16 Bit Models

Model | Description |
---|---|
t | Tiny: small model program linked specially to create a .COM file |
s | Small: small code and data (default) |
m | Medium: large code, small data |
c | Compact: small code, large data |
l | Large: large code and data |
v | VCM (Virtual Code Manager) (obsolete) |
r | Rational Systems 16 bit DOS Extender |
z | Zortech ZPM 16 bit DOS Extender (obsolete) |

32 Bit Models

Model | Description |
---|---|
f | OS/2 2.0 32 bit flat model (not supported) |
n | Win32 32 bit flat model (default) |
p | Pharlap 32 bit DOS Extender (obsolete) |
x | DOSX 32 bit DOS Extender (obsolete) |

Memory Model Modifiers

Modifier | Description |
---|---|
d | DOS 16 bit (default) |
o | OS/2 16 bit (obsolete): code will be targeted towards an OS/2 16 bit executable |
w | assume SS != DS |

Use the w flag in combination with the -m option to make sure that the stack segment (SS) is not equal to the data segment (DS). Certain types of executables require that the DS and the SS be different. These are: