Sunday, May 27, 2007

The Road to KDE 4: Konsole Gets an Overhaul

Again, after a delay, we return to bring you updates on the state of Konsole, KDE's UNIX terminal program. Konsole has been a staple of KDE since KDE 1.0, and has been showing signs of clutter and wear. So Robert Knight, the maintainer in charge, has stepped in to clean up the program's code and, more than anything else, fix a cluttered and difficult interface.

When the KDE 4 development series commenced, Robert Knight took over maintainership of Konsole. The end result is a Konsole for KDE 4 that is visually very similar, functionally improved, and with a settings system you can actually stomach. The screenshot shows that the main window has not changed much. The once-intimidating settings menu is now very simple. It may look like the configuration options are gone, but they are all still available, in a sanely organized fashion.

And lastly, appearance-minded people will appreciate the new styles dialog: its implementation is effective and its use is obvious. Additionally, Robert has implemented style previews in an intuitive manner: as you mouse over a style, the Konsole window in the background applies it as a live preview, so you can very rapidly look through and appreciate the styles just by hovering over the list. Side-by-side comparisons aside, Konsole also offers a number of other improvements, among them a split-view mode, faster scrolling (thanks to a smarter line redrawing scheme), hotkeys and more.

Saturday, May 26, 2007

Wubi makes Linux on Windows simple

Computer giant Dell made big news yesterday when it began shipping desktops and laptops pre-installed with Ubuntu, a popular Linux distribution. In terms of prepackaged operating systems, users have long been stuck choosing between Windows and the Mac. The fact that consumers can now purchase a Linux machine with support from the distributor certainly validates the open-source movement, and Ubuntu in particular.

Yet if you're curious about Linux or Ubuntu, it hasn't been very simple to install it on your Windows machine. Up until recently, the process has usually involved partitioning your hard drive for the Linux install and creating a boot CD from a downloaded ISO file.

Luckily, for those of us who are Linux-curious and either too lazy or too inexperienced to install a distro on our own, a beta application called Wubi takes all of the hassle out of running Ubuntu on your Windows machine, and it can be removed from your system as easily as any other Windows application. All it takes to run Wubi is a recommended 1GHz CPU, 128MB of RAM, and 3GB of disk space for the initial Ubuntu installation.

Monday, May 21, 2007


Creating a Dial-Up Connection

In this age of broadband and Wi-Fi, I know only the unlucky few still go for dial-up. Anyway, Linux is meant for everyone, and on that simple note let me begin by first explaining what a dial-up connection is. A dial-up connection requires your computer's modem to dial a phone number that connects to your Internet Service Provider (ISP). Your computer may have a modem built in, or you may have to use a modem card if your system is a laptop.

To create a new modem connection on a Red Hat system:

1. From the Applications menu (the main menu on the panel), choose System Tools => Internet Configuration Wizard.

2. A window opens requesting your root password. Enter it and click OK.

3. The Add new Device Type window opens.

4. The left pane lists each of the possible connections. Choose Modem connection and click the Forward button.

5. Red Hat Enterprise Linux probes for a working modem installed on your computer. If it does not find one, make sure that you have a hardware modem, and that it is installed correctly.

6. The next screen contains three drop-down boxes.

o Modem Device — This menu lists all modems installed on your system. Select the correct modem for this connection.

o Baud Rate — Select the speed of your modem. This list is in bits per second: a 56.6K modem is listed as 57600.

o Flow Control — Select Hardware if your modem is a hardware device installed on your system.

When finished making your selections, click the Forward button.

7. The next screen requires information about your ISP. For users in the United States, type the correct information in each field:

o Prefix — Enter any prefix necessary to reach your access number. This might be "1" if the call is long distance, or "9" to reach an outside line.

o Area Code — Enter the area code of your access number, if necessary.

o Phone Number — Enter the access number for your ISP. Do not enter "dashes" in the number.

o Provider Name — Enter the name of your ISP.

o Login Name — Enter the login your ISP has given you.

o Password — Enter the password for your account.

Finally, click the Forward button.

8. On the next screen, select whether your system should obtain its IP address settings automatically or use the same IP address every time you connect. To use the same address every time, enter the address your ISP gave you into the Manual IP Address Settings fields. Click the Forward button to continue.

9. The final screen is a summary of the values you have entered. Review them to be sure that all settings are correct. Click the Apply button to create your new connection.

10. The Network Configuration screen appears, showing your new connection along with a list of all other connections on your machine.

11. Test your new connection by clicking the Activate button. If your connection works, go to File => Save from the Network Configuration main toolbar, then exit the application. If your connection does not work, click the Edit button to change the settings.

To access your connection later, select Applications (the main menu on the panel) => System Settings => Network. The Network Configuration window opens. Select the appropriate connection and click Activate.
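If you would rather skip the GUI entirely, the classic command-line dialer wvdial reads the same settings from one small file. Here is a minimal sketch of /etc/wvdial.conf; the device, phone number and account details below are placeholders you must replace with your own:

[Dialer Defaults]
Modem = /dev/ttyS0
Baud = 57600
Init1 = ATZ
Phone = 5551234
Username = your_isp_login
Password = your_isp_password

Then run wvdial as root to bring the link up.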

Sunday, May 20, 2007

WHAT TO DO WHEN X WINDOW FREEZES

If your X Window System freezes sometimes, here are two ways you may try to kill your X server. The first is the simple way: the key combination Ctrl+Alt+Backspace.

The second way is a little more complicated, but it works most of the time. Hit Ctrl+Alt+F2 to switch to a virtual console, then log in with your user name and password and run:

$ ps -ax | grep startx

This gives you the PID of the startx process behind your X server. Then just kill it with:

$ kill -9 PID_Number

To go back to your first console, just hit Alt+F1.
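If your system has the pgrep utility (part of the procps package on most distros), you can combine the two steps into one. A sketch, assuming X was started via startx:

$ kill -9 $(pgrep -f startx)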


Friday, May 18, 2007

C PROGRAMMING IN LINUX

If you are new to Linux, you may be in a dilemma about how to code in C. Remember that Linux is written almost entirely in C, and C itself was developed in order to build UNIX! So coding in C is not an arduous task at all. The following steps will help you achieve it.


1) Create a new file with the extension .c in your folder.

2) Open the file and write the program. Finish by saving it (e.g., file1.c here).

3) Right-click on the desktop and open a terminal.

4) In the terminal, change into your folder using the cd command.

5) Type gcc file1.c and press Enter. If there are errors, they will be listed here.

6) Successful compilation creates an executable file, a.out. Run it with the command ./a.out; the result is displayed in the terminal.


NOTE:- In Linux, DOS-specific header files like conio.h are not present. Including them in your program will cause compilation errors. Also, main should return a value to the OS.
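Putting the steps together, here is a minimal sketch you can paste into a terminal. It writes a tiny file1.c (note main returning 0 to the OS, as the note above says), compiles it, and runs the result:

cat > file1.c << 'EOF'
#include <stdio.h>

int main(void)
{
    printf("Hello, Linux!\n");
    return 0;    /* main returns a value to the OS */
}
EOF
gcc file1.c      # any compilation errors would be listed here
./a.out          # run the executable; output appears in the terminal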

THE SHELL INTERFACES


Common ways to get into a shell are:

  • No GUI - if your system has no GUI, you must enter commands from the shell.
  • Terminal - if a GUI is loaded, open a terminal by right-clicking the desktop and selecting Terminal.

Default prompts:

  1. If you are a regular user, you will see a dollar-sign prompt, $, when you open the terminal.
  2. If you are the root user, you will see a hash prompt, #.

The concept of virtual consoles:

In Linux there are six virtual terminals in which you can log in. They can be reached by pressing CTRL+ALT+(F1/F2/ ... F6); your graphical session runs on a seventh (CTRL+ALT+F7). These virtual terminals are useful when a GUI program hangs: open any one of them, log in (as the same user who owns the GUI session, or as any other user), then run ps -e to find the offending process and kill it with the kill command. Being able to log in on several terminals at once, each running its own processes, demonstrates Linux's multi-user, multitasking design. Once in a virtual console, you can switch back to the GUI by pressing ALT+F7, or start a new GUI from any of the consoles. A worked example follows below.
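For instance, suppose a GUI program (say, a hypothetical frozen xclock) has hung your desktop. Press CTRL+ALT+F2, log in, and:

$ ps -e | grep xclock        # note the process number, say 4709
$ kill -9 4709               # force-kill the hung program

Then press ALT+F7 to return to your (now responsive) graphical session.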

BASIC LINUX COMMANDS

The following are the basic Linux commands, and are listed with their uses.

1) ls :- list the contents of the current directory. Only files that are not hidden are listed.

2) ls -la :- list the hidden files too, with file permissions displayed.

3) cd :- change directory to the specified one (eg: cd /tmp, where /tmp is the specified directory).

4) cd .. :- change to the parent directory.

5) pwd :- print the present working directory.

6) ps -e :- list all the currently running processes with their process numbers.

7) kill :- kill a process (eg: kill 4709, where 4709 is the process number).

8) mkdir :- make a new directory (eg: mkdir /new).

9) su :- if you are a regular user, typing su followed by the root password gives you the permissions of the root user.

10) df -h :- list the current usage of all drives mounted on your Linux system and the free space available.

Tuesday, May 15, 2007

ABOUT LINUX DISTRIBUTIONS

Distribution

Free software projects, although developed in a collaborative fashion, are often produced independently of each other. However, given that the software licenses explicitly permit redistribution, this provides a basis for larger scale projects that collect the software produced by stand-alone projects and make it available all at once in the form of a Linux distribution.

A Linux distribution, commonly called a "distro", is a project that manages a remote collection of Linux-based software, and facilitates installation of a Linux operating system. Distributions are maintained by individuals, loose-knit teams, volunteer organizations, and commercial entities. They include system software and application software in the form of packages, and distribution-specific software for initial system installation and configuration as well as later package upgrades and installs. A distribution is responsible for the default configuration of installed Linux systems, system security, and more generally integration of the different software packages into a coherent whole.

A typical general purpose distribution includes the following:

  1. A boot loader: A piece of software that is loaded by the system's firmware (the BIOS in the case of a PC) and then performs the actions needed to load and start the Linux kernel. Often a menu is presented that allows the user to select which operating system to load. The most common boot loaders for the PC architecture are LILO and GRUB.
  2. The Linux kernel: The core or heart of the operating system. The name of the OS comes from here.
  3. Boot scripts, disk/storage maintenance tools, authentication tools, and scripting languages: They are administration tools, usually considered part of the operating system.
  4. GNU C Library and, optionally, the GNU Compiler Collection: The development tools, used to assist or develop applications.
  5. GNU bash shell, the X Window System networking and display protocol, and an accompanying desktop environment such as KDE, GNOME, or Xfce: The shells and graphic systems, used for interacting with the user.
  6. Application software packages: There are hundreds of them in most distributions (thousands in bigger distributions, like Gentoo, Fedora, Debian, etc.), from office suites to web servers to media players to 3D computer graphics software to text editors and scientific programs. They may come on a storage medium, like a DVD, or, more commonly, be available in online repositories.
  7. Package management software: Created specifically for the distribution, for organizing all software and handling downloads, installation, upgrades and security fixes seamlessly (see the sketch after this list).
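As a sketch of item 7 in practice: on a Debian-based distribution, the apt package manager performs all of these tasks with a couple of commands (the package name vlc is only an example):

# apt-get update             # refresh the package lists from the repositories
# apt-get install vlc        # download, resolve dependencies and install
# apt-get upgrade            # pull in updates and security fixes for everything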

As well as those designed for general purpose use, distributions may be specialized for different purposes including: computer architecture support, embedded systems, stability, security, localization to a specific region or language, targeting of specific user groups, support for real-time applications, or commitment to a given desktop environment. Furthermore, some distributions deliberately include only free software. Currently, over three hundred distributions are actively developed, with about a dozen distributions being most popular for general-purpose use.

MAJOR MILESTONES & ACHIEVEMENTS

Milestones

The Linux kernel and the GNU system formed the basis for an operating system which has since been completed by the efforts of numerous members of the free and open source software community. Significant milestones include:

  • The launch of the KDE desktop environment by Matthias Ettrich in October 1996 followed by the comparable GNOME alternative by Miguel de Icaza in August 1997, both based on the X11 windowing system developed at MIT. GNOME and KDE became Linux operating system shells, responsible for the direct contact with users.
  • The release of the Netscape browser's source code on March 31, 1998, which kicked off the Mozilla project that would eventually give birth to the popular Mozilla Firefox browser.

OpenOffice.org 2.2 - Writer : Word processor component of the multi-platform free software office suite.

  • The release of StarOffice by Sun Microsystems which in June 2000 became the base for the free software OpenOffice.org office suite, a major event in the open source office world.
  • The growth of commercial interest in Linux is similarly marked by notable events: the launch in February 1998 of the Open Source Initiative; the announcement in July 1998 by Oracle Corporation that it would port its well-known database software to Linux and provide support for it; the IPOs of Red Hat on November 11, 1999 and VA Linux the following month which would create a speculative bubble; the wide-scale support of technology giant IBM that would spend millions of dollars on Linux, employing in 2005 close to 300 developers of the Linux kernel, and would organize starting in 2003 the legal defense for the SCO vs. Linux controversy against the attacks of the SCO Group that claimed copyright over the Linux kernel; and finally the acquisition in October and November 2003 of Ximian and then SuSE by the American technology company Novell.

Today Linux is used in numerous domains, from embedded systems to supercomputers, and has secured a place in server installations with the popular LAMP application stack. Torvalds continues to direct the development of the kernel. Stallman heads the Free Software Foundation, which in turn develops the GNU components. Finally, individuals and corporations develop third-party non-GNU components. These third-party components comprise a vast body of work and may include both kernel modules and user applications and libraries. Linux vendors combine and distribute the kernel, GNU components, and non-GNU components, with additional package management software in the form of Linux distributions.

Development


A graphical history of Unix systems. Linux is a Unix-type system but its source code does not descend from the original Unix.

A 2001 study of Red Hat Linux 7.1 found that this distribution contained 30 million source lines of code. Using the Constructive Cost Model, the study estimated that this distribution required about eight thousand man-years of development time. According to the study, if all this software had been developed by conventional proprietary means, it would have cost about 1.08 billion dollars (year 2000 U.S. dollars) to develop in the United States.

Most of the code (71%) was written in the C programming language, but many other languages were used, including C++, Lisp, assembly language, Perl, Fortran, Python and various shell scripting languages. Slightly over half of all lines of code were licensed under the GPL. The Linux kernel itself was 2.4 million lines of code, or 8% of the total.

In a later study, the same analysis was performed for Debian GNU/Linux version 2.2. This distribution contained over fifty-five million source lines of code, and the study estimated that it would have cost 1.9 billion dollars (year 2000 U.S. dollars) to develop by conventional means.

Programming on Linux

Most Linux distributions support dozens of programming languages. Core system software such as libraries and basic utilities are usually written in C. Enterprise software is often written in C, C++, Java, Perl, Ruby, or Python. The most common collection of utilities for building both Linux applications and operating system programs is found within the GNU toolchain, which includes the GNU Compiler Collection (GCC) and the GNU build system. Amongst others, GCC provides compilers for C, C++, Java, and Fortran. The Linux kernel itself is written to be compiled with GCC.

Most distributions also include support for Perl, Ruby, Python and other dynamic languages. Examples of languages that are less common, but still well-supported, are C# via the Mono project, and Scheme. A number of Java Virtual Machines and development kits run on Linux, including the original Sun Microsystems JVM (HotSpot), and IBM's J2SE RE, as well as many open-source projects like Kaffe. The two main frameworks for developing graphical applications are those of GNOME and KDE. These projects are based on the GTK+ and Qt widget toolkits, respectively, which can also be used independently of the larger framework. Both support a wide variety of languages. There are a number of Integrated development environments available including Anjuta, Eclipse, KDevelop, MonoDevelop, NetBeans, and Omnis Studio while the traditional editors Vim and Emacs remain popular.

Although free and open source compilers and tools are widely used under Linux, there are also proprietary solutions available from a range of companies, including the Intel C++ Compiler, PathScale, Micro Focus COBOL, Franz Inc, and the Portland Group.

HISTORY OF LINUX

History



Linus Torvalds, creator of the Linux kernel.

In 1991, Linus Torvalds began to work on the Linux kernel while he was attending the University of Helsinki. Torvalds originally intended Linux to be a non-commercial replacement for Minix, an educational operating system developed by Andrew S. Tanenbaum. Linux was dependent on the Minix userspace at first.

Richard Stallman, founder of the GNU project for a free operating system.

The GNU Project, with the goal of creating a UNIX-like, POSIX-compatible operating system composed entirely of free software, had begun development in 1984, and a year later Richard Stallman had created the Free Software Foundation and wrote the first draft of the GNU General Public License (GPLv1). By the early 1990s, the project had produced or collected many necessary operating system components, including libraries, compilers, text editors, and a Unix shell, and the upper level could be supplied by the X Window System, but development of the lower level, which consisted of a kernel, device drivers and daemons had stalled and was incomplete.

The GPL allowed GNU code to be used in other projects, so long as they too were released under the GPL. In order to allow GNU code to be integrated with Linux, Torvalds changed his original license (which prohibited commercial redistribution) to the GPLv2. Linux and GNU developers worked to integrate GNU components with Linux. Thus Linux became a complete, fully functional free operating system.

Naming

In 1992, Torvalds explained how he pronounces the word Linux:

'li' is pronounced with a short [ee] sound: compare prInt, mInImal etc. 'nux' is also short, non-diphtong, like in pUt. It's partly due to minix: linux was just my working name for the thing, and as I wrote it to replace minix on my system, the result is what it is... linus' minix became linux.



Linux and the GNU project

The Free Software Foundation views Linux distributions which use GNU software as "variants" of the GNU system, and they ask that such operating systems be referred to as GNU/Linux or a Linux-based GNU system. However, the media and population at large refers to this family of operating systems simply as Linux. While some distributors make a point of using the aggregate form, most notably Debian with the Debian GNU/Linux distribution, its use outside of the enthusiast community is limited. The distinction between the Linux kernel and distributions based on it plus the GNU system is a source of confusion to many newcomers, and the naming remains controversial.

Copyright, licensing and the Linux trademark

The Linux kernel and most GNU software are licensed under the GNU General Public License (GPL), version 2. The GPL requires that all distributed source code modifications and derived works also be licensed under the GPL, and is sometimes referred to as a "share and share-alike", "copyleft", or, pejoratively, a viral license. In 1997, Linus Torvalds stated, "Making Linux GPL'd was definitely the best thing I ever did." Other software may use other licenses; many libraries use the GNU Lesser General Public License (LGPL), a more permissive variant of the GPL, and the X Window System uses the MIT License. After more than ten years, the Free Software Foundation announced that they would be upgrading the GPL to version 3, citing increasing concerns with software patents and digital rights management (DRM). In particular, DRM is appearing in systems running copyleft software, a phenomenon dubbed "tivoization" after digital video recorder maker TiVo's use of DRM in their Linux-based appliances. Linus Torvalds has publicly stated on the Linux Kernel Mailing List that, based on the drafts of the license, he would not move the Linux kernel to GPLv3, specifically citing the DRM provisions.

In March 2003, the SCO Group filed a lawsuit against IBM, claiming that IBM had contributed parts of SCO's copyrighted code to the Linux kernel, violating IBM's license to use Unix. Also, SCO sent letters to several companies warning that their use of Linux without a license from SCO may be actionable, and claimed in the press that they would be suing individual Linux users. Per the Utah District Court ruling of July 3, 2006, 182 of the 294 items of evidence provided by SCO against IBM in discovery were dismissed.

See also: SCO-Linux controversies

In 2004, Ken Brown, president of the Alexis de Tocqueville Institution, published Samizdat, a highly controversial book which, among other criticism of open source software, denied Torvalds' authorship of the Linux kernel (attributing it to Tanenbaum, instead). This was rebutted by Tanenbaum himself.

In the United States, the name Linux is a trademark registered to Linus Torvalds. Initially, nobody registered it, but on August 15, 1994, William R. Della Croce, Jr. filed for the trademark Linux, and then demanded royalties from Linux distributors. In 1996, Torvalds and some affected organizations sued to have the trademark assigned to Torvalds, and in 1997 the case was settled. The licensing of the trademark has since been handled by the Linux Mark Institute. Torvalds has stated that he only trademarked the name to prevent someone else from using it, but was bound in 2005 by United States trademark law to take active measures to enforce the trademark. As a result, the LMI sent out a number of letters to distribution vendors requesting that a fee be paid for the use of the name, and a number of companies have complied.

ABOUT LINUX..... ALL YOU NEED

Linux





Linux (IPA pronunciation: /ˈlɪnʊks/) is a Unix-like computer operating system family. Linux is one of the most prominent examples of free software and of open source development; its underlying source code is available for anyone to use, modify, and redistribute freely.

After the Linux kernel was released to the public on 17 September 1991, the first Linux systems were completed by combining the kernel with system utilities and libraries from the GNU project, which led to the coining of the term GNU/Linux. From the late 1990s onward Linux gained the support of corporations such as IBM, Sun Microsystems, Hewlett-Packard, and Novell.

Predominantly known for its use in servers, Linux is used as an operating system for a wider variety of computer hardware than any other operating system, including desktop computers, supercomputers, mainframes, and embedded devices such as cellphones. Linux is packaged for different uses in Linux distributions, which contain the kernel along with a variety of other software packages tailored to requirements.

VISTA... FACTS OVERMINED

"I am not sure how the company lost sight of what matters to our customers (both business and home) the most, but in my view we lost our way. I think our teams lost sight of what bug-free means, what resilience means, what full scenarios mean, what security means... I would buy a Mac today if I was not working at Microsoft"

After the much anticipated release of “the safest version of Windows ever”, Windows Vista, business and home users alike were amazed by the striking similarities between the Windows desktop client and Apple’s Mac OS X. And what followed corroborated that this was no mere coincidence.

So what does it take to run? A 1GHz 32-bit (x86) or 64-bit (x64) processor, a minimum of 1GB RAM, and a DirectX 9-capable graphics card with a WDDM driver and at least 256MB of graphics memory, if you ever fancy playing games. Not to mention the demanding system memory bandwidth of 1800MB per second. In short, be ready to upgrade your system and ruin yourself.

Is Vista really worth that much? Do the functionality, manageability, productivity and security of the OS outweigh its cost of deployment? Or should we stick with XP, at least for a year or so? To answer these questions we have to know what Vista offers.

Aero, the much-touted 3D user interface experience with translucent icons, program windows, a 3D scrollable task manager and other elements, looks cool, but it cannot be the reason for home users to invest so heavily in Vista. Business users and tech-savvy users interested in how things work will not be impressed by such flashy appearances. Apple introduced a very similar graphical experience in Mac OS X but scrapped much of it from Panther onwards, owing to the poor response, and the tech community has seen no change in that scenario to suggest these fancy features will become a hit. And Vista falls back to Windows Vista Basic mode, without Aero, unless you have an advanced graphics card installed.

Another concern for all users is security. Vista was released as the most secure Windows ever, with features like Windows Defender, anti-spam and phishing filters, and Parental Controls. Vista is the first OS the Redmond giant has released assuming that it would be attacked. Vista boasts of not needing any antivirus applications, with built-in facilities like automatic backups, performance self-tuning and built-in diagnostics that help keep your data protected and your PC running smoothly. That claim was undermined by the discovery of bugs in Vista immediately after its release: a serious flaw was found in Internet Explorer 7 that could aid hackers. Microsoft admitted these security threats but still claimed the OS to be the most secure one can get. Windows XP, at least, has been patched and can be used with a plethora of reasonably effective security tools that work now, without waiting for an update. Or switch over to the virtually virus-free world of open source OSes.

The new OS tightly integrates instant desktop search, with powerful indexing and user-assignable metadata, making it much easier to search all kinds of data, including files, e-mails, and Web content. This feature, which Mac OS X has had since Jaguar in 2002, is genuinely handy, though it would sit at the bottom of a list with more productive features on top. The database-like WinFS filesystem, a feature that would have appealed to corporate customers, was dropped altogether.

Connecting to various networks is a key capability in today's OSes. Vista’s Network Center makes it effortless not only to create secure ad hoc wireless networks, multiple networks and wireless peripherals, but also to manage network connections in a resourceful way. Then again, every OS on the market does all of this, and Vista offers nothing here to give it an edge over the others. The Unix-based Linux OS clearly has the upper hand, since it was designed from the outset for multiuser and multitasking operation, and over the years it has proved its merit.

Internet Explorer, the Windows browser, has finally had a major version change since the release of IE 6.0 in 2001, thanks to the phenomenal rise of the open source lad Firefox; IE7 now comes as the integrated browser in Vista. It has Firefox-inspired tabbed browsing, better privacy management policies, and anti-phishing and anti-spoofing facilities, all packed up to provide a more pleasant and safer browsing experience. The expert verdict: the Microsoft and Mozilla browsers have almost identical feature sets, except for ActiveX support. ActiveX, the Microsoft technology for installing on-the-fly programs to display certain web pages, is found in many places, from Outlook Web Access to some Web-based applications. Nevertheless, hackers use ActiveX as a tool to install applications that compromise the security of the system. Vista has an opt-in block for ActiveX, while Firefox does not run ActiveX at all. As far as security is concerned, Firefox uses Google’s database to warn you of sites suspected of forgery, and has anti-phishing features as well. And bear in mind that IE7 is competing with the previous release of the Mozilla Foundation's browser, Firefox 2.0; according to its media releases, the next version, Firefox 3.0, is due towards the end of this year.

The backup facilities of Windows have changed little in the past decade, even as secondary storage technology explored new ground. Vista, in order to catch up, has improved its built-in Backup utility and System Restore facilities, which make “shadow copies” of files and folders and store them in protected regions of the disk. In the event of a catastrophic virus infection or a failed software installation, the system restores these copies, thereby protecting your valuable data. To be loud and clear: a multitude of software packages will equip your XP system with the same facilities, and these features are built into UNIX-based systems.

Vista claims to protect kids online through its centralized Parental Controls, which let you set browsing controls and thereby restrict access to inappropriate sites. You are aware that even XP has a Parental Controls feature, and how ineffective it is at its purpose. The enhanced version in Vista likewise relies on websites and forums to implement its proprietary tags and rating systems. If you want real protection, look to free/open source applications such as CensorNet, actively developed by those with an incentive to do it well: the same people who use them!

The Windows Collaboration module uses peer-to-peer technology to let Vista users work together in a shared workspace. You can form informal workgroups and jointly work on documents, present applications and share viruses too.

Whether you like it or not, Gates & Co. will withdraw the XP version of Windows within two years, which means you can afford to sit back and relax only for the time being. You can either commence the process of switching over to Vista, plunge into the world of Open Source, or at worst get an Apple system; the last alternative is suicidal, as you will be stuck with one hardware vendor, notwithstanding the high cost.

Jim Allchin, the Microsoft executive who coordinated all of Vista's development, lamented the company “losing sight” and “losing [its] way” in a 2004 e-mail to Mr. Gates, back when Vista was still Longhorn. The confidential mail (quoted at the top of this post) leaked out recently, and the rich man is searching in the dark for an explanation. The striking resemblance of the new OS to Mac OS X can be attributed to this I-would-buy-a-Mac mail.

Monday, May 14, 2007

How to mount other file systems

If you are using the GNOME desktop, right-click on the desktop and open a terminal. Then:

1) type "su" , followed by the root password
2) type "fdisk -l"
3) find out the partition to be mounted ( say viz. /dev/hda10 )
4) type " mount -t 'filesystem type' 'partition name' 'mount point'


filesystem type - if you are using the FAT32 filesystem, this will be "vfat". NTFS is not supported by default.

mount point - where you attach your Windows partition (drive) into the Linux filesystem. You can create several mount points (they are simply folders). For example, to create one at /mnt/win: /mnt is present by default, and you just make a folder named win inside it where the partition's data will appear.
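Putting it all together, a typical session for mounting the FAT32 partition from the example above would look like this:

$ su
# fdisk -l                          # identify the Windows partition
# mkdir /mnt/win                    # create the mount point
# mount -t vfat /dev/hda10 /mnt/win
# ls /mnt/win                       # your Windows files appear here
# umount /mnt/win                   # detach it when you are finished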

Sunday, May 13, 2007

SQUID tweaks

I assume you have already set up a Squid proxy and a firewall and that both work correctly. You may now wish to force all your users to use the Squid proxy for surfing the WWW. This is what "transparent proxying" is about: your users surf, even without having defined a proxy in their browser settings, but they in fact all use the transparent proxy and don't notice it.

To enable transparent proxying with Squid, insert the following lines in the configuration file (squid.conf, usually in /etc) at the appropriate place (search the configuration file for the respective keywords: httpd_accel_host, httpd_accel_with_proxy and httpd_accel_uses_host_header):

httpd_accel_host virtual
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

You will also need to accept and redirect the WWW traffic to port 3128 of the Squid proxy:

ipchains -A input -p TCP -d 127.0.0.1/32 www -j ACCEPT
ipchains -A input -p TCP -d 192.168.0.0/24 www -j ACCEPT
ipchains -A input -p TCP -d any/0 www -j REDIRECT 3128

or, if you use SuSEfirewall2

FW_REDIRECT_TCP="192.168.0.0/24,0/0,tcp,80,3128"
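
or, on 2.4 and later kernels where iptables has replaced ipchains, a roughly equivalent NAT redirect (a sketch, assuming your LAN interface is eth0):

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-ports 3128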

Restart Squid and the firewall. Transparent proxying should be working now. However there are some issues associated with the above settings. You can read about them in the corresponding comments in the squid.conf file.

Inside the Linux boot process


According to the kernel-meister himself:

Ok, the merge window has closed, and 2.6.22-rc1 is out there.

The diffstat and shortlogs are way too big to fit under the kernel mailing list limits, and the changes are all over the place. Almost seven thousand files changed, and that's not double-counting the files that got moved around.

Architecture updates, drivers, filesystems, networking, security, build scripts, reorganizations, cleanups.. You name it, it's there.

You want a new firewire stack? We've got it. New wireless networking infrastructure? Check. New infiniband drivers? Digital video drivers? A totally new CPU architecture (blackfin)? Check, check, check.

That said, I think (and certainly hope) that this will not be nearly as painful as the big fundamental timer changes for 2.6.21, and while there are some pretty core changes there (like the new SLUB allocator, which hopefully will end up replacing both SLAB and SLOB), it feels pretty solid, and not as scary as ripping the carpet from under the timer infrastructure.

So give it a good testing. We'll see how the regression tracking ends up working, but in order to actually track that, we want people actively testing -rc1 and making good reports!




Finally, Smart City project inked

Sunday, May 13, 2007
16:09 IST



Thiruvananthapuram: The much-awaited Rs.15 billion ($350 million) Smart City project between the Kerala government and Dubai Internet City (DIC), aimed at putting Kerala on the global IT map, was inked here Sunday.


The project was signed Sunday, nearly three years after it was first mooted. It is different from the one proposed by the previous Oommen Chandy government and comes a few days before Chief Minister V.S. Achuthanandan completes his first year in office.


The agreement was inked by Ahmad Bi Byat, executive chairman of Tecom Investment, and Lizzie Jacob, chief secretary of Kerala.


Smart City, to be set up in Kochi, is a joint venture company of Tecom Investments and Sama Dubai.


Top officials of DIC arrived in a charter flight around 1 p.m. and drove straight to the state-owned Mascot Hotel to sign the agreement. They will leave in the evening.


"This is a historic ceremony and a happy occasion for me. This has materialised after long discussions and would be mutually beneficial and economically a boost for Kerala. More importantly this would strengthen the relationship between Kerala and Middle East," said a beaming Achuthanandan.


The development is seen as a moral victory for Achuthanandan as he had earlier opposed the terms for the project worked out by the then Congress-led United Democratic Front (UDF) government.


The agreement with DIC does not include transfer of the Infopark campus at Kochi, which had been agreed upon by Chandy.

While the former chief minister had agreed to sell 236 acres of land at a cost of Rs.260 million (Rs.26 crore), giving DIC full ownership, Achuthanandan made them agree to 246 acres of land at a cost of Rs.1.06 billion (Rs.106 crore), to be given on lease for 99 years.


Further, Chandy had agreed to 33,000 new jobs being created by Smart City, but Achuthanandan got them increased to 90,000.


Ahmad Bi Byat said Smart City would be home to the best companies of India and abroad.


"This project would certainly put Kerala on the global IT map. It would just not be the IT sector alone that would benefit but others too," said Byat.


The project would have 8.8 million square feet of built-up space, of which 70 percent would be for IT and IT-enabled services.


Later speaking to reporters, Achuthanandan said the master plan of the project would be ready in a year and the construction would begin soon after. Only then would they be able to know the exact project investment.


A close aide of Achuthanandan told IANS that despite rewriting the agreements on seven different occasions, the real pressure on DIC came last month when the chief minister began to take steps to float a global tender for the project.


"When the file for that was ready, the DIC officials announced the clearing of the project last month," said the aide.


The state Congress president said they were happy that the Left government had signed the agreement for the project initiated by them.


"They were saying first that DIC is nothing but a real estate company, and see today the agreement has been signed. Tomorrow, Chandy would be speaking to the press on this," he said.

Tuesday, May 1, 2007

How to install software when the source code is available

You may have noticed that some software is distributed as source code, i.e. a bundle of '.c' files. Most of it comes in compressed form (.tar.gz), so you need to right-click and extract it to some directory. To install it on your system, I recommend the following way.


1) Most packages have a backend, usually called 'library', and a frontend (named after the program itself, its UI, etc.). Unzip both into the folder.

2) If a library is present, use the 'cd' command to enter its directory. Thereafter you need to enter three commands to install that package:

a) ./configure
b) make
c) make install

3) Do this for both the library and the frontend; a worked example follows below.
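Assuming a hypothetical source bundle foo-1.0.tar.gz, the whole procedure from the terminal looks like this (make install usually needs root, hence the su):

$ tar xzf foo-1.0.tar.gz           # extract the source
$ cd foo-1.0
$ ./configure                      # check dependencies and generate Makefiles
$ make                             # compile
$ su -c 'make install'             # install system-wide as root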