Saturday, October 16, 2010

C# Coding Standards

Purpose:

  • To develop reliable and maintainable applications

Types of Casing:

  • Camel: the first word starts with a lower-case letter and every subsequent word starts with an upper-case letter, e.g. variableName
  • Pascal: every word starts with an upper-case letter, e.g. VariableName
  • Hungarian: a prefix encodes extra information about the variable, e.g. pAccountName, strName

Naming Conventions

1) Variables and Methods

  • Local variables -> Camel
  • Private member variables -> Camel, beginning with an underscore ('_')
  • Public member variables -> Pascal
  • Private method names -> Pascal
  • Public method names -> Pascal
  • Method parameters -> Camel
  • UI controls -> Pascal
  • Event handlers -> Pascal, with the suffix EventHandler
  • Exceptions -> Pascal, with the suffix Exception
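
A short illustrative snippet following these rules (all the names below are invented for the example):

```csharp
public class AccountManager
{
    private int _accountCount;      // private member variable: Camel with '_'

    // Private method name: Pascal
    private int GenerateId()
    {
        return _accountCount + 1;
    }

    public string BranchName;       // public member variable: Pascal

    // Public method name: Pascal; parameter: Camel
    public void AddAccount(string accountName)
    {
        int newId = GenerateId();   // local variable: Camel
        System.Console.WriteLine(accountName + " has id " + newId);
    }
}
```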

2) Namespace, Class & Interface

  • Interfaces -> Pascal, starting with 'I'
  • Class names -> Pascal
  • Namespaces -> ...
  • File names -> Pascal, matching the class name
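
For example, an interface and the class implementing it (the names are invented for illustration):

```csharp
// File: PaymentService.cs -- the file name matches the class name

public interface IPaymentService    // interface: Pascal, starting with 'I'
{
    void Pay(decimal amount);
}

public class PaymentService : IPaymentService   // class: Pascal
{
    public void Pay(decimal amount)
    {
        System.Console.WriteLine("Paid " + amount);
    }
}
```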

Note:

  • Do not use variable names that resemble keywords.
  • Do not use single-character variable names.
  • Do not use Hungarian notation for variable names.
  • Do not use an underscore ('_') in local variable names.

Indentation & Spacing

1) Use tabs for indentation, not spaces. Set the tab size to 4.

2) Comments should be at the same indentation level as the code they describe.

3) Curly brackets ({}) should be at the same indentation level as the code outside the brackets.

4) Curly brackets should be on their own line, not at the end of the previous line.

5) Use // or /// comments. Avoid using /* ... */.

6) Use a single space before and after operators and braces.

7) Logical groups of code should be separated by a blank line.

8) Use regions to group related pieces of code together.

9) Keep private member variables, properties and methods at the top of the class, and public member variables, properties and methods at the bottom.

10) There should be one and only one blank line between methods inside a class.
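
Several of these rules can be seen together in a small sketch (the class and methods are invented for the example):

```csharp
public class ReportPrinter
{
    #region Printing    // a region groups related pieces of code

    // This comment is at the same level as the method it describes
    private void PrintPage(int pageNumber)
    {
        System.Console.WriteLine("Page " + pageNumber);
    }

    // Single spaces around operators; brackets on their own lines,
    // at the same level as the code outside them
    public void Print(int copies)
    {
        for (int i = 0; i < copies; i++)
        {
            PrintPage(i + 1);
        }
    }

    #endregion
}
```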

Good Programming Practices

1) Always use a multilayer (N-tier) architecture.

2) Use meaningful method and variable names.

3) Do not write comments if the code is easily understandable without them.

4) Write as few lines of comments as possible.

5) Always log every exception with detailed information, but give the user a friendly message.

6) Don't write very large try-catch blocks.

7) Prevent exceptions wherever possible, so that exception handling is rarely needed.

8) Don't declare more than one class in a file.

9) A method should do only one job, not more than that.

10) Don't create very large files; if a file grows beyond 1000 lines of code, split it into multiple files.

11) Limit the size of a method to 25 lines; if it grows beyond that, think about splitting it.

12) Do not store large objects in session.

13) Always use style sheets to control the look and feel of the pages.
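
Rules 5 and 6 can be sketched as follows: the try block stays small, the full exception is logged, and the user sees only a friendly message (the file name and the messages are invented for the example):

```csharp
using System;

public class OrderService
{
    public void SaveOrder(string order)
    {
        try
        {
            // Keep the try block small: only the statement that can actually fail
            System.IO.File.WriteAllText("order.txt", order);
        }
        catch (Exception ex)
        {
            // Log the exception with detailed information...
            Console.Error.WriteLine("Saving the order failed: " + ex);
            // ...but give the user only a friendly message
            Console.WriteLine("Your order could not be saved. Please try again.");
        }
    }
}
```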


(Prepared for Deerwalk Inc.)

Your comments are welcome.

Wednesday, October 6, 2010

Courses I have chosen for my masters

Updated:
1st Period
1) Advanced Computer Studies in Sweden
2) Artificial Intelligence
3) Object Oriented Design

2nd Period
4) Medical Informatics
5) Human Computer Interaction
6) Optimization

3rd Period:
7) Machine Learning
8) User Interface Programming I
9) Secure Computer Systems I

4th Period:
10) Machine Learning
11) Software Architecture with Java
12) Secure Computer Systems II
13) Computer Assisted Image Analysis I

In Uppsala, one semester is divided into two periods, and in each period we have to select a number of courses. The good point is that we students have the freedom to choose our courses.
Here is the list of courses I have selected:

1st period
1) Advanced Computer Studies in Sweden
2) Artificial Intelligence
3) Object Oriented Design
2nd Period
4) Computer Assisted Image Analysis II
5) Medical Informatics

3rd period
6) Machine Learning
7) Human Computer Interaction
8) Software Engineering

4th period
9) IT, Ethics and Organization
10) Machine Learning
11) Software Architecture with Java

Monday, October 4, 2010

Looking for intelligence

We can see many types of software installed on our computers, ranging from programs that simply calculate values to programs that do something similar to the human brain. I have a game, "Chess Titans", which came with the operating system Windows 7 and plays against a human just as another human would. If a machine completely calculated and searched all the possible states, it could not be said to play with real intelligence. If the computer checked every state before making a move, we could never beat it, because it has thousands of times our calculation capacity and the power to think hundreds of moves ahead, which is out of our reach. In practice, though, we can beat a computer chess program if we play logically. So "Chess Titans" plays intelligently against a human, and it does not always win.

Not only this; there are many applications which, though we don't notice it, do their tasks intelligently. For example, Google search: the supplied search keywords are manipulated by Google using some sort of intelligence. It tracks search patterns and then suggests the most likely keywords for searching. Google has also achieved popularity in the field of language translation: automatically identifying a language and translating text from one language to another involves many intelligence tasks, and Google has succeeded at this. Similarly, Microsoft's word-processing program applies intelligence while a word is being typed: if the word is mistyped, it replaces it with an appropriate word.

The implementation of intelligence in computers is an ongoing process, and we will see more and more applications implementing intelligence to make our tasks easier. I think that if a computer implemented a voice recognition system and performed tasks from spoken commands, it would be a great achievement. The computer would receive the voice signal, interpret it, and then carry out the corresponding action. The voices of different people differ in frequency, loudness and so on, so the computer would have to learn each speaker's commands in order to interpret them. There would be one problem: the same word is pronounced differently by different people, so learning perfectly might take a long time. After that, however, it would be very easy: we would not need to type a single word on the keyboard, just stand in front of the computer and speak! It would not be very difficult if commands were provided in only one language, since the sentence structure of the language and the way its words are pronounced affect the complexity of developing a voice recognition system. These activities lie in natural language processing, a broad field of artificial intelligence.

My area of interest in intelligence includes the field of computer vision. I have actually started a small project which manipulates face images to identify a person. Although it is based on mathematical calculations, I have implemented some intelligent algorithms to identify a person by his face. The central tasks of the project were image analysis and pattern recognition. The PCA (principal component analysis) algorithm was used to extract information about the components of the face, such as the mouth, eyes and nose. The application first processes a set of faces, extracts their information and stores it. When a new face is provided, it is compared with the previously stored information, and the face is identified by calculating the difference between the stored face information and the newly extracted face.
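
The comparison step described above can be sketched as a nearest-neighbour search over feature vectors. The sketch below uses invented names and deliberately omits the PCA projection itself: each face is assumed to have already been reduced to a feature vector of the same length.

```csharp
using System;
using System.Collections.Generic;

public class FaceMatcher
{
    // One stored feature vector per known person (assumed already PCA-projected)
    private readonly Dictionary<string, double[]> _knownFaces =
        new Dictionary<string, double[]>();

    public void Store(string name, double[] features)
    {
        _knownFaces[name] = features;
    }

    // Identify the new face as the stored face at the smallest Euclidean distance
    public string Identify(double[] newFace)
    {
        string bestName = null;
        double bestDistance = double.MaxValue;
        foreach (KeyValuePair<string, double[]> entry in _knownFaces)
        {
            double sum = 0;
            for (int i = 0; i < newFace.Length; i++)
            {
                double d = newFace[i] - entry.Value[i];
                sum += d * d;
            }
            double distance = Math.Sqrt(sum);
            if (distance < bestDistance)
            {
                bestDistance = distance;
                bestName = entry.Key;
            }
        }
        return bestName;
    }
}
```

A real implementation would also apply a distance threshold, so that a face unlike any stored face is reported as unknown rather than forced onto the nearest match.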

The project is almost complete, except that I have not integrated a learning module into the application. If there is a new face and the machine recognizes it, the application should update itself with the information of the newly recognized face; that is, the machine should be capable of learning new things. Developing the learning module may be challenging, but if I have plenty of time I will research that topic so that I can implement the module and my application will work perfectly. This belongs to a broad area of artificial intelligence: machine learning.

Artificial Intelligence: my view

Artificial intelligence is a growing and demanding field of computer science. We can see that almost all modern equipment can exhibit some sort of intelligent behavior, so today's human society has become used to the intelligence in modern machines and equipment.

Actually, what is the definition of artificial intelligence? Before we try to define artificial intelligence, we must first consider natural intelligence. Natural intelligence is a god-given gift; only our brain is able to exhibit it, and, in other words, we cannot replicate it. So scientists have been trying to make machines intelligent by using artificial intelligence, although machines cannot do exactly what human brains do.

Interestingly, human brain functions and machine functions are different and, to some extent, complementary. The human brain can perform tasks that are very hard to simulate in machines: seeing, hearing and sensing are very ordinary things for a human, but it is far more complex to build a machine that can perform them. On the other hand, computational tasks that would take a human brain more than a year are very easy for machines and can be solved within a minute. So artificial intelligence actually studies how the tasks that are very easy for humans can be transferred to artificially created machines, so that the machines start working with acquired intelligence.

But until now no machine has been developed that works with 100% intelligence, and that task is very complex. Nowadays we can see many machines which use some degree of artificial intelligence to perform their tasks. But here we must understand that artificial intelligence is not simply a field of computational science. If a machine functions by calculation alone, then even though it seems to act intelligently, we cannot call it intelligent, because there is no intelligence in a straightforward computation. To acquire artificial intelligence, a machine should have the capacity to make decisions based on heuristics, inference and fuzzy logic, and should be able to learn from its previous activities so that it can make better decisions.

Blu-ray Technology and Its Future


Krishna Paudel

Student, Masters in Computer Science

Uppsala University

Email: Krishna.Paudel.5383@studen.uu.se

Abstract

Blu-ray is a very modern technology for storing huge amounts of data, and there is no doubt that the Blu-ray Disc (BD) will replace previously used storage media such as CDs and DVDs. A Blu-ray disc can store from 25 GB to 50 GB of data, which is more than five times the capacity of a DVD. The mechanisms for storing and reading data have been modified in Blu-ray technology so that it is capable of storing this very large amount of data.

Introduction

The Blu-ray disc was developed by the Blu-ray Disc Association, a group representing makers of consumer electronics, computer hardware and motion pictures. Although the Blu-ray disc specification was officially announced in February 2002, the discs only came to market in 2006, and they are gaining popularity for the high-definition television (HDTV) video format and for processing video data for presentation purposes. A standard BD can store 25 GB in a single layer and up to 50 GB in a dual layer, and mini BDs are available which can store up to 7.8 GB in a single layer and 15.6 GB in a dual layer. Further enhancement of Blu-ray technology is ongoing, and in the future there is the possibility of multilayer Blu-ray discs with many more layers, which could store a terabyte of data or more!

Technical Details

There is one prominent difference between Blu-ray technology and the older disc technologies. The older disc technologies use a red laser to read the data stored on the disc, whereas Blu-ray discs use a blue-violet laser (which is why the format is called Blu-ray). Since the wavelength of the blue-violet laser (405 nm) is shorter than that of the red laser (650 nm), the laser spot can be focused with greater precision, which allows data to be packed more tightly and stored in less space. It is therefore possible to store more data than on a DVD or CD of the same size. Blu-ray technology is also backward compatible, meaning it can read both DVD and CD formats, just as a DVD reader can also read CD data. [IBRAHIM (2007)]
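
The effect of the shorter wavelength and the larger aperture can be made concrete with the diffraction-limited spot size, roughly d ≈ λ / (2 · NA). Taking the BD numerical aperture of 0.85 mentioned below, and NA = 0.60 for DVD (a typical published value, assumed here):

```latex
d_{\mathrm{DVD}} \approx \frac{650\ \mathrm{nm}}{2 \times 0.60} \approx 542\ \mathrm{nm},
\qquad
d_{\mathrm{BD}} \approx \frac{405\ \mathrm{nm}}{2 \times 0.85} \approx 238\ \mathrm{nm}
```

The ratio of spot areas, (542 / 238)² ≈ 5.2, is consistent with the roughly five-fold capacity increase of a single-layer BD (25 GB) over a single-layer DVD (4.7 GB).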

Blu-ray technology has achieved its success partly by changing the disc layer specifications. Because the data are packed more densely, the cover layer of a Blu-ray disc is reduced to 100 µm, and the numerical aperture (NA) of the objective lens is increased to 0.85. Since the transparency of the plastic disc material decreases when the wavelength drops below 400 nm, care must be taken that the wavelength of the laser beam does not fall below 400 nm; even a temperature change can shift the wavelength of the laser beam. However, the 405 nm GaN (gallium nitride) laser currently used to read BDs appears to have a wavelength with very little temperature dependency.

Blu-ray uses a phase-change film as the recording medium. Phase-change films can be divided into two types: 1) the GST type and 2) the eutectic type. The former, GST (a stoichiometric Ge-Sb-Te composition), is used for DVD-RAM, and the latter is used for CD-RW and DVD-RW. Blu-ray recording supports both of the above-mentioned types. [Blu-ray Disc Founders (2004)]

Interestingly, the structure of a dual-layer Blu-ray disc differs from that of a dual-layer DVD. In a DVD, the rear and front layers are formed separately on two substrates, and the substrates are then attached one on top of the other with a UV-hardening resin adhesive. In a BD, by contrast, the rear layer, consisting of multi-layered films, is formed on a 1.1 mm thick polycarbonate substrate with a guide groove for tracking; then a 25 µm thick spacer layer made of resin is formed, the front layer is formed on top of it, and finally a 75 µm thick cover layer is formed.

Future of Blu-ray

The future of Blu-ray technology is very promising. Just as DVDs are now replacing CDs, BDs will similarly replace today's DVDs, although it will take some time for them to do so completely. Modifications and new research, such as Blu-ray 3D, are under way to make the technology more robust and stable [Wikipedia (2010)]. In addition, BDs can be used to deliver huge programs and their updates to customers, saving network bandwidth. The most prominent possibility for further enhancement is multilayer disc technology, where many companies are exploring how many layers can be added so that we can get as much disc space as possible.

Conclusion

Thus, Blu-ray technology has brought a revolution in the field of disc technology and media storage. Like other technologies, it has some drawbacks: because the technology is very new, Blu-ray devices are still expensive to purchase. I think that after some time production will increase and the cost will drop drastically; in terms of cost per unit of storage, it will then be far cheaper than DVDs.

Acknowledgements

I would like to thank Mr. Ivan Christoff and Mrs. Yunyun for their valuable advice on writing this report. I am also thankful to my friend Mr. Hari Prasad for assisting me by reviewing my report.

References:

  1. IBRAHIM, K.F. (2007). Newnes Guide to Television & Video Technology. 4th edition, pages 412-425. Jordan Hill, Oxford OX2 8DP, UK: Elsevier. ISBN 978-0-7506-8165-0.

  2. Blu-ray Disc Founders (2004). White Paper: Blu-ray Disc Format [available at http://www.blu-raydisc.com/en/Technical/TechnicalWhitePapers/general_bluraydiscformat-15263.pdf] [accessed 2010/09/23].

  3. Wikipedia (2010). Blu-ray Disc [available at http://en.wikipedia.org/wiki/Blu-ray] [accessed 2010/09/30].

(Assignment: Advanced Computer Studies in Sweden, I/I/I)