Wednesday, April 24, 2013

Curating Thoughts, Becoming Productive

While developing good ideas is essential to any success, curating your ideas or thoughts and applying them at the right time, with the right mental set-up, is equally important.

Generally,

We get an idea, we are excited about the idea, we develop use-cases surrounding that idea, we start developing applications, wait for those applications to generate traction, start feeling that the idea is probably not as grand as it seemed at the start, start reducing the time and effort spent nurturing that idea, get another idea, get excited about it ......... a vicious circle.

While trying out new ideas is not a bad thing, the amount of time and energy that such ideas can consume can be enormous, and the end result can be both frustrating and impoverishing :D !!

Ash Maurya has written a wonderful article on how to separate the above process into two phases :
  •   Ideation : Where we spend our resources only on developing, nurturing and validating the idea
  •   Execution : Where we concentrate our energy only on implementing the idea
Such a clear separation of processes not only helps us focus on one task at a time with complete efficiency, but also increases the productivity of both tasks.

A few more things I have learned through experience that really help in maximising the use of our thoughts:

1) Unthink : 
Just like positive thoughts, negative thoughts also chain together, wherein one thought leads to another, and in spite of understanding that such thoughts are purely detrimental, we feel incapable of getting rid of them. Here, we need to develop strong resistance techniques.
Consider an analogy : in the world of computers, whenever we delete a file, the file doesn't go anywhere. It's just that all the active references to that file are removed.
A file without any active references to it is (in fact, as good as) a deleted file !
Similarly, when you get such thoughts (negative, distracting, discouraging, etc.), apply your energy in telling yourself that this thought is NOT important, not worthy of time and attention. While developing this kind of resistance needs patience and skill, it is very much worth the time, I feel.
Stop giving attention to such thoughts and the thoughts would start waning away automatically.
Another tip (which everyone already knows :D) for getting rid of such thoughts is to engage yourself in other interesting activities, but I would prefer the first method, personally.

Being in control of your thought processes is really an achievement which can help you go a long way in your life by saving you from negativities.

2) Time slice your thoughts:
If you have lots of things in your mind, it can really help to give a dedicated slot to those thoughts.
Working on various thoughts/ideas simultaneously is bound to be less fruitful than working on one idea/task with dedicated attention.
For example: "I will dedicate my morning time only to these A thoughts, and in the afternoon, I am going to work only on B thoughts."

3) Be Patient with learnings:
A lot of times, we are really excited about learning new technologies and new ways of getting smarter, more useful and more productive, and we end up reading a lot of articles in a very short span of time, reading way too much one after another, downloading lots of articles and ebooks, poring through extensive tutorials, videos, slides and PDFs. While we definitely learn something through all these actions, how long such learnings last remains proportional to the amount of time spent digesting the ideas. I feel it is better to read one authoritative article, understand it effectively, think of ways to apply the ideas in it and spend some time validating the idea in various ways, rather than reading five to ten articles in a jiffy without grasping the gravity of their content. When you come across a good article, read it thoroughly, read all the associated comments (sometimes the comments have more knowledge than the actual article), and let the ideas SINK IN. Channeling your energy on one thing at a time is going to make your memory much sharper and stronger. In case you have 10 articles open in your browser and it is really difficult for you to focus on one, bookmark all ten articles, close nine of them, and read the one left. FOCUS is the keyword if you want to develop your mental strengths.

4) Search for effective sources of knowledge,bookmark and follow them:
Reading is arguably the best way of increasing your knowledge. So, it is imperative that we read a lot, but at the same time read what is really useful. So, whenever you come across a blog which you feel is really informative, bookmark it and follow it. Apart from the technical concepts, a lot of blogs also provide really useful information that helps you develop your work-related philosophies and thought processes, and some of them can be life-changing. Apart from public article repositories, a lot of people also maintain personal blogs which can have really significant and useful content. Go after them :D

A few which I follow are :
Smashing Magazine
A List Apart


I know, generally, people discuss how to follow various processes in terms of tangible tasks, deliverable processes, p2p interactions, reporting etc., and developing such processes for abstract thoughts may seem queer, but we should remember that all good, great, irresistible things started from an idea. Ideas are important ! Give them the importance they deserve !



Sunday, June 10, 2012

Website Client side (Browser) Optimization


Hello, and welcome back!!
A lot of websites, even very successful ones, consistently and unforgivably ignore client-side optimization.

And it is unbelievably painful to sit through such websites.

My Very Simplified Definition of Website Optimization:
 A set of practices which enable a webpage to render faster on browsers, be meaningful to crawlers, reduce server load and enhance user experience.

Maybe many of those biggies don't care about website optimization because they already have enough users with a high degree of loyalty, but if you have a website, I suggest you always go for client-side optimization.

My simple reasons would be:
      1) There are lots of websites around these days, and the number is sure to grow exponentially. So, if you make a website which has a lot of customers but you fail to optimize it, and then someone else makes a website with all optimizations applied, believe me, loyalties are going to shift :D even if your competitor offers far fewer features.
 
    2) With extremely optimized websites like Facebook which load in a snap (in spite of pushing so much data),  the expectations of an average user from any website have increased a lot.

    3) Even if the server sends the response in a second, unoptimized websites can take as much as 5-30 seconds to load (render). As of today, it’s a lot of time for an impatient internet user.

     4)      The search engines may find it hard to decipher a complicated, heavy & unoptimized website and consequently may not index such websites. Even if indexed, the ranking of such sites may be affected.

So, let's dive straight into finding out the key reasons why websites are slow, where we can focus, what the correct attitude to website optimization is, which tools to use, and the most common mistakes people make.

Which means we will discuss both technical and non-technical aspects of website optimization here.

(Note: Most of my experience is with the Tomcat server for Java-based applications, hence, hereafter server => Tomcat)

There are essentially 2 aspects that need to be addressed by website optimization:
     1)      Reducing load time for all files on Browser
     2)      Reducing rendering  time for all components on Browser

So, what are the main reasons for a website to be slow?
     1)      Wrong presumptions
     2)      Wrong attitudes
     3)      Insufficient knowledge of website optimization (Obvious, huh?).
     4)      High No Of Http Requests
     5)      Heavy Requests
     6)      Heavy responses
     7)      Request Limits
     8)      Insufficient Caching
     9)      CSS Expressions
    10)   Placement of JS files & synchronization
    11)   Usage of inappropriate Javascript frameworks
    12)   Inappropriate usage of Javascript features
    13)   Environment issues
    14)   Ignoring worst case scenarios
    15)   Heavy technologies and HTML elements

Let’s start discussing above points & finding solutions for them:
Wrong Presumptions:

i)                This is the best I could have done:
People tend to think that their websites cannot be optimized beyond a certain extent. Such presumptions develop when a person has been working on a website for a very long time & doesn't keep in touch with the market (read: technologies) - over-indulgence.
Solution: Get out of your cocoon, read and enhance yourself. Times are changing.
ii)               It’s a heavy website with so many things on:
Undeniably, lots of websites have lots of features and sometimes people can feel that with so many features & data, there is hardly any scope for optimization.
Solution : Think like this => Optimizations always focus on individual web pages, not entire websites, and the amount of data & features a single webpage has to deal with will always be limited (unless you have goofed it all up really badly!)
iii)             I am a simple man and things like optimization are not my cup of tea. They require so much knowhow of technologies and processes:
Not a completely incorrect presumption, but surprisingly, most optimization techniques don't necessarily need know-how of big & complicated technologies.
Solution: Common sense is the most prominent requirement for optimizing a website. Tools & technologies will definitely help, but treat them as a plus.

Wrong Attitudes:
i)                  It's a waste of time to client-optimize my website. I can instead focus on adding more features to my website and improving server performance: Yeah right, add more features & then use them yourself (Bear Bombing :D)!!
ii)               Best user experience can come only with the latest & greatest technologies: Similar to point iii of Wrong Presumptions, the difference being that, this time, it's arrogance and overconfidence talking.
iii)              Intellectual over-indulgence: Using the most complicated frameworks & technologies to prove your intellectual capacity. Taking the entire website on a ride for personal satisfaction!

Insufficient knowledge of website optimization:

Well, there is only one way to handle this case: read more, apply more and test more. It will take much less time than you think, and the ROI (return on investment) will be much higher.

I recommend the following awesome book:
Website Optimization (O'Reilly)


The following blogs are also doing a commendable job of digging in and out of website optimization and other challenges.

http://www.smashingmagazine.com

High No Of Http Requests:


The browser gets all the material for building the webpage from a server, and ALL the requests it makes are HTTP requests. By ALL, I mean ALL:
whether it is a JS file request, an image file request, or a CSS file request.
To reduce the number of HTTP requests, combine all JS files into 1, all CSS files into 1, and all possible image files into 1 sprite image (a small build-step sketch for the JS case follows).
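If you are not already using a build tool for this, even a tiny build-time step can do the JS combining. A minimal sketch in plain Java (the file names and paths are purely illustrative assumptions):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.Arrays;
import java.util.List;

// Concatenates individual JS files into one combined.js so the browser
// makes a single HTTP request instead of one request per file.
public class CombineJs {
    public static void main(String[] args) throws IOException {
        List<Path> sources = Arrays.asList(        // file names are illustrative
                Paths.get("js/menu.js"),
                Paths.get("js/validation.js"),
                Paths.get("js/tracking.js"));
        Path combined = Paths.get("js/combined.js");

        Files.write(combined, new byte[0]);        // create/empty the output file
        for (Path src : sources) {
            Files.write(combined, Files.readAllBytes(src), StandardOpenOption.APPEND);
            Files.write(combined, System.lineSeparator().getBytes(), StandardOpenOption.APPEND);
        }
    }
}

The same idea applies to CSS files; your web pages then reference the single combined file instead of the individual ones.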

Heavy Requests:

This is, generally, a trivial issue. But sometimes what may happen is that, along with a lot of request parameters, the browser may send an excessive number of cookies with the request if the server is not on a cookie-free domain.
Imagine, if you have a hundred images loading on your web page, all those image requests will also have cookies passed with them, which can lead to traffic congestion and eventually slower rendering time.
It is advised to place all static components of your website which don't need cookies on a static, cookie-free domain.

Heavy responses:

The amount of data sent by the server in response to different HTTP requests definitely needs to be trimmed down so that the responses can be lighter and faster.
It can be achieved by:
i)                Gzipping the files. Configure your server to send zipped files. When browsers send a request to the server for files, they also inform the server whether they can accept zipped files and in which formats. In Tomcat it can be achieved by adding compression attributes to the HTTP <Connector> element in server.xml (the port and protocol shown here are Tomcat's usual defaults; the compression-related attributes are the ones that matter):
     <Connector port="8080" protocol="HTTP/1.1"  compression="on"  compressionMinSize="300"  noCompressionUserAgents="gozilla, traviata"  compressableMimeType="text/html,text/xml,text/css,text/javascript,application/x-javascript,application/javascript" />
ii)               Minifying CSS and JS files. There are very smart websites & tools which can shrink CSS and JS files. However, when minifying JS files, do take care of the safety v/s size aspect. If the compression ratio achieved is less than 8%, it is probably better to use the original JS file, as the risks associated with minifying JS files do exist and they are real business risks: some minifiers can alter the JS in such a way that the behaviour of the script changes!! I will deal at length with CSS & JS minification in my upcoming articles. Stay tuned !
iii)             Sending one much lighter sprite image instead of sending multiple images. Making sprite images is pretty simple. Using them is also fairly simple, as only the co-ordinates of the required image within the sprite need to be specified.
iv)             Sending images with the minimum possible size that is acceptable to the user.
v)              Checking whether the amount of data being sent by the server is actually needed on the client side. Only send significant and useful data from server to client.
Try using CSS for developing stylish buttons, separators etc.
http://www.cssbuttongenerator.com is an excellent place to begin with.

Also, try to reduce the amount of code in your HTML by putting all the styles in a separate CSS file. Inline styling can make the code look really horrible and scare the Search Engine Crawlers away.

Request Limits:

Till 2009, most browsers were limited to making a maximum of 2 parallel connections to a server. It meant that, at most, only 2 files could be downloaded from the server in parallel.
While most post-2009 browsers can now make 6+ parallel requests, it is still advisable to keep the number of files to be fetched from the server as low as possible.
Also, if you have a huge user base, many of them may still be using pre-2009 browsers, which will leave them susceptible to the dreaded 2-connection limit.
Ideally, combine all JavaScript files into one JS file and all CSS files into one CSS file.
Similarly, combine all background-images into 1 Sprite image and use them.

Insufficient Caching:

Add "expiry" information to all resources so that browsers can cache all the cacheable resources and reduce the server calls.
Refer to this brilliant piece of simple code and you should be done!
http://juliusdev.blogspot.in/2008/06/tomcat-add-expires-header.html
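The linked post covers the Tomcat-side configuration. If you would rather do it in application code, the same effect can be approximated with a tiny servlet filter that stamps caching headers on static resources. A hedged sketch (the filter name, the one-week max-age and the paths you map it to are assumptions to tune for your site):

import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.HttpServletResponse;

// Adds caching headers to responses so browsers can reuse static
// resources (JS, CSS, images) instead of re-fetching them on every visit.
public class ExpiresFilter implements Filter {
    private static final long ONE_WEEK_SECONDS = 7 * 24 * 60 * 60;

    public void init(FilterConfig config) { }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse response = (HttpServletResponse) res;
        // Cache-Control is honoured by modern browsers; Expires covers older ones.
        response.setHeader("Cache-Control", "public, max-age=" + ONE_WEEK_SECONDS);
        response.setDateHeader("Expires", System.currentTimeMillis() + ONE_WEEK_SECONDS * 1000);
        chain.doFilter(req, res);
    }

    public void destroy() { }
}

Map such a filter in web.xml only to static paths like /js/*, /css/* and /images/*, never to dynamic pages whose content changes per request.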

CSS Expressions:

CSS Expressions need to be addressed by the browser before the rendering process begins, just like any other resource placed in the head tag. Hence, avoid using CSS Expressions in your CSS files as well as in STYLE tags.

Placement of JS files & synchronization:

If JS files are placed inside the head tag, the browser will wait to download all the JS files before starting the rendering process. In fact, any resource placed inside the head tag is processed first, and only then does control go to the body tag. Hence, it is advised to place all resources which are not important/needed for rendering at the end, preferably just before the body tag ends.
Also, load all JS-related resources asynchronously, and prioritize them in such a manner that the files which tend to be used more & earlier are loaded first.

Usage of inappropriate Javascript frameworks:

If you are in awe of Javascript frameworks like Ext-JS & Dojo, and want to get your hands dirty, drink a glass of water, breathe & ask yourself the following questions:
i)                 Is this framework really required & useful for the webpage?
ii)               What amount of complexity is going to be introduced in the application because of this framework?
iii)             What is the extra size of JS files that I will need to deal with for utilizing these Javascript frameworks?
iv)             How many bugs does this framework have? How many extra files will I need to add to utilize this framework?
v)               Will the end user feel some value out of this framework?
vi)             Is this framework fully supported by all browsers used by all existing users?

If your answers to the above questions are positive, I would suggest you go with the framework. However, introducing excess JS frameworks can load your system in such a manner that optimization is affected very badly and a lot of features may require excessive handling.
If you think that the framework can actually add value, identify the pages where the framework will be useful. Don't include its JS files on web pages where it will go unused.

Inappropriate usage of Javascript features:

Some features of Javascript frameworks can be extremely substandard & they may affect performance very badly.
E.g. if you are using Dojo for internationalization, then all the internationalization-related property files will be loaded first by your browser, and the rendering process will begin only after that. This is an unsolvable delay. In my opinion, it is always better to address internationalization on the server side rather than through client-side frameworks.
Similarly, other Javascript features which cripple performance should be eliminated! It will require some testing & analysis, though.

Environment issues:

Many times, it happens that the developers are in a completely different environment than the users of the website. It is incumbent on the developers and testers of an application to stay in sync with the actual users of the system. For e.g., the internet speed (bandwidth), machines (computers) and browsers (browser versions) being used for developing and testing an application may be way more advanced than what a proportion of users have, resulting in completely different experiences for developers & end-users of a website.


Ignoring worst case scenarios:

Many people tend to see only the best cases while developing an application and end up ignoring user concerns.
Always keep the worst case in mind while optimizing (developing & testing) a website. A few pointers would be:
i)                    Lowest versions of browsers
ii)                   Capacity (Speed & memory) of computers
iii)                 Capacity of user to understand, navigate & connect to the website
iv)                 Visibility aspects of a website

Heavy technologies & Elements:

Usage of certain heavy technologies can severely impact the usefulness of a website.
For e.g. if Flash files are used for the UI, they may take a lot of time to load & render. Also, the response time of Flash elements (e.g. to mouse clicks) may be very high.
Similarly, if you are using tables in your HTML file, it can severely impede the rendering time, as the browser needs to understand the table structure completely before rendering. Nested tables can increase the browser rendering time even further.
Use div tags instead of tables, with appropriate application of CSS.

Tools to consider for Website Optimization:

Proficiency with the following tools can really help you accurately point out the pain points in your website and understand which portions need an immediate fix:
      1) Firebug: This Firefox extension does not need any introduction. In the Net panel, you can easily see which resources are taking how much time to load.
     2) YSlow: This free extension offered by Yahoo directly gets to the point by giving a rating between A and F to the various parameters affecting website optimization, along with suggesting solutions.
     3) PageSpeed: This newer extension from Google is fast making its mark because of the efficient data it generates and the insights it gives into the website load & render processes.

Summary of Website Optimization processes:
     1) Reduce the amount of data being sent to Browsers.
     2) Reduce the number of files that the browser needs to fetch (No of Http requests)
     3) Place all the Javascript-related files just before the body tag ends. Do not place Javascript files in the head tag, as all these files will be loaded before the rendering process starts, hence the rendering process can get delayed a lot.
     4) Avoid heavy stuff like Flash elements, images. Try using CSS.
     5) Reduce the size of files (Javascript, CSS etc.) by minifying and gzipping. Reduce image sizes. Use sprite images for background images.

Website optimization is a continuous process & hence it is not possible to cover everything in a single article. I have deliberately not ventured into topics like AJAX, but they may also require a lot of optimization. If you are using AJAX, use the correct libraries and optimize. My suggestion would be to avoid libraries where possible, as they bring a lot of unused code and increase both the complexity and the size of the files. However, cross-browser and cross-domain related issues may exhaust you and force you to use Javascript frameworks.

Ok, then.

Drop your suggestions & feedback, if you found this post useful (or if you expected something more).

I will discuss Server side optimizations also soon.

Monday, December 6, 2010

Search Engine Optimisation : Most effective tips for SEO

Here, I digress from server and client related issues and discuss for a while another important aspect related to websites : marketing via Search Engine Optimisation.

I found that there is a lot of discussion on the internet about Search Engine Optimisation, but there is very little relevant data on what you should do and, more importantly, what you should not do to make your websites relevant for Search Engines.

I am, herein, going to share some of my views and experiences on SEO, and I hope everyone can make use of it...

More so, I will be focussing on the thinking patterns involved while optimising your websites for SEO, rather than the complete details, which could take ages... and books...

The optimising strategies you should choose depend upon the following factors :

1) Size of the site : The strategy to use for your website depends upon the size of the website, i.e. the number of pages that are present in your website.

If your site contains only a few pages (less than 10) and those pages are static, optimising the site should be very easy.

On the other hand, if the website contains more than 10 pages, and the pages are generated dynamically, it calls for more focussed and creative ways of optimisation.

2) Nature of the website :  What is the nature and intent of the website ?
Is it a social networking site? Is it a blog kind of website? Does the website sell? Is the website a repository of downloadable data?

3) Targeted user base and nature of users : It may sound like a fallacy (considering that SEO means targeting as many users as possible), but for a lot of sites it makes a lot of sense. Some sites may be regional and target users only from a certain region. Other sites may have a certain specific usage; for e.g. sites selling something may expect only consumers who are interested in buying, whereas other sites may be for plain gossiping. The strategy chosen for SEO should be in line with the expected user base.

Alright,

Assuming that you have by now identified which categories your website belongs to, here are the steps you should follow :

1) Submit your site and the site map manually : All major search engines have an "Add URL" page where you can go and submit your website so that the Search Engines start keeping a watch on your website and start crawling it.
You should also submit the site map (in the format prescribed by the Search Engines) to tell them the way links flow in your website.

2) Think from the crawler's perspective : All Search Engines have crawlers (crawlers are programs which come to a website page, read the page, find all the links present on the page, then go to all those linked pages, read all those pages, find all the links present on those pages, and then read on ........).

So crawlers are the most important thing to keep in mind.

The link for the website you added in the first step is like the starting page for the crawler. The crawler will read the page and then go to all the links present on the page and read them too...
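Just to make that behaviour concrete, here is a toy sketch of the read-a-page-then-follow-its-links loop in plain Java (the start URL, the naive href regex and the 50-page cap are purely illustrative; real crawlers are far more sophisticated and respect robots.txt):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.*;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// A toy crawler: fetch a page, pull out href links, then visit those links too.
public class ToyCrawler {
    private static final Pattern HREF = Pattern.compile("href=\"(http[^\"]+)\"");

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        Deque<String> toVisit = new ArrayDeque<>(List.of("http://www.example.com")); // illustrative start URL
        Set<String> visited = new HashSet<>();
        while (!toVisit.isEmpty() && visited.size() < 50) {   // small cap for the demo
            String url = toVisit.poll();
            if (!visited.add(url)) continue;                  // skip pages we have already read
            String html = client.send(
                    HttpRequest.newBuilder(URI.create(url)).build(),
                    HttpResponse.BodyHandlers.ofString()).body();
            Matcher m = HREF.matcher(html);
            while (m.find()) {
                toVisit.add(m.group(1));                      // queue every link found on the page
            }
        }
        System.out.println("Pages seen: " + visited.size());
    }
}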

3) Page reachability : Every page should be reachable through the links on your website.
For static websites, where there are limited pages, it should be easy. Just put a link to all possible pages in all the pages as a part of the menu, and also submit a proper sitemap.

For dynamic pages, the fun begins..:D

For websites which are like blogs (e.g. Loud Letters), where new users can register themselves continuously and write posts (Letters, in the case of Loud Letters), the challenge is in giving visibility to all new entries (new letters and users) so that they are crawled by Search Engines continuously.

The crawler monitors all websites on a regular basis. So, if the pages on your website couldn't make it to the crawler the first time, they will get another chance soon (in approximately 7-8 days, depending upon the search engine).

There are multiple things you can do:

A) Put links for all the recent pages on the starting page.
For e.g. The section Recent Letters on Loud Letters has been made to give all recent letters visibility to search engines

B) Link all the content from all the possible pages.
For e.g if you read the letter Lady gaga on Loud Letters, you will see a section called Similar Letters where all the related letters (Shweta Salve, Pamela Anderson, etc) have been intelligently put so that when the crawler comes on the Lady gaga page, it automatically crawls the similar letters also.
Similarly you can see, that a link for writer's profile is also present on all the letters which enables the crawler to crawl the profiles of as many users as possible.

4) Nature of links:
The way you form links is extremely important.
e.g. if the link is of the type www.loudletters.com/rest/letters?letterid=154, the crawler will tend to remove all the parameters and fire the link (www.loudletters.com/rest/letters in this case), which can result in a Page Not Found error (a 404 in this case). Such links are called "Broken Links", and if the crawler finds a few of them on your website, it can tend to believe that your site doesn't exist or is full of gaps, and exit from your website (an extremely unwanted but very likely situation), a disaster.
So, the links should be formed so that the site doesn't return a Page Not Found error even if no parameters are provided with the link. In fact, it may be better to make the parameters a part of the link.
For e.g. www.loudletters.com/rest/letters/154 is a good link, as the letter ID has been made a part of the link.
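If your site is built on something like JAX-RS or Spring (both come up later on this blog), making the ID part of the path is just a matter of using path parameters instead of query parameters. A minimal, hedged sketch in JAX-RS (the class name and the lookup logic are illustrative, not Loud Letters' actual code):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;

// Exposes crawler-friendly URLs like /letters/154 instead of
// /letters?letterid=154, so stripping parameters cannot break the link.
@Path("/letters")
public class LetterResource {

    @GET
    @Path("/{letterId}")
    public String getLetter(@PathParam("letterId") int letterId) {
        // In a real application this would load the letter from storage.
        return "Letter " + letterId;
    }
}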

5) Meta Tags: Meta tags are not given much consideration these days, as users can write anything in the meta tags and fool the Search Engine. (Google has officially announced that it doesn't consider the keywords meta tag while ranking the pages it crawls.) However, some search engines still consider meta tags. So, you should provide relevant keywords in the meta tags. You can also use automated tools and APIs (e.g. Alchemy API) which will give you relevant tags for the content, and you can use those tags.

6) Starting points for the crawler: You can give the crawler multiple starting points by posting your links on relevant pages. For e.g. you can associate a blog with your website and add all new entries from your website to that blog.
For e.g. the Loud Letters Blog is associated with Loud Letters, and all entries made on Loud Letters are regularly posted on that blog. Blogger.com is a good place to start, as it is associated with Google, and Google regularly monitors all links posted on Blogger blogs. You can also share your pages extensively on other major services like Twitter, Digg, Facebook etc. Some of them provide APIs which you can use for automated posting of links to new pages on your website as soon as they are created. AddThis.com provides a very neat and concise button which you can put on your website to allow users to share the page themselves.

7) Content: If you want your pages to be highly regarded by Search Engines, they should have good quality content. Search Engines have their own ways of extracting data and entities from your pages, and the amount of visibility your page gets in comparison to other pages depends upon the validity and effectiveness of your data. Stay away from copying data, as you will get caught way earlier than you expect and your website may face everlasting damnation.

8) Client content: Things like Javascript content and CSS interfere with the crawler. So they should be put into separate JS and CSS files.

9) Title: The title of each of your pages should be very relevant to its content and to the queries you expect to be fired in Search.

10) Ad campaign : In case you want to use an ad campaign, choose it based on the size of the user base you are targeting, which we already discussed.

11) Use standard services: There are a lot of standard services (many of which are free) which will tell you about the errors in your website, which you should immediately rectify to enable proper crawling. I recommend Google Webmaster Tools. And yes, you can use Google Analytics to find out how many users are visiting your website (i.e. how successful your SEO strategies are!!).

Lastly, choose a relevant domain name..

There is a lot more to be said on SEO, but that may make enough meat for me to write a separate post.. Just know one thing: SEO is an ongoing process, not a set of steps... just like our lives :D. Live it, enjoy it, and work hard...

Your site will surely gain the recognition it deserves.

Wednesday, August 11, 2010

Providing parameters to REST based Spring web app using Enunciate

If you have made a web-app using Spring, you should be gladly aware that you can pass parameters using the annotation @RequestParam

But suppose you want to make a web-app with REST standards (JAX-RS) and you try using the @RequestParam annotation: you are more than likely to fail.

JAX-RS doesn't support @RequestParam.
You need to use @QueryParam instead.


Shockingly, Spring doesn't support @QueryParam...

So, how do you mix both these things together, basically making a Spring-based web-app which follows JAX-RS standards?

Simple, apply both the annotations to the variable :D

Instead of defining the function like
Model getData(@QueryParam("username") String userName);

OR

Model getData(@RequestParam("username") String userName);

define the function as below

Model getData(@QueryParam("username") @RequestParam("username") String userName);

And you are done...!!
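For completeness, here is a minimal sketch of how such a doubly-annotated method might sit inside a controller class (the class name, paths and return type are illustrative assumptions, not the original application's code):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.QueryParam;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;

// Spring reads @RequestParam at runtime; the JAX-RS annotations are what
// documentation tools like Enunciate look at when generating the API docs.
@Controller
@Path("/data")
public class DataController {

    @GET
    @Path("/user")
    @RequestMapping("/data/user")
    @ResponseBody
    public String getData(@QueryParam("username") @RequestParam("username") String userName) {
        return "Hello, " + userName;
    }
}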

Hope that Spring and JAX-RS will break the ice soon....:D

I encountered the above problem when I was trying to generate documentation (API docs) for an application using Enunciate....

And solved it pretty soon too...:)

HashMap reverse sort

We all know that we can sort a map by using TreeMap.
But what if you have to reverse sort the same map, assuming that the key is an Integer?

Generally, people do it by converting the map into some list etc...

Well, there is a much simpler approach.

Multiply your keys by -1 :D and the natural sort order will get reversed on its own, and you can then use your favourite TreeMap for sorting :)
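A quick sketch of the trick (the sample data is made up):

import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

// Reverse-sorts an Integer-keyed map by negating the keys before
// putting them into a TreeMap, whose natural order is ascending.
public class ReverseSortDemo {
    public static void main(String[] args) {
        Map<Integer, String> scores = new HashMap<>();
        scores.put(10, "ten");
        scores.put(3, "three");
        scores.put(42, "forty-two");

        TreeMap<Integer, String> reversed = new TreeMap<>();
        for (Map.Entry<Integer, String> e : scores.entrySet()) {
            reversed.put(-e.getKey(), e.getValue());   // negate the key
        }
        // Iterates as 42, 10, 3; remember to negate the key back when reading it.
        for (Map.Entry<Integer, String> e : reversed.entrySet()) {
            System.out.println(-e.getKey() + " -> " + e.getValue());
        }
    }
}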

I got this idea form one of my mentors in Glomantra..

Saturday, August 7, 2010

Memory and collection: performance and garbage collection

Lately,
I have been working on an application which requires me to load a large number of objects in memory.
Lemme define large... around 25 lakh (2.5 million) objects in memory, all distributed across around 5 HashMaps.

And then, I need to refresh this data every 15 minutes, basically loading a completely new set of data and removing the existing data.

I faced the following problems:

1) OutOfMemoryError
2) Performance

Let me tell you how I handled each:
1) The reason for the occurrence of this error is that the Java heap space allocated for objects runs out of memory... because my refresh interval was very short (15 mins) and the garbage collector runs at its own leisure.....

While you cannot exactly force the Garbage Collector to be at your beck and call, you can make it behave in a manner that solves a lot of problems. In my case, I did the following:
I instructed the Garbage Collector to run incrementally, so that it runs every once in a while instead of waiting for a large number of objects to become collectable....

the Java option you need to set while starting your server is

-Xincgc

Of course, you also need to allocate enough heap space etc. so that you don't run out of memory.

I suggest you keep the Xms and Xmx values the same

-Xms1024m -Xmx1024m

This will ensure that all the memory gets allocated at the same time and cycles of incremental memory allocation are avoided.

2)
The performance problem was occurring because of many reasons and, evidently, there are many ways to solve the problem.

The fastest map when dealing with a large number of objects is definitely HashMap (not sorted maps like TreeMap or any other map), for all kinds of objects. So, use HashMap.

I was dealing with Double objects and storing them in a HashMap. Believe me, changing the operations from Double to double can increase the performance by more than 200%. However, you cannot store a double primitive in a HashMap directly. So, I made a wrapper object and stored that in the HashMap.

The wrapper looked something like: class DoubleWrapper { private double value; }

I stored the DoubleWrapper objects in the HashMap and used the double value directly in all calculations... the performance increased multifold..
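For illustration, a fuller sketch of that wrapper-plus-primitive pattern might look like this (class, field and map names are illustrative):

import java.util.HashMap;
import java.util.Map;

// Holds a primitive double so calculations avoid repeatedly boxing and
// unboxing java.lang.Double values stored in the map.
class DoubleWrapper {
    private double value;

    DoubleWrapper(double value) { this.value = value; }

    double getValue()          { return value; }
    void setValue(double v)    { this.value = v; }
}

public class PriceCache {
    private final Map<String, DoubleWrapper> prices = new HashMap<>();

    public void put(String key, double price) {
        prices.put(key, new DoubleWrapper(price));
    }

    public double total() {
        double sum = 0.0;                      // all arithmetic stays on primitives
        for (DoubleWrapper w : prices.values()) {
            sum += w.getValue();
        }
        return sum;
    }
}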

Basically, use primitives everywhere possible..... The results will be unbelievable...


So, use incremental garbage collection, HashMaps and primitive variables, and your application is golden...:D

See you later