| 12:25 pm on Feb 16, 2004 (gmt 0)|
I can only think of a 2-step method.
If you have a column, for example 10 items long, in column A, and you paste the following into cell B1,
then copy that cell down column B to one cell short of column A's length (i.e. B1 to B9), the formula should iterate down the column.
You can then sum the total of the list in column B = the number of duplicates in A.
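The formula itself didn't survive the post above. Assuming it was the usual adjacent-comparison trick on a sorted column (B1 flags whether A1 equals A2, and so on, which is why column B runs one cell short), the Excel formula would be something like =IF(A1=A2,1,0). Here is the same idea sketched in Python, with made-up data:

```python
# Sketch of the two-step method above, assuming column A is sorted:
# each "cell" in B flags whether the next entry in A equals the current one
# (Excel equivalent, assumed: =IF(A1=A2,1,0) copied down to B9),
# then column B is summed to give the duplicate count.
column_a = sorted(["cat", "dog", "dog", "fish", "cat", "bird"])
column_b = [1 if column_a[i] == column_a[i + 1] else 0
            for i in range(len(column_a) - 1)]
duplicate_count = sum(column_b)  # number of duplicate entries in A
```

Note this only works on a sorted column, since duplicates have to sit next to each other to be caught.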
| 12:27 pm on Feb 16, 2004 (gmt 0)|
Pivot tables perhaps
pivot tables [techtv.com]
either that, or whack them into Access and do a group-by-and-count query on your column
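What a pivot table's count (or the Access group-by) does is tally occurrences per value. A rough Python sketch of that tally, using collections.Counter and made-up phrases:

```python
from collections import Counter

# Hypothetical key phrases, standing in for column A
phrases = ["seo tips", "web design", "seo tips", "hosting", "seo tips"]

# Count occurrences per phrase, like a pivot table's Count per row label
counts = Counter(phrases)

# Keep only the phrases that appear more than once
dupes = {phrase: n for phrase, n in counts.items() if n > 1}
```

Unlike the formula approach, this doesn't care what order the phrases are in.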
| 12:35 pm on Feb 16, 2004 (gmt 0)|
Perhaps going another direction?
1) export it as CSV
2) import it into a MySQL DB via phpMyAdmin
3) do a query for duplicates
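Steps 2 and 3 boil down to a group-by-and-count query. A sketch of that query using Python's built-in sqlite3 in place of MySQL (the table and column names are made up):

```python
import sqlite3

# In-memory stand-in for the MySQL/Access table; names are hypothetical
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE phrases (phrase TEXT)")
conn.executemany("INSERT INTO phrases VALUES (?)",
                 [("seo tips",), ("hosting",), ("seo tips",)])

# Step 3: query for duplicates via GROUP BY ... HAVING
rows = conn.execute(
    "SELECT phrase, COUNT(*) AS n FROM phrases "
    "GROUP BY phrase HAVING COUNT(*) > 1"
).fetchall()
```

The same SELECT would work essentially unchanged in MySQL or Access.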
| 12:47 pm on Feb 16, 2004 (gmt 0)|
Thanks guys.... I have to admit that formulas are my weak point. I'm gonna mess around a bit and try out the options you guys posted. I'm really pressed for time as well, because I have over a thousand key phrases in column A, and I can't actually change the order they're in, which makes it even more difficult :(
| 12:54 pm on Feb 16, 2004 (gmt 0)|
no worries guys...got it:)
| 1:19 pm on Feb 16, 2004 (gmt 0)|
When I am looking for duplicates, I find the subtotal feature is good. Sort the column you want to find the dupes in, then use the Subtotal > Count feature on that column. Reduce the outline to the #2 level. Highlight the area (not the whole sheet) with the info on it and sort by the column with the subtotals. Any dupes will have a count of two or higher.
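The subtotal pass above amounts to: sort, count each run of equal values, and flag anything with a count of 2 or higher. A Python sketch of the same logic with itertools.groupby (sample data is made up):

```python
from itertools import groupby

# Subtotal-style counts: sort first (groupby only groups adjacent equal
# values, just as Subtotal > Count needs a sorted column), then count
# each run; anything with a count of 2 or higher is a dupe
column = ["b", "a", "c", "a", "b", "a"]
subtotals = {key: len(list(run)) for key, run in groupby(sorted(column))}
dupes = [key for key, n in subtotals.items() if n >= 2]
```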
| 1:26 pm on Feb 16, 2004 (gmt 0)|
this is a good one
I'll keep it for my own use
| 4:19 pm on Feb 16, 2004 (gmt 0)|
A spreadsheet app is the wrong place for the sort of queries you want. It wasn't designed for them.
Consider converting your data from Excel into an Access database and then using a "group by" query.
| 4:30 pm on Feb 16, 2004 (gmt 0)|
>A spreadsheet app is a wrong place for doing the sort of queries you want.
Actually, Excel is a pretty powerful data analysis tool, although I agree for duplicate finding it leaves something to be desired. I use an add-in called digDB that really improves duplicate handling in Excel. And unlike most such tools, there is a free trial available.
| 4:59 pm on Feb 16, 2004 (gmt 0)|
This isn't a formula. Formulas are too difficult. But I would do this:
Select your column, including the heading.
Data > Filter > Advanced Filter > Copy to another location > Unique records only.
Click in the Copy To box and clear it if it has anything in it, then click on an empty cell (and make sure it's in row one).
That will select all the uniques and paste them to the new location. Subtract that number from your total records.
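The arithmetic behind that recipe, total records minus unique records equals the number of duplicates, sketched in Python (sample data is made up):

```python
# Advanced-filter idea: extract the unique records, then subtract
records = ["x", "y", "x", "z", "y", "x"]
unique_records = set(records)          # the "unique records only" copy
duplicate_count = len(records) - len(unique_records)
```

6 records minus 3 uniques leaves 3 duplicates, and no sorting is required.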
| 5:27 pm on Feb 16, 2004 (gmt 0)|
Hmmm. I consider myself pretty good on excel and I've never done that. Thanks for the info.
| 7:36 pm on Feb 16, 2004 (gmt 0)|
I consider myself pretty useless on Excel, so I have to find work-arounds. One day, when I grow up, I will do a macro and perhaps even edit a graph by myself!
I will let you all know, in case you want to congratulate me and arrange a medal.
| 8:13 pm on Feb 16, 2004 (gmt 0)|
that's a great method, 1cicle, I had no idea. I do have an Excel macro to eliminate duplicates that I've been using for a while; I had to slightly modify something else from the net, but it has worked great for me.
But not having to bring that macro with me: priceless.
| 9:28 am on Feb 17, 2004 (gmt 0)|
thanks guys. now i know exactly where to post if i have any excel issues again:)
| 3:13 am on Feb 20, 2004 (gmt 0)|
To expand on 1cicle's response, I believe your original column has to be sorted first. I could be confusing this with a VLOOKUP requirement... I always sort first anyway.
The Data > Filter > Advanced Filter > Copy to another location > Unique records only option doesn't actually tell you what's unique; it just gives you a list without the duplicates.
Couldn't live without that little gem.