How can I use grep with a date format and a unique value?
I have a huge data list. My data looks like this:

"[01/Dec/2011:20:53:04 +0900] ","COMZ","90.663.65.61","21.123.31.100","250","CONNECT","t.ierz.er:443","13127","836"
"[01/Dec/2011:22:20:01 +0900] ","COMZ","90.663.65.61","21.123.31.100","250","CONNECT","t.ierz.er:443","13127","836"
"[02/Dec/2011:24:33:04 +0900] ","COMZ","20.663.65.61","2.123.91.100","220","CONNECT","t.ierz.er:443","13127","836"

How can I get output with one line per unique date and IP address pair, like this?

01/DEC/2011 90.663.65.61 21.123.31.100

At the moment the same pair shows up over and over with only the timestamp changing, so I can't reduce it to unique values:

[01/Dec/2011:20:53:04 +0900] 90.663.65.61 21.123.31.100
[01/Dec/2011:22:20:01 +0900] 90.663.65.61 21.123.31.100

Code:

cat file.csv | awk -F'"' '{print $2,$6,$8}' | sort | uniq -c | sort -n














linux grep






– warezers, asked Jan 23 at 7:13, edited Jan 23 at 7:38

  • Can you share a few more lines of example data?

    – msp9011
    Jan 23 at 7:28

  • You probably want uniq -u? Also, it's not clear from your question what you're trying to accomplish. So you want to see unique IPs and unique dates?

    – Panki
    Jan 23 at 7:28

  • Yes, unique IP addresses with the date. Because the seconds in that date format differ, I can't get unique values.

    – warezers
    Jan 23 at 7:36




















4 Answers
You should use sed to complete your request.

Here is a command that should work for your case:

 cat file.csv | awk -F'"' '{print $2,$6,$8}' | sed -E 's#(:[[:digit:]]{2}){3} \+0900##' | sort | uniq -c | sort -n

It will strip the time of day and keep only this format: [01/Dec/2011] 90.663.65.61 21.123.31.100.
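
For reference, the same trimming can also be done in awk alone by cutting the timestamp field down to its date part. This is only a sketch, not part of the original answer, and it assumes the first quoted field always starts with "[DD/Mon/YYYY":

 awk -F'"' '{print substr($2, 2, 11), $6, $8}' file.csv | sort | uniq -c | sort -n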






– jayooin, answered Jan 23 at 7:33
Try this,

 awk -F '[:"[]' '{print $3" "$10" "$12}' file.csv | sort | uniq
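
If per-pair counts are also wanted (see the comment exchange below), the same split can presumably be fed through a counting stage instead of a plain uniq; a sketch, not part of the original answer:

 awk -F '[:"[]' '{print $3" "$10" "$12}' file.csv | sort | uniq -c | sort -n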





– msp9011, answered Jan 23 at 7:31, edited Jan 23 at 7:37
  • That's better, but you should include the commands that were there at first: uniq -c | sort -n.

    – jayooin
    Jan 23 at 7:45

  • @jayooin Is that required? I think the OP's requirement is to get the unique values rather than their counts.

    – msp9011
    Jan 23 at 8:12



















As your data seems to be in CSV format, you might be able to use csvsql from csvkit; see https://csvkit.readthedocs.io/en/1.0.3/scripts/csvsql.html#

Assuming your file is named data.csv,

csvsql -H --query 'SELECT a,c,d FROM data GROUP BY c,d' data.csv

prints

a,c,d
[02/Dec/2011:24:33:04 +0900] ,20.663.65.61,2.123.91.100
[01/Dec/2011:22:20:01 +0900] ,90.663.65.61,21.123.31.100

See also https://unix.stackexchange.com/a/495010/330217
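
If only the date part of the first column is wanted in the result, a variant query along these lines might work; this is a sketch assuming csvsql's default SQLite dialect (substr and upper are SQLite functions), and it is not part of the original answer:

csvsql -H --query 'SELECT DISTINCT upper(substr(a, 2, 11)) AS day, c, d FROM data' data.csv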






– Bodo, answered Jan 23 at 9:04
I always recommend using a CSV parser for CSV data. Here's ruby:

ruby -rcsv -ne 'CSV.parse($_) do |row|
  puts [row[0][1..11].upcase, row[2], row[3]].join " "
end' file.csv | sort -u

01/DEC/2011 90.663.65.61 21.123.31.100
02/DEC/2011 20.663.65.61 2.123.91.100
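
If counts per date/IP pair are wanted rather than just the unique combinations, the tail of the pipeline can presumably be swapped for a counting stage; a sketch, not from the original answer:

ruby -rcsv -ne 'CSV.parse($_) do |row|
  puts [row[0][1..11].upcase, row[2], row[3]].join " "
end' file.csv | sort | uniq -c | sort -n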





– glenn jackman, answered Jan 23 at 13:24
