PerlMonks  

Re^5: Perl Contempt in My Workplace

by marto (Cardinal)
on May 21, 2021 at 09:45 UTC ( [id://11132844] )


in reply to Re^4: Perl Contempt in My Workplace
in thread Perl Contempt in My Workplace

"For DataTables this is especially true. First of all, the JS part has tons of options and tons of plugins. So, a completely generic backend would have to replicate everything, potentially making it a big mess of spagetti code that moves at the speed of a glacier."

I don't believe this is true; you could write code to generate these options from arguments passed to a Perl constructor, if required.
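To illustrate the point, here is a minimal sketch of that idea: a constructor accepts whatever DataTables options the caller wants and emits the JSON init block verbatim, so the Perl side never has to replicate every plugin. The package name My::DataTable and the option values are purely illustrative, not a real module.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use JSON::PP;

package My::DataTable;

sub new {
    my ($class, %opts) = @_;
    # Unknown options are passed through verbatim to DataTables,
    # so new JS-side features need no Perl-side changes.
    return bless { %opts }, $class;
}

sub init_json {
    my ($self) = @_;
    # canonical() sorts keys for stable, testable output
    return JSON::PP->new->canonical->encode({ %$self });
}

package main;

my $dt = My::DataTable->new(
    serverSide => JSON::PP::true,
    ajax       => '/cgi-bin/table.pl',   # hypothetical endpoint
    pageLength => 25,
);
print $dt->init_json, "\n";
# {"ajax":"/cgi-bin/table.pl","pageLength":25,"serverSide":true}
```

The resulting string can be dropped straight into the page's DataTables initialisation; only the server-side query handler needs bespoke code.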

"And secondly, if you use paging or scrolling in DataTables, especially in combination with filters and JOINs over multiple tables, this will just not work with some SQL statement thrown together by a generic module."

I often find this to be the case in various products; of course, your mileage may vary. The user in question is inflexible on this issue, however, and efforts to convince them otherwise seem like a waste of time and effort.

Update: slight rewording.

Replies are listed 'Best First'.
Re^6: Perl Contempt in My Workplace
by vkon (Curate) on May 28, 2021 at 08:12 UTC
    with all due respect - why inflexible?

    I've read your post Re^6: Perl Contempt in My Workplace carefully - what did I say that was "inflexible"?
    In that reply you stated that you only return JSONs, but JSONs are not possible for 10_000_000 records.

      JSONs are not possible for 10_000_000 records

      JSON is a data format. It has no limitations on the size of the data.


      🦛

        Technically, yes. But nearly all JSON parsers I've seen are designed to slurp everything in at once and turn it into an in-memory data structure. So for very large files, you might (or might not) have to cobble together a custom parser that takes a stream-as-you-go approach.

        Of course, that's where Perl comes into its own. Munching insanely huge text files is what it was designed for in the first place ;-)

        perl -e 'use Crypt::Digest::SHA256 qw[sha256_hex]; print substr(sha256_hex("the Answer To Life, The Universe And Everything"), 6, 2), "\n";'
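For what it's worth, you may not even need a custom parser: the incremental interface that JSON::PP (core) shares with JSON::XS already supports stream-as-you-go parsing of concatenated JSON records. A minimal sketch, with the "stream" simulated by chunks split at an arbitrary byte boundary:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use JSON::PP;

my $json  = JSON::PP->new;
my $count = 0;

# Simulated feed: records arrive in chunks that may split a record
my @chunks = ('{"id":1}{"id', '":2}{"id":3}');

for my $chunk (@chunks) {
    $json->incr_parse($chunk);               # buffer the partial input
    while (defined(my $rec = $json->incr_parse)) {
        $count++;                            # handle one record, then discard
    }
}
print "$count records\n";                    # 3 records
```

Each complete record is decoded and released as soon as its closing brace arrives, so memory stays flat regardless of how many records the stream contains.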

      JSONs are not possible for 10_000_000 records

      As hippo says, JSON is just a data format.

      Let's give it a try, testing with some generated md5 values:

      testdb=# select count(*) from vkon_json;
        count
      ----------
       10000000
      (1 row)

      testdb=# select * from vkon_json limit 3;
                           js                     | id
      --------------------------------------------+----
       {"f1": "c4ca4238a0b923820dcc509a6f75849b"} |  1
       {"f1": "c81e728d9d4c2f636f067f89cc14862c"} |  2
       {"f1": "eccbc87e4b5ce2fe28308fd9f2a7baf3"} |  3
      (3 rows)

      -- retrieve:
      testdb=# select * from vkon_json where js @> '{"f1": "d1ca3aaf52b41acd68ebb3bf69079bd1"}';
                           js                     |    id
      --------------------------------------------+----------
       {"f1": "d1ca3aaf52b41acd68ebb3bf69079bd1"} | 10000000
      (1 row)

      Time: 0.679 ms

      Less than a millisecond. What do you think? Possible?

      (I could have put everything into a 1-column, 1-row table but that just seems too dumb. Possible, though, and will perform just as fast)
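The same pattern can be reproduced in a self-contained way from Perl via DBD::SQLite, which is widely installed (this assumes a DBD::SQLite built with SQLite's JSON functions, which recent versions enable by default; the table name and row count are illustrative, scaled down from 10M for a quick run):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });
$dbh->do('CREATE TABLE t (id INTEGER PRIMARY KEY, js TEXT)');

# Bulk-load JSON rows inside one transaction
my $ins = $dbh->prepare('INSERT INTO t (js) VALUES (json_object(?, ?))');
$dbh->begin_work;
$ins->execute('f1', "value$_") for 1 .. 1000;
$dbh->commit;

# An expression index is what keeps point lookups fast at 10M rows
$dbh->do(q{CREATE INDEX t_f1 ON t (json_extract(js, '$.f1'))});

my ($id) = $dbh->selectrow_array(
    q{SELECT id FROM t WHERE json_extract(js, '$.f1') = ?},
    undef, 'value1000');
print "found id $id\n";                  # found id 1000
```

As with the PostgreSQL `@>` query above, the lookup cost comes from the index, not from the fact that the column happens to hold JSON.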


      Inflexible in that you make claims that have no basis in reality beyond your own stance (e.g. Re^9: Perl Contempt in My Workplace), and keep asserting the same flawed responses despite previous corrections, as demonstrated above.

        My own recent experience of the problem does have a basis in reality, because it is what actually happened to me.

        I assume that you do not think that what I said was untrue (because I haven't lied).

        I honestly tried to find an acceptable solution in Perl, but decided to look elsewhere because the Perl solutions seemed incomplete to me.
        It could be that my decision to switch to another solution was wrong - I can accept that my search-fu is weak or that my intuition failed this time.

        2 questions to you:

        • the above statement - is it also flawed? If so, in what way is it flawed?
        • the more important question:
          can you point me to a ready Perl solution for a single large SQL table on a server, with a DataTables frontend offering Excel-like filtering and searching?
          I will switch to it immediately.
