PerlMonks  

Re: Anyone familiar with DBD:Mock?

by InfiniteSilence (Curate)
on Feb 09, 2025 at 05:41 UTC ( [id://11163957] )


in reply to Anyone familiar with DBD:Mock?

From the POD: "There is no distinct area where using this module makes sense..." and I tend to agree. I would put my real test data in CSV files and use DBD::CSV in my test scripts instead.
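A minimal sketch of that approach, assuming a fixture directory t/data containing a users.csv file with an "id,name" header line (both names are hypothetical):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Point DBD::CSV at the directory holding the CSV fixture files.
my $dbh = DBI->connect('dbi:CSV:', undef, undef, {
    f_dir      => 't/data',  # hypothetical fixture directory
    csv_eol    => "\n",
    RaiseError => 1,
}) or die $DBI::errstr;

# Each CSV file in f_dir is queryable as a table named after the file.
my $sth = $dbh->prepare('SELECT name FROM users WHERE id = ?');
$sth->execute(1);
my ($name) = $sth->fetchrow_array;
print "user 1 is $name\n";
```

The advantage over mocking is that the test exercises real DBI plumbing, and the fixtures are plain text you can diff and edit by hand.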

Celebrate Intellectual Diversity

Replies are listed 'Best First'.
Re^2: Anyone familiar with DBD:Mock?
by cavac (Prior) on Feb 17, 2025 at 12:57 UTC

    DBD::SQLite might also work fine. There are lots of tools for editing a SQLite database by hand. I use sqlitebrowser for that.
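    A sketch of the DBD::SQLite route, using an in-memory database so each test run starts from a clean slate (the users table here is a made-up example):

```perl
use strict;
use warnings;
use DBI;

# :memory: creates a throwaway database that vanishes
# when $dbh is disconnected or goes out of scope.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 1 });

$dbh->do('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
$dbh->do('INSERT INTO users (name) VALUES (?)', undef, 'alice');

my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM users');
print "rows: $count\n";
```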

    Of course, it rather depends on what OP is trying to achieve. Very basic SQL statements should work the same (more or less) in every database. Mocking a specific SQL dialect is a lot harder, and so are error handling and transactions.

    Unless there are specific reasons to simulate a database, my recommendation would be to use the real thing for development. Many databases have support for bulk insertion of data, so it would not be too complicated to load a bunch of test data. See my use of COPY for PostgreSQL: Re: [OT] The Long List is Long resurrected
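    For PostgreSQL, the bulk load can be driven from DBI via DBD::Pg's COPY support. A hedged sketch, with hypothetical connection details and table layout:

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection parameters; adjust dbname and credentials.
my $dbh = DBI->connect('dbi:Pg:dbname=testdb', 'testuser', 'secret',
    { RaiseError => 1, AutoCommit => 0 });

# COPY ... FROM STDIN streams rows in bulk, far faster than
# issuing one INSERT per row.
$dbh->do('COPY users (id, name) FROM STDIN');
$dbh->pg_putcopydata("$_->[0]\t$_->[1]\n")
    for ([1, 'alice'], [2, 'bob'], [3, 'carol']);
$dbh->pg_putcopyend;
$dbh->commit;
```

pg_putcopydata/pg_putcopyend are the DBD::Pg methods for feeding a COPY stream; rows are tab-separated by default.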

    PerlMonks XP is useless? Not anymore: XPD - Do more with your PerlMonks XP
    Also check out my sister's artwork and my weekly webcomics
