PerlMonks
Net::LDAP question

by seeker (Curate)
on Dec 13, 2000 at 11:58 UTC ( [id://46401]=perlquestion: print w/replies, xml ) Need Help??


seeker has asked for the wisdom of the Perl Monks concerning the following question:

I have been asked to prepare a load tester for an LDAP implementation. The implementation has added some new elements to a directory, and I am very unfamiliar with LDAP. Will Net::LDAP allow me to use these new elements, or will I have to write this thing in C (ugh!)? If someone has tried something similar, please let me know whether it can be done.

Replies are listed 'Best First'.
Re: Net::LDAP question
by t0mas (Priest) on Dec 13, 2000 at 12:50 UTC
    I'm not really sure what you mean by "load tester". Is it a program that loads data into the database, or one that generates load on the LDAP server?

    If you want to load data into your database, Net::LDAP will do just fine. You can access any attribute in your schema with it.
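
    For instance, here is a minimal sketch of adding an entry that carries site-defined schema elements with Net::LDAP. The hostname, credentials, base DN, and the custom class/attribute names are placeholders, not anything from the original post:

    ```perl
    use strict;
    use warnings;
    use Net::LDAP;

    # Connect and bind -- server name and credentials are placeholders.
    my $ldap = Net::LDAP->new('ldap.example.com') or die "$@";
    my $mesg = $ldap->bind('cn=admin,dc=example,dc=com', password => 'secret');
    die $mesg->error if $mesg->code;

    # Add an entry using custom schema elements. Net::LDAP does not care
    # whether an attribute is standard or site-defined, as long as the
    # server's schema accepts it.
    $mesg = $ldap->add(
        'cn=test0001,ou=people,dc=example,dc=com',
        attr => [
            objectClass  => [ 'top', 'person', 'myCustomClass' ],  # hypothetical class
            cn           => 'test0001',
            sn           => 'Tester',
            myNewElement => 'some value',                          # hypothetical attribute
        ],
    );
    die $mesg->error if $mesg->code;

    $ldap->unbind;
    ```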

    I tried to write a load generator with Net::LDAP, but LDAP servers are generally _very_ fast, so Perl has trouble keeping up with them. My program ate all the resources of the test machine, which was three times as fast as the server, while the server sat idle :)
    I finally had to do a multi-threaded C thing...

    If you want to do a load generator, do it in C (sad to say), or try out Net::LDAPapi (it's pretty old, but it might be a way to do it in Perl).
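
    Before reaching for C, one could at least fan the load out over several Perl processes, each with its own connection, so a single interpreter is no longer the bottleneck. A rough sketch along those lines (server name, base DN, filter, and the worker/request counts are all placeholders):

    ```perl
    use strict;
    use warnings;
    use Net::LDAP;

    my $workers  = 4;      # number of concurrent client processes
    my $requests = 1000;   # searches per worker

    for my $w (1 .. $workers) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        next if $pid;      # parent keeps forking

        # Child: open its own connection and run its share of the load.
        my $ldap = Net::LDAP->new('ldap.example.com') or die "$@";
        $ldap->bind;       # anonymous bind
        for (1 .. $requests) {
            my $mesg = $ldap->search(
                base   => 'ou=people,dc=example,dc=com',
                filter => '(cn=test*)',
            );
            warn $mesg->error if $mesg->code;
        }
        $ldap->unbind;
        exit 0;
    }
    wait() for 1 .. $workers;   # reap the children
    ```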

    /brother t0mas

Node Type: perlquestion [id://46401]
Approved by root