PerlMonks
Re^4: CGI table problems

by rashley (Scribe)
on Nov 10, 2006 at 08:49 UTC



in reply to Re^3: CGI table problems
in thread CGI table problems

Actually, the problem is within renderPartReadView.

It creates a vertical list of label/value pairs, which is fine.

The problem is that when the value is a list, the values end up side by side, and I want them stacked vertically.
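To illustrate the difference (a minimal sketch; `@values` is just sample data): CGI.pm's distributive array-ref behavior puts all values into cells of a single row, while a nested table with one row per value stacks them:

```perl
use strict;
use warnings;
use CGI;

my $cgi    = CGI->new;
my @values = ('one', 'two');

# Side by side: passing an array ref makes CGI.pm distribute the
# values across td cells of a single row.
my $row = $cgi->Tr($cgi->td(\@values));

# Stacked: one row per value inside a nested table.
my $stacked = $cgi->table(map { $cgi->Tr($cgi->td($_)) } @values);

print "$row\n$stacked\n";
```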

Replies are listed 'Best First'.
Re^5: CGI table problems
by pKai (Priest) on Nov 11, 2006 at 11:09 UTC
    The same argument applies here: replace the td content with a table of rows, one td per row:

    $data .= $cgi->td({-valign=>'top'},
        $cgi->table(
            map { $cgi->Tr($cgi->td($_)) } @{renderPartReadView($cgi, $lca_dbm, $am)}
        )
    );
      I finally got this to work just the way I wanted. I checked the result to see if it was an array ref, then formed a sub-table if it was:
      # massage multi-values for formatting
      if ( ref $encodedvalue eq 'ARRAY' ) {
          my $listtable = $cgi->start_table({-border => 0});
          for my $item (@$encodedvalue) {
              $listtable .= $cgi->Tr({}, $cgi->th($item));
          }
          $listtable .= $cgi->end_table();
          $data .= $cgi->Tr({},
              $cgi->td({-class => 'label'}, $attrHash->{ATTRNAME}),
              $cgi->td({-class => 'value'}, ($listtable || $cgi->p('&nbsp;')))
          );
      }
      else {
          $data .= $cgi->Tr({},
              $cgi->td({-class => 'label'}, $attrHash->{ATTRNAME}),
              $cgi->td({-class => 'value'}, ($encodedvalue || $cgi->p('&nbsp;')))
          );
      }
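      The two branches can also be collapsed with a ternary that builds the cell content first. This is a sketch under the same assumptions as the code above; `$encodedvalue` and `$label` here are hypothetical sample data standing in for the thread's variables:

      ```perl
      use strict;
      use warnings;
      use CGI;

      my $cgi = CGI->new;

      # Sample data standing in for the thread's $encodedvalue and
      # $attrHash->{ATTRNAME}.
      my $encodedvalue = [ 'alpha', 'beta', 'gamma' ];
      my $label        = 'Part numbers';

      # Array ref: stack each element in its own row of a borderless
      # sub-table. Scalar: emit it as-is (or a placeholder if empty).
      my $cell = ref $encodedvalue eq 'ARRAY'
          ? $cgi->table({-border => 0},
                map { $cgi->Tr($cgi->td($_)) } @$encodedvalue)
          : ($encodedvalue || $cgi->p('&nbsp;'));

      my $data = $cgi->Tr({},
          $cgi->td({-class => 'label'}, $label),
          $cgi->td({-class => 'value'}, $cell),
      );

      print "$data\n";
      ```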
