
Re: [Xen-devel] [PATCH] xenctld - a control channel multiplexing daemon



On Mon, 2005-01-24 at 09:33, Ronald G. Minnich wrote:
> On Fri, 21 Jan 2005, Jared Rhine wrote:

> Again, this is not an issue of esthetics, it's an issue of measured 
> performance. 

Where's the performance issue?  For instance, if you had one daemon that
forked off two children, would that be better than having two individual
processes?  Is it simply a matter of having that extra file on disk?
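
Just to make the question concrete, here's roughly the shape I have in
mind -- a minimal sketch only, not xenctld code, and the worker names
(console_worker, event_worker) are made-up placeholders for whatever
the two control channels would actually do:

/*
 * One parent daemon forking two child workers.  Purely illustrative.
 */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

static void console_worker(void)
{
    /* placeholder: would service one control channel */
    printf("console worker: pid %d\n", getpid());
    _exit(0);
}

static void event_worker(void)
{
    /* placeholder: would service the other control channel */
    printf("event worker: pid %d\n", getpid());
    _exit(0);
}

int main(void)
{
    pid_t pids[2];
    void (*workers[2])(void) = { console_worker, event_worker };

    for (int i = 0; i < 2; i++) {
        pids[i] = fork();
        if (pids[i] < 0) {
            perror("fork");
            exit(1);
        }
        if (pids[i] == 0)
            workers[i]();   /* child never returns */
    }

    /* parent just reaps its children */
    for (int i = 0; i < 2; i++)
        waitpid(pids[i], NULL, 0);

    return 0;
}

The alternative is simply starting two independent binaries from an
init script.  Either way the kernel ends up scheduling two processes,
which is why I don't see where a measured difference would come from.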

I'm not sure I understand how this could create a performance problem.
I simply don't know that much about cluster performance and am very
curious :-)

It seems like the only explanation that would make sense is the cost of
having multiple processes, but unfortunately that is the exact opposite
of the performance behavior you'd expect on an n-way system...

> ron
-- 
Anthony Liguori
Linux Technology Center (LTC) - IBM Austin
E-mail: aliguori@xxxxxxxxxx
Phone: (512) 838-1208




_______________________________________________
Xen-devel mailing list
Xen-devel@xxxxxxxxxxxxxxxxxxxxx
https://lists.sourceforge.net/lists/listinfo/xen-devel