torsdag 13 juli 2017

JSON fitting

It turned out that the FBX fitting method had flaws as well. However, since then I have come up with yet another way to extract the information about mesh and bone locations, and this time it seems to work, at least for all files that I have tried.

The idea is simply to write an export script for Daz Studio that writes exactly the necessary information to a custom file. There were some initial hurdles, because I have not written scripts for Daz Studio before. The scripting language is not Python but DAZ Script, a JavaScript-like language built on Qt Script; it is not terribly difficult, just something I have little experience with. What was tricky was that the API is poorly documented, but I managed to find some examples on the Internet that were sufficient for my simple script.


The script is located in the folder to_daz_studio and is called export_basic_data.dsa. Before we can use it, we need to tell Daz Studio about it. This is done as follows.


  1. In Daz Studio, select Window > Workspace > Customize.
  2. In the Customize window that opens, right-click on Custom and select Create New Custom Action.
  3. Change the Menu Text to Export Basic Data (or whatever you want the script's name to be). Make sure that DAZ Script File is selected and press the ... button to the right.
  4. A file selector opens. Navigate to the .dsa file and Open it.
  5. The file name now appears in the File field. Press Accept to close the dialog window.
  6. The script now appears as a subitem under Custom. In the pane to the right, choose the Menus tab, and drag the script to the place where you want it to appear. Since it is an export script, I find it natural to group it with the Import and Export menu items. Press Apply to make the changes take effect, and then press Accept to close the dialog.


The new export script now appears in the File menu. Select it to export the basic data for the current scene.
Export the scene with the same name as the .duf file, but with the extension .json instead. My scene was unimaginatively called test.duf, so the file name must be test.json. The script actually ignores the extension, so if you just call the file test it will still get the right extension automatically.


A dialog informs you when the file has been exported. I don't know what happens if the script fails for some reason, because that has actually never happened so far.


Finally, in Blender, select Json File for Mesh Fitting. The scene should now import correctly, and the meshes and bones should look as they do in Daz Studio.
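For the curious, here is roughly what the Blender side of such a fitting step could look like. The JSON layout (a "bones" dictionary with center_point and end_point entries), the helper name and the scale factor are my own illustration, not the importer's actual format; the snippet targets the Blender 2.7x Python API of the time, and axis conversion between Daz and Blender is omitted.

    import json
    import bpy
    from mathutils import Vector

    def fit_bones_from_json(filepath, rig, scale=0.01):
        # Assumed layout (illustrative only):
        # {"bones": {"lShin": {"center_point": [x,y,z], "end_point": [x,y,z]}, ...}}
        # Coordinates are taken to be Daz centimeters, hence the 0.01 scale factor.
        with open(filepath, "r") as fp:
            data = json.load(fp)

        bpy.context.scene.objects.active = rig
        bpy.ops.object.mode_set(mode='EDIT')
        for bname, binfo in data["bones"].items():
            eb = rig.data.edit_bones.get(bname)
            if eb is None:
                continue
            eb.head = Vector(binfo["center_point"]) * scale
            eb.tail = Vector(binfo["end_point"]) * scale
        bpy.ops.object.mode_set(mode='OBJECT')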


Here is the same character that we imported with Obj, Collada, and Fbx fitting in the previous post. There are, as far as I can tell, no obvious glitches. In particular, the feet fit correctly into the high-heeled boots, something that always required manual tweaking with the other fitting methods.

With this addition, I think that the Daz Importer is ready for the next stable release. I have not spent much time improving the code recently, but I have actually used the importer quite a bit myself, and to me the tool set feels rather finished.


fredag 30 juni 2017

FBX fitting

During the last few months I felt rather fed up with 3D, but very recently I started to play with the Daz Importer again, and there are a few new things to report.

First, Daz has released a new character base, Genesis 8, which is available if you update Daz Studio to the most recent version. I must have been doing something right, because adding this character to the Daz Importer was surprisingly painless, even if there probably are some glitches left. The main problems had to do with the rest pose; Genesis 3 is in T-pose but Genesis 8 is in A-pose.

The other news is that character fitting can now be done with fbx files. In some cases this works better than obj or Collada (but in other cases it is worse). In particular, there has been a problem with the skeleton when some body parts are scaled. The Daz Importer uses the preview field in the duf file to determine the bone locations, and while this field takes most transformations into account, some scale transformations are ignored. However, they are included in the skeleton exported to fbx, so the importer can use this information to deduce the correct bone locations.



Here is a character in Daz Studio. The chest, hands and feet have been scaled, transformations that are not stored in the preview field in the duf file. In addition, the leg length and breast size have been increased.


Export the character as an fbx file. It is important that you export it as an ascii fbx file; the importer cannot read binary fbx and will generate an error if you try. The fbx format seems to change every year, but it does not seem to matter which fbx version you select, as long as it is ascii. The other settings do not seem to be important, except that Allow Degraded Skinning and Scaling should be selected, because otherwise the exporter complains.
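A note for the technically minded: the two flavors are easy to tell apart, because binary fbx files start with the magic string "Kaydara FBX Binary", whereas ascii fbx is plain text. A minimal check along these lines (just a sketch, not the importer's actual code) could be:

    def is_ascii_fbx(filepath):
        # Binary fbx files begin with the magic string "Kaydara FBX Binary";
        # ascii fbx files are ordinary text and do not.
        with open(filepath, "rb") as fp:
            header = fp.read(20)
        return not header.startswith(b"Kaydara FBX Binary")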


Here is the character imported with Mesh Fitting set to Obj File (blue clothes). The mesh contains all morphs, but the skeleton does not fit the mesh: the arms and head are too low, because chest scaling is ignored, and the hands and the feet are too small.


If we instead import the character with Mesh Fitting set to Dae File (red clothes), the armature fits the character mesh, but this is only because the scale transformations are ignored for the mesh as well. Oddly enough, they are present in the clothes meshes.



The red Collada character compared to the blue obj character.







There is also a problem with the boots. This must be a bug in Daz Studio's Collada exporter, because the boots look equally ugly when the dae file is imported directly with Blender's Collada importer.







Finally, here is the character imported with Mesh Fitting set to Fbx File (green clothes).


The mesh is the same as the blue obj mesh, and the skeleton fits the mesh.


When writing this post, I realize that there is still a problem with the bones. The duf file specifies both the bone head (center_point) and tail (end_point), but the fbx format only contains information about the bone's head and orientation. For simplicity, I chose to translate the tail the same distance as the head, but for the posed foot this gave the rather strange result shown here.

This is probably not such a huge problem, and it goes away if you choose to Merge Toes.
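In code, the translation rule described above amounts to something like the following (a schematic reconstruction, not the importer's exact code): the fbx file dictates the new head position, and the tail from the duf file is shifted by the same offset.

    from mathutils import Vector

    def fitted_head_tail(duf_head, duf_tail, fbx_head):
        # Move the head to the fbx position and translate the tail by the
        # same offset, so the bone keeps the length and direction it has
        # in the duf file.
        offset = Vector(fbx_head) - Vector(duf_head)
        return Vector(fbx_head), Vector(duf_tail) + offset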



However, there are other cases where fbx fitting produces worse results than obj or dae fitting. In some cases the solution might be to import several versions of the character, with different types of Mesh Fitting, and only keep the meshes and rigs that are most faithful to the original.





måndag 20 mars 2017

Low-poly versions

 

Assets from Daz Studio are usually awesome, but they often have a high polygon count. This can give problems with performance, especially if you, like me, use an old Win 7 box from 2009 with only six gigs of RAM. Often it is possible to reduce the polygon count with little effect on the final renders.

This is where the Low-poly Versions section at the bottom of the Setup panel comes in. In the unstable version it looks like this:



Here is the original character, weighing in at

Verts: 21556, Edges: 42599, Faces: 21098



The Make Quick Low-poly button reduces her footprint to

Verts: 7554, Edges: 17800, Faces: 10311

i.e. the number of vertices is reduced to roughly a third.

However, there are a number of problems, e.g. the black spots in the shoulder areas. The quick low-poly uses Blender's Decimate modifier, which does not respect UV seams. A decimated face can contain vertices in different UV islands, making the middle of the face stretch over random parts of the texture.

Moreover, since the quick low-poly applies a modifier, the mesh must not have any shapekeys. This is not such a big problem for Genesis 3, where facial expressions are implemented with bones, but for Genesis and Genesis 2 this means no facial expressions. The Apply Morphs button is duplicated in this section, so that any shapekeys can be removed before making a quick low-poly.
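Stripped of all bookkeeping, the quick low-poly boils down to something like this (a sketch of the essentials against the Blender 2.7x API, under my own assumptions, not the add-on's actual code):

    import bpy

    def quick_lowpoly(obj, ratio=0.35):
        bpy.context.scene.objects.active = obj
        # The Decimate modifier cannot be applied to a mesh with shapekeys,
        # so any morphs must be applied or removed first.
        if obj.data.shape_keys:
            bpy.ops.object.shape_key_remove(all=True)
        mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
        mod.ratio = ratio
        bpy.ops.object.modifier_apply(apply_as='DATA', modifier=mod.name)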

To deal with these problems, a second algorithm has been implemented, the faithful low-poly. It is faithful in the sense that UV seams are respected, so there are no faces stretching over unknown parts of the texture.

The footprint of the faithful low-poly is

Verts: 11885, Edges: 21136, Faces: 9306

This is slightly higher than the quick low-poly, but still a nice reduction.

There are still some problems. In the illustration the shoulders become very jagged when posed. This can be fixed by adding a Subdivision Surface modifier. If the number of view subdivisions is set to zero, the viewport performance should not suffer too much.
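In script form this fix is only a couple of lines (a hypothetical snippet; the modifier name is arbitrary):

    import bpy

    obj = bpy.context.active_object
    mod = obj.modifiers.new(name="Subsurf", type='SUBSURF')
    mod.levels = 0         # subdivisions shown in the viewport
    mod.render_levels = 1  # subdivisions used at render time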







Another problem is that the algorithm produces n-gons, which sometimes leads to bad results. This can be fixed by the Split n-gons button.
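The effect of such a button can be mimicked with bmesh, by triangulating only the faces with more than four corners (again a sketch under my own assumptions, not the add-on's code):

    import bpy
    import bmesh

    def split_ngons(obj):
        bm = bmesh.new()
        bm.from_mesh(obj.data)
        # Only faces with more than four corners are touched;
        # existing triangles and quads are left alone.
        ngons = [f for f in bm.faces if len(f.verts) > 4]
        bmesh.ops.triangulate(bm, faces=ngons)
        bm.to_mesh(obj.data)
        bm.free()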

After triangulating n-gons the character weighs

Verts: 11885, Edges: 31111, Faces: 19281

The number of edges and faces has gone up considerably, but I'm not sure that this affects performance, since complicated n-gons have been replaced by simple triangles. The number of vertices stays the same, which is what I think is most important for performance.




Hair can be particularly problematic for performance. The Aldora hair, which came with some earlier version of Daz Studio, has the impressive footprint

Verts: 123506, Edges: 240687, Faces: 117264

Reducing the weight of a 21,000 verts character makes little sense if we leave her with 123,000 verts worth of hair.


Making a faithful low-poly of the hair reduces the footprint to

Verts: 32574, Edges: 61862, Faces: 29370

without any notable reduction of quality.






A second iteration of the Faithful Low-poly button reduces the hair further

Verts: 9281, Edges: 16743, Faces: 7544

Compared to the original 123,000 verts, the footprint has gone down by more than a factor of ten!

We now start to see some bald spots on the head, but it should not be too difficult to fix them in edit mode.

If we instead make a quick low-poly in the last step, the footprint becomes

Verts: 10081, Edges: 20047, Faces: 10048

The baldness problem is perhaps a little less pronounced, but some manual editing is still needed.





Another way to reduce the poly-count for some hair types is provided by the Select Random Strands button. Here is another hair with an impressive footprint:

Verts: 114074, Edges: 167685, Faces: 55553


Not all of the strands are really needed, but it would be difficult to select a suitable set manually.

We don't want the skull cap to be selected, so we hide it (select one vertex on the skull, use ctrl-L to select connected vertices, and H to hide).



Press Select Random Strands to make the selection, and then press X to delete the verts. The Keep Fraction slider was set to the default 50% in this case. There is some tendency to baldness in edit mode, but that is because the skull cap is still hidden. The render looks quite OK, but the footprint has been reduced to

Verts: 56116, Edges: 82748, Faces: 27574
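The idea behind Select Random Strands can be reconstructed roughly as follows: treat every connected component of the hair mesh as one strand, and throw away a random fraction of the strands. The sketch below is my own guess at the mechanism (not the add-on's code); it leaves hidden geometry such as the skull cap alone.

    import random
    import bmesh
    import bpy

    def delete_random_strands(obj, keep_fraction=0.5, seed=1):
        random.seed(seed)
        bm = bmesh.new()
        bm.from_mesh(obj.data)

        # Group the vertices into connected components (strands).
        unvisited = set(bm.verts)
        strands = []
        while unvisited:
            start = unvisited.pop()
            strand = [start]
            stack = [start]
            while stack:
                v = stack.pop()
                for e in v.link_edges:
                    w = e.other_vert(v)
                    if w in unvisited:
                        unvisited.remove(w)
                        strand.append(w)
                        stack.append(w)
            strands.append(strand)

        # Delete a random subset of the strands, skipping strands that
        # contain hidden vertices (e.g. the skull cap).
        for strand in strands:
            if any(v.hide for v in strand):
                continue
            if random.random() > keep_fraction:
                for v in strand:
                    bm.verts.remove(v)

        bm.to_mesh(obj.data)
        bm.free()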


söndag 19 februari 2017

DAZ Importer version 1.1.


A new version of the DAZ Importer for Blender is finally available. There are many improvements compared to version 1.0, including
  • Multiple instances of the same asset are treated as different objects.
  • Object transformations have been improved.
  • Roll angles are chosen to make X the main bend axis, necessary for the MHX and Rigify rigs.
  • Improved material handling, utilising DAZ Studio materials.
  • Experimental import only using information in DAZ files.
  • and many, many bug fixes.

Download link: https://www.dropbox.com/s/i4aoh578u6ss2hz/import-daz-v1.1-20170222.zip

Instructions:
  1. Save the zip file somewhere on your computer.
  2. In Blender, go to File > User Preferences > Add-ons
  3. Press Install From File... and select the zip file.
  4. Enable the DAZ importer.
For more information see: http://diffeomorphic.blogspot.se/p/daz-importer-version-11.html






And here I added a quick animation with MakeWalk.

tisdag 27 september 2016

What's Missing in Quantum Gravity IV: Objections

Today I sum up some of the more obvious objections to the arguments presented in the previous posts, and explain why these objections are not relevant.

1. The diffeomorphism algebra in $d$ dimensions has no central extension at all when $d>1$

This is correct, and would appear to be a fatal blow to the existence of a multidimensional Virasoro algebra, since the ordinary Virasoro algebra is precisely the central extension of the diffeomorphism algebra on the circle. However, we saw in the previous post that the Virasoro extension in higher dimensions is not central, i.e. it does not commute with all diffeomorphisms. The Virasoro algebra in $d$ dimensions is essentially an extension of the diffeomorphism algebra by its module of closed $(d-1)$-forms. When $d=1$, a closed zero-form is a constant function, and the extension is a constant. In higher dimensions, a closed $(d-1)$-form does not commute with diffeomorphisms, but we still have a well-defined Lie algebra extension; it is just not a central one.

2. In QFT, there are no diff anomalies at all in four dimensions

Again this statement is correct and apparently fatal, because an extension of the diffeomorphism algebra is a diff anomaly by definition. However, the caveat is the phrase "in QFT". Virasoro-like extensions of the diffeomorphism algebra in four dimensions certainly exist, but they cannot arise within the framework of QFT. The reason is simple: as we saw in the previous post, the extension is a functional of the observer's trajectory $q^\mu(t)$, and since the observer does not explicitly appear in QFT, such a functional cannot be written down within that framework. To formulate the relevant anomalies, a more general framework which explicitly involves the observer is needed.

3. Diff anomalies are gauge anomalies which are always inconsistent

In contrast to the first two objections, this statement is blatantly wrong. Counterexample: the free subcritical string, which according to the no-ghost theorem can be consistently quantized despite its conformal gauge anomaly. Of course, this does not mean that every kind of gauge anomaly can be rendered consistent; the free supercritical string and the interacting subcritical string are examples where the conformal anomaly leads to negative-norm states in the physical spectrum and hence to inconsistency. But the crucial condition is unitarity rather than triviality.

A necessary condition for a gauge anomaly to be consistent is clearly that the algebra of anomalous gauge transformations possesses non-trivial unitary representations. The Mickelsson-Faddeev (MF) algebra, which describes gauge anomalies in Yang-Mills theory, fails this criterion. It was shown by Pickrell a long time ago that the MF algebra has no "nice" non-trivial unitary representations at all. Therefore, Nature must abhor this kind of gauge anomaly, which of course is in agreement with experiments; gauge anomalies in the Standard Model do cancel. But this argument has no bearing on the situation where the algebra of gauge transformations does possess "nice" unitary representations.

4. Gauge symmetries are redundancies of the description rather than proper symmetries

This is only true classically, and quantum-mechanically in the absence of a gauge anomaly. A gauge anomaly converts a classical gauge symmetry into an ordinary quantum symmetry, which acts on the Hilbert space rather than reducing it. As an example we consider again the free string in $d$ dimensions. Classically, the physical dofs are the $d-2$ transverse modes, and this remains true in the critical case. In the subcritical case, however, there are $d-1$ physical dofs, because in addition to the transverse modes the longitudinal mode has become physical; time-like vibrations remain unphysical.

5. There are no local observables in Quantum Gravity

This is essentially the same objection as the previous one, and it assumes that there are no diff anomalies. There can be no local observables in a theory with proper diffeomorphism symmetry because the centerless Virasoro algebra has no nontrivial unitary representations. If the diffeomorphism algebra acquires a nontrivial extension upon quantization, there is no reason why local observables should not exist.

Note that the same argument applies to Conformal Field Theory (CFT). There are no local observables in a theory with infinite conformal symmetry, but that is not a problem in CFT because the relevant symmetry is not infinite conformal but Virasoro; the central charge makes a difference.

lördag 17 september 2016

Barefoot dancer

Since the latest release of the DAZ-Blender importer I have mainly been tweaking materials. It is pretty important that the importer produces attractive materials automatically, because DAZ meshes contain lots of materials, and modifying all of them manually involves a lot of work. E.g., I think that a human character uses some seventeen different materials for different parts of the body. This has some advantages in DAZ Studio, because you can easily add makeup by modifying the lip, eyelash, fingernail or toenail materials, but having so many materials does complicate editing afterwards in Blender.

As a benchmark I will use the barefoot dancer, a standard product that ships (or used to ship, at least) with DAZ Studio. You can find pictures of her with Google, but since I don't know if it is legal to include other people's art on your blog (it is at least impolite), you have to look them up yourself.

So here is what I did. First I created two DAZ Studio (.duf) files, one containing the character in rest pose and the other the stage (Shaded haven). Then I created two Blender files. In the first I imported the stage, which required some tweaking because the importer does not always get object transformations right (working on this). Into the other I imported the character and merged the rigs (character, hair and each piece of clothing have separate rigs). Actually, every import was made twice, once with Blender Render as the render engine, in order to get the Blender Internal (BI) materials, and once with Cycles to get Cycles materials.

Finally, the set file was linked into the character file and the animation was loaded. I also loaded expressions and chose something suitable, and created a simple three-point lighting setup. Here is a render using Blender Internal.

If you change the render engine to Cycles, the importer creates a Cycles material instead. Now, I have never really used Cycles and I don't understand how to control lights in it - it seems like most of the lighting comes from a world surface light whereas the actual lamps play a very marginal role. But even if the lighting is extremely dull, I think the materials look reasonably good.


The renders were made with the unstable version of the DAZ Importer, which can be downloaded from
https://bitbucket.org/Diffeomorphic/import-daz/downloads.

 

onsdag 14 september 2016

What's Missing in Quantum Gravity III: The Connection Between Locality and Observer Dependence

In the first two posts (1, 2) in this series I argued that the missing ingredients in Quantum Gravity (QG) are locality and observer dependence, respectively. Now it is time to make the connection between these concepts. But before making this connection at the end of this post, we need some rather massive calculations. The main reference is my CMP paper (essentially the same as arXiv:math-ph/9810003). That paper is somewhat difficult to follow, even for me, so I recently updated it using more standard notation, arXiv:1502.07760.

Local operators in QG are only possible if the spacetime diffeomorphism algebra in $d$ dimensions, also known as the Lie algebra of vector fields $vect(d)$, acquires an extension upon quantization. This turns $vect(d)$ into the Virasoro algebra in $d$ dimensions, $Vir(d)$. In particular, $Vir(1)$ is the ordinary Virasoro algebra, the central extension of the algebra of vector fields on the circle, $vect(1)$. $Vir(1)$ is also a crucial ingredient in Conformal Field Theory (CFT), which successfully describes critical phenomena in two dimensions.

There is a simple recipe for constructing off-shell representations of $Vir(1)$:
1. Start with classical densities, which in the context of CFT are called primary fields.
2. Introduce canonical momenta for all fields.
3. Introduce a vacuum state which is annihilated by all negative-frequency operators, both the fields and their momenta.
4. Normal order to remove an infinite vacuum energy.

The last step introduces a central charge, turning $vect(1)$ into $Vir(1)$.

Naively one could try to repeat these steps in higher dimensions, but that does not work. The reason is that normal ordering introduces an unrestricted sum over transverse directions, which makes the putative central extension infinite and thus meaningless. Instead we introduce a new step after the first one. The recipe now reads:

1. Start with classical tensor densities.
2. Expand all fields in a Taylor series around a privileged curve in $d$-dimensional spacetime, and truncate the Taylor series at order $p$.
3. Introduce canonical momenta for the Taylor data, which include both the Taylor coefficients and the points on the selected curve.
4. Introduce a vacuum state which is annihilated by all negative-frequency operators.
5. Normal order to remove an infinite vacuum energy.

A $p$-jet is locally the same thing as a Taylor series truncated at order $p$, and we will use the two terms synonymously. Since the space of $p$-jets is finite-dimensional, the space of trajectories therein is spanned by finitely many functions of a single variable, which is precisely the situation where normal ordering can be done without introducing infinities coming from transverse directions.
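Schematically, the expansion in step 2 is just the ordinary Taylor formula in multi-index notation, written so as to emphasize that the dynamical data are the coefficients together with the expansion point:
\[
\phi(x) \approx \sum_{|{\bf m}| \leq p} \frac{1}{{\bf m}!}\, \phi_{,{\bf m}}(t)\, (x - q(t))^{\bf m},
\]
where ${\bf m} = (m_1, \dots, m_d)$ is a multi-index. The Taylor data of step 3 are the coefficients $\phi_{,{\bf m}}(t)$ together with the points $q^\mu(t)$ on the privileged curve.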

Locally, a vector field is of the form $\xi^\mu(x) \partial_\mu$, where $x = (x^\mu)$ is a point in $d$-dimensional space and $\partial_\mu = \partial/\partial x^\mu$ is the corresponding partial derivative. The bracket of two vector fields reads
\[
[\xi, \eta] = \xi^\mu \partial_\mu \eta^\nu \partial_\nu -
\eta^\nu \partial_\nu \xi^\mu \partial_\mu.
\]
$vect(d)$ is defined by the brackets
\[
[{\cal L}_\xi, {\cal L}_\eta] = {\cal L}_{[\xi,\eta]}.
\]
The extension $Vir(d)$ depends on two parameters $c_1$ and $c_2$, both of which reduce to the central charge in one dimension:
\[
[{\cal L}_\xi, {\cal L}_\eta] = {\cal L}_{[\xi,\eta]} +
\frac{1}{2\pi i} \oint dt\ \dot q^\rho(t) \Big(
c_1 \partial_\rho \partial_\nu \xi^\mu(q(t)) \partial_\mu \eta^\nu(q(t)) \\
\qquad\qquad + c_2 \partial_\rho \partial_\mu \xi^\mu(q(t)) \partial_\nu \eta^\nu(q(t)) \Big), \\
{[}{\cal L}_\xi, q^\mu(t)] = \xi^\mu(q(t)), \\
{[}q^\mu(t), q^\nu(t')] = 0.
\]
In particular when $d=1$, vectors only have a single component and we can ignore the spacetime indices. The extension then reduces to
\[
\frac{1}{2\pi i} \oint dt\ \dot q(t) \xi^{\prime\prime}(q(t)) \eta^\prime(q(t))
= \frac{1}{2\pi i} \oint dq\ \xi^{\prime\prime}(q) \eta^\prime(q),
\]
which has the ordinary Virasoro form. The terms proportional to $c_1$ and $c_2$ were first written down by Rao and Moody and by myself, respectively.

To carry out the construction explicitly for tensor-valued $p$-jets is quite cumbersome, but in the special case that we deal with $-1$-jets (which depend only on the expansion point, and not on any Taylor coefficients at all), formulas become manageable. Consider the Heisenberg algebra defined by the brackets
\[
[q^\mu, p_\nu] = i\delta^\mu_\nu, \qquad
[q^\mu, q^\nu] = [p_\mu, p_\nu] = 0.
\]
Clearly we can embed $vect(d)$ into the universal enveloping algebra of this Heisenberg algebra as follows:
\[
{\cal L}_\xi = i\xi^\mu(q) p_\mu.
\]
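As a quick check that this really is an embedding, note that the Heisenberg relations imply $[p_\mu, f(q)] = -i\partial_\mu f(q)$, so that
\[
[{\cal L}_\xi, {\cal L}_\eta] = -[\xi^\mu(q) p_\mu, \eta^\nu(q) p_\nu]
= i\big(\xi^\mu(q) \partial_\mu \eta^\nu(q) - \eta^\mu(q) \partial_\mu \xi^\nu(q)\big) p_\nu
= {\cal L}_{[\xi,\eta]}.
\]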
This embedding immediately yields a representation of $vect(d)$ on $C^\infty(q)$, the space of smooth functions of $q$, and also on the dual space $C^\infty(p)$. Neither of these representations is particularly interesting, however, because they do not exhibit any extension. But with a slight modification of the construction, we immediately obtain something interesting. Consider the infinite-dimensional Heisenberg algebra where everything also depends on an extra variable $t$, which lives on the circle. It is defined by the brackets
\[
[q^\mu(t), p_\nu(t')] = i\delta^\mu_\nu \delta(t-t'), \qquad
[q^\mu(t), q^\nu(t')] = [p_\mu(t), p_\nu(t')] = 0.
\]
The new embedding of $vect(d)$ reads
\[
{\cal L}_\xi = i \oint dt\ \xi^\mu(q(t)) p_\mu(t).
\]
This operator acts on the space $C^\infty[q(t)]$ of smooth functionals of $q(t)$, and, after moving the momentum operator to the left, on the dual space $C^\infty[p(t)]$. Neither of these representations yields any extension.

However, there is a more physical way to split the Heisenberg algebra into creation and annihilation operators. Since the oscillators depend on a circle parameter $t$, they can be expanded in a Fourier series, and we postulate that the components of negative frequency annihilate the vacuum state. If $p^>_\mu(t)$ and $p^<_\mu(t)$ denote the positive and negative frequency parts of $p_\mu(t)$, respectively, and analogously for $q^\mu(t)$, the state space can be identified with $C^\infty[q^>(t), p^>(t)]$. We still have a problem with an infinite vacuum energy, but this can be taken care of by normal ordering. Hence we replace the expression for ${\cal L}_\xi$ by
\[
{\cal L}_\xi = i \oint dt\ :\xi^\mu(q(t)) p_\mu(t): \\
\equiv i \oint dt\ \Big(\xi^\mu(q(t)) p^<_\mu(t) + p^>_\mu(t) \xi^\mu(q(t)) \Big).
\]
This expression satisfies $Vir(d)$ with parameters $c_1 = 1$, $c_2 = 0$.

Finally, we can make the connection between locality and observer dependence. The off-shell representations of $Vir(1)$ act on quantized fields on the circle, and may therefore be viewed as one-dimensional Quantum Field Theory (QFT). In higher dimensions $Vir(d)$ acts on quantized $p$-jet trajectories instead, so the theory deserves the name Quantum Jet Theory (QJT). $p$-jets (Taylor series) depend not only on the function being expanded but also on the choice of expansion point, i.e. the observer's position.

The following diagram summarizes the argument:

Locality => Virasoro extension => $p$-jets => observer dependence.