Security Forums

(PMC) Polymorphic Encryption

Aether
Just Arrived

Joined: 21 Aug 2003
Posts: 0

PostPosted: Mon Sep 22, 2003 3:05 am    Post subject: (PMC) Polymorphic Encryption

I was looking for some good encryption utils when I came across this.
http://english.cyprotect.com/main0110.php

PMC sounds like an interesting concept, but if it's so great, why isn't it implemented in other algorithms? Personally, I think CyProtect AG is just selling snake oil: something to make them look different (or better) than similar products.

Can anyone tell me whether PMC has any merit? If so, can anyone point me to any good algorithms?
chewiepm
Just Arrived

Joined: 05 Jul 2003
Posts: 3
Location: hellbound

PostPosted: Mon Sep 22, 2003 2:26 pm

Looks like crap, sounds like crap... probably is crap. Proprietary algos aren't worth sh*t. Just because the developer can't break it doesn't mean that it can't be broken. Big terms like "self-compiling code" are usually a good indication of snake oil.


Last edited by chewiepm on Fri Feb 18, 2011 1:50 am; edited 1 time in total
squidly
Trusted SF Member

Joined: 07 Oct 2002
Posts: 16777215
Location: Umm.. I dont know.. somewhere

PostPosted: Mon Sep 22, 2003 9:26 pm

I agree with Chewie. It looks like snake oil. Also, they don't provide anything for you to work with to verify it. I have had thoughts of cipher-jumping, but that is not like what they are talking about.
broellgen
Just Arrived

Joined: 02 Dec 2003
Posts: 0
Location: Germany

PostPosted: Tue Dec 02, 2003 6:43 pm    Post subject: (PMC) Polymorphic Encryption

Sorry, I just saw this a bit late.

I'm familiar with PMC and I'm pleased that more and more people are interested in it. Since 1999, when the method was declassified in Germany, quite a bit has happened. Besides its classic use, the algorithm is even employed to prevent reverse engineering of software that gathers real-time stock market data from a satellite.

With the years passing by and still nobody coming up with a more or less serious attack, polymorphic encryption seems to be becoming a winning model. In the meantime, other algorithms have been cracked. Here's a list:
http://www.asiertech.com/secondary/cracked.htm


The polymorphic encryption algorithm is proprietary, which of course does not improve credibility. On the other hand, we encrypt data at 5 Gbit/s on an AMD Athlon XP 1800+ processor, and this is most likely the world speed record!
Who in this world gives an important base technology away for free? This thing is 10 times faster than AES-256. That's an order of magnitude!

Here's a paper about polymorphic encryption that describes a mathematical approach to this class of ciphers. The English may be a bit clumsy, but the mathematics are free of major bugs.
http://www.pmc-ciphers.com/technology/roellgen02generalizedPMCmodel.pdf

I hope that the discussions here are a bit more profound than statements like: "Big terms like 'self-compiling code' are usually a good indication of snake oil".
The truth is that the crypto code is compiled by the algorithm. It's not much more, but also not less. The strength of a cipher has nothing to do with excessive understatement by the developers, has it?

C.B. Roellgen
cisco student
Just Arrived

Joined: 07 Sep 2003
Posts: 8
Location: SFDC USA: Chico, California

PostPosted: Tue Dec 02, 2003 11:14 pm

I have found that this is a monster encryption. 2048-bit encryption, dang, that would kill 256-bit. I will try encrypting my files with polymorphic encryption.
flw
Forum Fanatic

Joined: 27 May 2002
Posts: 16777215
Location: U.S.A.

PostPosted: Wed Dec 03, 2003 3:15 am

I noticed that the site noted doesn't list the old man of crypto, 3DES, or the new kid, AES.

Also, it's not just the bit size that makes crypto easily breakable or very difficult. Boiling it down to a single factor like bit size is an oversimplification and shows a lack of understanding of encryption. That's what has made 3DES last as long as it has. Layers.......
JustinT
Trusted SF Member

Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil

PostPosted: Wed Dec 03, 2003 8:43 am    Post subject: My take on PMC methodology.

If you are indeed C.B. Roellgen, then I would hope that you are familiar with the PMC methodology, being the author of it. Unfortunately, I can't say that for myself, or most people, as there is a tremendous lack of mathematical theory, in regards to presentation and what is available to the public analyst. It is a sad facet of this scheme, as being proprietary will likely render it short of obtaining any real merit of trust and confidence among the community.

The only concrete conclusion that can be made is the fact that the author's claims are hereby pointless, essentially, because there is no proof of formal analysis, therefore no proof of any resistance to any given attack model. I'm not being rude here, by using the word "pointless." I'm being factual.

However, even though I can't supply any analysis on the algorithm itself, to any valuable extent, because of the above statement, I can, on the other hand, supply my concerns as to the validity of the author's knowledge of proper cryptanalytic semantics. Aside from failing to produce any legible mathematical breakdown of the algorithm class itself, there are certain statements in the provided .pdf file that are truly bothersome.

Pseudo-Analysis Via Opinion

In the "Design Goal" section:

"Create a keystream generator that is nearly as powerful as a one time pad" - theoretically, for the One-Time-Pad method, "nearly as" isn't good enough. This holds true in comparison because the OTP is a theoretical stab at perfect secrecy. It is either provably secure in its entirety or not. Anything short is but a mere stream cipher, in practice. To strive for a secure PRNG, cryptographically speaking, is quite alright. It's not entirely accurate to compare this with an OTP scheme. This is just an opinion and one of the less-bothersome aspects of the paper.

Now, let's move on to statements and other figments that worry me
much more.

1. The author claims that differential analysis "is a kind of attack which can only be applied to block ciphers". I strongly disagree. From experience in analysis, it can be applied to stream ciphers. This worries me because the author clearly states that PMC has its "roots" in stream cipher design, yet isn't aware that differential cryptanalysis is a possible threat. Therefore, this is reason enough to believe that the author did not take this into account when designing the system. What also makes no sense is that the author states that some PMC ciphers may be a mixture of both stream and block ciphers, thus making it logical to assume that differential cryptanalysis is an analyst's option, for PMC. Having said this, I question the cryptographic competence of Mr. Roellgen, myself.

2. The purpose of an algorithm white paper is not to elaborate on basic cryptanalysis or point out common cryptography-knowledge aspects of other related miscellany. The purpose is to elaborate on the algorithm itself, which I do not see enough of, in the paper. An analyst will already know all of the types of attack methodologies, so there isn't a need to reiterate this, in my opinion. You left out certain attacks, such as adaptive-chosen-plaintext attacks and rubber-hose cryptanalysis, on top of that. The primary concern here is that none of the claims to immunity carry substantiated proof.

3. There is no uniform rationale presented in the paper. It appears as a mixture of various aspects of cryptography which in turn do not relate to PMC in a way that would juxtapose the security of this new design with other existing algorithms. This is largely due to the lack of information on the PMC structure itself, in the sense of mathematics. The section devoted to the "security" of PMC has only a spoonful of lines, which I will quote:

"The absence of a detailed mathematical model for the presented cipher makes cryptanalysis a difficult task. As the cipher has as many facets as the number of different password combinations and as the set of primitive PRNGs depends on the actual implementation of the crypto compiler, a more general point of view is appropriate."

This alone negates the entire purpose of Kerckhoffs' assumption, therefore negating the reason to even mention any immunity to cryptanalysis. There is no excuse for not presenting a legible set of semantics that properly outline every mathematical detail of the PMC class. This is either because you aren't capable of producing such a paper, or do not agree with Kerckhoffs' criterion, which is a reasonable assumption in itself. There is no credibility or sense of merit, if you believe that concealing the guts of an algorithm is a service to the welfare of proposed cryptographic primitives. I will, however, thank you for making it clear that there is an inherent difficulty of analysis.

Various Other Worries

In light of DPA attack methodology, there is no mathematical proof given to suggest its immunity to side-channel attacks. Contrary to whatever one's opposing view may be, designing an algorithm to be resilient in the face of side-channel attack methodology is a rather tricky matter. Very difficult, to say the least. With a lack of detailed analysis in regards to this, I find it hard to trust. There was also no mention of timing attacks or RF emissions. Any reasons why?

I find it equally hard to trust the proposed new mode of operation (CRM, or Cipher Restructuring/Recompilation Mode), which lacks details in presentation. Because there was no extensive discussion on these areas, I worry about the author's knowledge of them.

The Approach of An Amateur

My three primary arguments of your paper include:

- Lack of cryptanalytic knowledge portrayal
- Lack of mathematical proofs and concepts
- Lack of rationale

In all honesty, I believe this is an attempt by someone who wishes to follow proper design etiquette, yet lacks the cryptographic knowledge necessary to substantiate claims. While there is nothing wrong with designing an algorithm, I do find it amateur-ish to make such claims as:

"By combining speed, strength and adaptability, PMC Ciphers is able to provide the best ciphers on the market." - but wait, shouldn't there be a reason to actually believe this? ...like public analysis?

There are already many ciphers which combine speed, strength, and adaptability, which are better. What defines "best" in this case? Easy.

They are all peer-reviewed. These are the best ciphers. I am not against this methodology, because I am competent enough as a cryptographer to accept the possibility of new, secure designs. However, I am also competent enough to accept this only from reputable sources which put extra emphasis on presentation.

I am glad to see the author agrees that there is no improvement of credibility by using a proprietary algorithm. Rest assured that the supposed speed will not make up for this.

The Link on "Cracked Ciphers"

Also, I believe this to be the attempt of a cryptographic neophyte, because of the link provided which discusses "cracked" algorithms. The link isn't too appealing because it provides a list of weak implementations, for the most part. The majority of the ciphers it does include were attacked by means of brute force, due to inadequate key lengths, factorization, or deduction of an ECC key. This is more or less a display of computational power, in regards to exhaustion, and holds little relevance to this matter. I am not so sure as to why this link would have been recommended, as it does a poor job of describing attacks. By just looking at the page, you would see that PGP is listed as a technology of 56 bits, yet the given link has no occurrence of "56 bits" or why it is significant. This is an obvious fault. PGP isn't 56 bits. PGP isn't broken.

It may use a 56-bit keyed algorithm which may be broken, but that has no relevance to the link. They make it sound as if RC4 is broken (WEP), and that ECC is an inferior method, without fully grasping the fact that most of these are implementation vulnerabilities, rather than based on the algorithms themselves, aside from brute force, factorization, or any other method that relies on the length of a cryptographic key or relative component. Again, this paragraph displays the lack of rationale and/or cryptanalytic knowledge. There's no need to dive into this pool of irrelevance any deeper.

What I Gather From the Design

By looking at the diagram, it appears that the technique is similar, in partial concept, to that of Algorithm M, where there is a cascading of multiple PRNGs. In fact, it seems that this quantitative approach is the only secure manner in which PMC behaves. I notice that you use linear congruential generators, additive congruential generators, multiply-with-carry generators, et cetera. Keep in mind that linear congruential generators and additive generators are not secure by themselves. In fact, LCGs can be trivially broken, as can additive generators. Generators of the following forms, including LCGs, can also be broken:

X_n = (a*X_{n-1} + b) mod m

X_n = (a*X_{n-1}^2 + b*X_{n-1} + c) mod m

X_n = (a*X_{n-1}^3 + b*X_{n-1}^2 + c*X_{n-1} + d) mod m

This is just to name a few.
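To make that concrete, here is a small sketch (mine, not from the PMC paper; the constants are the well-known Numerical Recipes pair) showing why such a generator is useless on its own: once a, b and m are public, one observed output determines the entire future stream.

#include <stdio.h>
#include <stdint.h>

/* Minimal LCG X_n = (a*X_{n-1} + b) mod 2^32: full period, zero security. */
static uint32_t lcg_next(uint32_t x)
{
    return 1664525u * x + 1013904223u;   /* reduction mod 2^32 is implicit */
}

int main(void)
{
    uint32_t seed = 0x00C0FFEE;              /* secret, unknown to the attacker */
    uint32_t observed = lcg_next(seed);      /* one leaked keystream word       */

    /* With a, b and m public (Kerckhoffs), the attacker simply continues the
       recurrence from the observed word and predicts the whole future stream. */
    uint32_t predicted = observed;
    for (int i = 0; i < 5; i++) {
        predicted = lcg_next(predicted);
        printf("predicted word %d: 0x%08X\n", i + 1, (unsigned)predicted);
    }
    return 0;
}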

Be exceptionally careful when choosing a generator. Stream cipher design is risky business and requires much more mathematical theory than that of block cipher design. One false myth you should be aware of concerns combining linear congruential generators. Although the combination may appear to perform better on tests of entropy, it does not improve the scenario, cryptographically. In other words, you aren't gaining security. It is your responsibility to study the underlying theory in depth, and to pay attention to the behavior of multiple PRNG usage. Otherwise, alone, the bulk of the algorithm's components are useless. Even then, LCGs aren't suitable for cryptography. I like to view stream cipher design in four categories, just like that of Rueppel. It may aid in organization of thought, to class your approach as one of the following theoretical paths:

- System
- Information
- Complexity
- Randomized

System - With this path, you want to define a cumbersome predicament, based on some principle set of algorithm design methodology.

Information - Keep the plaintext away from the analyst. You don't want him or her finding an answer that turns out to be unique, regardless of how involved their analysis may be.

Complexity - Simply having the foundation's base within that of a known problem, such as the difficulty of factorization, for example.

Randomized - Throw off the analyst with a lot of mess, to thwart or lengthen analysis.

Also, there are certain suggested (but not required) components that make up design criteria, such as:

- Confusion
- Diffusion
- Long, non-repeating periods
- Non-linearity
- Linear complexity
- Et cetera.

Can you justify PMC, and/or its security, based on the above techniques? Is there any defined path at all, for your algorithm's structure? If so, as you claim to not be able to provide a defined, static mathematical overview of PMC, can you be sure that PMC will render the same level of security upon each unique manifestation? I am curious as to your motive for choosing this setup. I can provide you with much better configurations which would provide much more security than that of LCG-inclusion.

Any further attempts at analysis would be speculation, as I have nothing more to work with. Perhaps you could provide a better presentation of this methodology, for the sake of us cryptanalysts. That would be good. In fact, that would be your first step in becoming a respected cryptographer. Down with the heathen obscurity, I say.

Any comments?

Conclusion

PMC may or may not be a secure design. It exhibits presentation by that of an amateur to cryptography, through the lack of portrayed cryptanalytic know-how. I completely shun the idea of the attempt to market a proprietary algorithm. It is an ignorant notion. There is no appeal in security via algorithm obscurity. If the author himself cannot distinguish between where differential cryptanalysis is and is not applicable, I will not use, or condone the use of, such a primitive. As I will always swear by, you must know insecurity first, before you can achieve security.

There needs to be detailed design criteria, rationale, basis, et cetera, when designing an algorithm, accompanied by legible analysis. Otherwise, there is no means of forming trust or confidence in the algorithm. This is more than just a pragmatic argument - this is sound advice. Sound advice from a fellow cryptographer.

Remember, proprietary, closed-source algorithms are the fine lines between the commercially-blind community and open cryptographic community. In other words, you will gain much more knowledge, credible analysis, and respect, by keeping it open and shared. It is as simple as following a little proposal etiquette. Properly present the algorithm and voila, analysis. With analysis comes the possibility of trust and confidence - the cornerstones of any wholesome algorithm. There is no market for algorithms. You will learn nothing of true value if you don't share it with those who are seasoned in the area of cryptanalysis.

Peer-review. Don't design algorithms without it. ...and for Shannon's sake, (crypto pun, there) don't propose something as secure, if you can't reassure the community of this. Trust me, we won't take your word for it. Only a fool would. This is a general tactic. Nothing personal.

Cheers, Mr. Roellgen.


Last edited by JustinT on Wed Jan 26, 2005 2:05 pm; edited 7 times in total
chewiepm
Just Arrived

Joined: 05 Jul 2003
Posts: 3
Location: hellbound

PostPosted: Wed Dec 03, 2003 12:28 pm

Unfortunately, we are all just people behind names. For example, chewie is not my real name. However, there is a man out there on the internet who is who he says he is.

I thought Bruce Almighty (Schneier) had something to say on the subject of PMC. We may not be world-renowned experts, even though Justin does know his stuff, but Bruce also knows his stuff and is a world-renowned expert.

Here is a reprint of his cryptogram newsletter

http://archives.neohapsis.com/archives/crypto/2003-q1/0004.html

He doesn't think much of PMC either.


Last edited by chewiepm on Sat Feb 19, 2011 5:00 am; edited 1 time in total
broellgen
Just Arrived

Joined: 02 Dec 2003
Posts: 0
Location: Germany

PostPosted: Wed Dec 03, 2003 8:20 pm    Post subject: Polymorphic Encryption

Regarding Mr. Schneier: he seems not to be that almighty. He didn't say more than the usual, probably to get rid of possible competition. He also didn't react to this e-mail sent to schneier@counterpane.com on March 18, 2003:

broellgen wrote:

Dear Mr. Schneier,

my name is Clemens Bernhard Roellgen and I invented the so-called "Polymorphic Cipher" in 1999. I can understand your comments on the publicly available documentation of it. Your comments certainly have a negative effect on current and future projects.

It would have been a great pleasure to have had an e-mail conversation with you earlier. Anyway, it would be a great honor to have a technical discussion with one of the greatest crypto experts of our time.

It is understandable that your newsletter should warn of snake oil. But what happens if hasty conclusions are made by accident? We all can be wrong sometimes.

What if the proposed cipher is as secure as ciphers like Rijndael or Twofish? It might even be substantially faster, or DPA-proof, or have other positive and negative features.

Shortly after I patented the idea in 1999, the German authorities wanted to make the cipher a state secret, but dropped that attempt after two months. Later I found out (partly with the help of a news editor) that the proper experts had never been asked! They didn't say whether, in the end, it was right or wrong not to make the patent a state secret.

Then we asked Prof. Dieter Bartmann from the Institute for Bank Computer Science and Bank Strategy at the University of Regensburg (the name of the department has been translated from German as well as possible) for his judgement. It was positive.

Dr. Schirle from IZB Soft in Munich, a mathematician, found some errors in our documentation, but also came to a positive result. IZB Soft is a comparatively big IT security company which works entirely for banks.

I don’t think that you will ever have the chance to revise your judgement, but isn’t it a probably simple but good idea to be able to choose from two different ciphers like Rijndael and Twofish with one password bit? Maybe the software can make the choice dependent on two password bits. The choice could be one out of these 4 ciphers: Rijndael, Serpent, Mars and Twofish.

By doing this, the password can be as long as 258 bits. The two “cipher select” bits don’t play a role during the actual encryption and decryption process any more. They don’t consume CPU time after the choice has been made. Such an implementation is just as fast as the average of all four worker ciphers when encrypting a big amount of data, but comes with two additional password bits.

A brute force attack on this kind of “cipher” takes 4 times as long as a brute force attack on just one of the worker ciphers (Rijndael, Serpent, etc., alone).

If (by very hard work and much more than luck) a method is found to crack one of the 4 worker ciphers, then the other three are still likely to be secure.

If the number of available worker ciphers isn't limited to just 4, wouldn't it be tedious and hopeless to try and crack each of the available worker ciphers? Think of 128 such worker ciphers, which could be selected by 7 additional password bits. Cracking one of these is probably impossible, but cracking a number of them, if they are of the calibre of Twofish, is definitely hopeless.

This is (basic) Polymorphic Encryption. It’s not much more, but it’s also not much less.

We would be pleased if you could have a less superficial look at our cipher. I would send you source code, as well as a number of papers which describe the source code in detail. But as not all aspects of the cipher are patented so far, we would kindly ask you to sign an NDA. If you are still negative after having another look at the concept, you can publish this of course, as long as the information which is sent out cannot be used by competitors to create a similar cipher or ideas can be “borrowed”.

Hope to hear from you soon!

Best regards,

Bernd Roellgen
PMC Ciphers, Inc.
Josephsburgstr. 85
81673 Muenchen
Germany

From your newsletter:

PMC Ciphers. The theory description is so filled with pseudo-cryptography that it's funny to read. Hypotheses are presented as conclusions. Current research is misstated or ignored. The first link is a technical paper with four references, three of them written before 1975. Who needs thirty years of cryptographic research when you have polymorphic cipher theory?
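(As a concrete sketch of the "cipher select bits" idea from the e-mail quoted above: the four worker-cipher routines below are empty stand-ins, not real implementations, and the key split is purely illustrative.)

#include <stdio.h>
#include <stdint.h>

/* Stand-in worker ciphers -- placeholders only, NOT real implementations. */
static void enc_rijndael(uint8_t *blk) { blk[0] ^= 0x01; }
static void enc_serpent (uint8_t *blk) { blk[0] ^= 0x02; }
static void enc_mars    (uint8_t *blk) { blk[0] ^= 0x03; }
static void enc_twofish (uint8_t *blk) { blk[0] ^= 0x04; }

typedef void (*cipher_fn)(uint8_t *block);

int main(void)
{
    /* A 258-bit password = 256 bits of key material + 2 "cipher select" bits. */
    unsigned select_bits = 2u;                /* the two extra password bits */
    cipher_fn workers[4] = { enc_rijndael, enc_serpent, enc_mars, enc_twofish };

    /* The choice is made once; afterwards encryption runs at the speed of the
       selected worker cipher, as the e-mail describes.                       */
    cipher_fn chosen = workers[select_bits & 3u];

    uint8_t block[16] = {0};
    chosen(block);
    printf("worker %u selected, first block byte is now 0x%02X\n",
           select_bits & 3u, block[0]);
    return 0;
}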



JustinT, though, is of a totally different calibre: he gives reasons for his statements and conclusions. It's just a bit too much for one day. As I can still, without doubt, gain cryptographic competence, I feel free to cite basics to underline my thoughts.

JustinT wrote:

To 1. The author claims that differential analysis "is a kind of
attack which can only be applied to block ciphers":


Biham & Shamir wrote in 1991, in their paper "Differential Cryptanalysis of DES-like Cryptosystems":

Differential cryptanalysis is a method which analyzes the effect of
particular differences in plaintext pairs on the differences of the
resultant ciphertext pairs. These differences can be used to assign
probabilities to the possible keys and to locate the most probable key.

Biham & Shamir showed how to use differential cryptanalysis to crack DES
reduced to 4 .. 8 rounds. They exploited the fact that not all the
output XORs are possible and that some XORed values appear much more
frequently than others.

Page 12: "In DES any S box has 64 * 64 possible input pairs, and each one of
them has an input XOR and an output XOR. There are only 64 * 16 possible
tuples of input and output XORs. Therefore, each tuple results in average
from four pairs. However, not all the tuples exist as a result of a pair,
and the existing ones do not have a uniform distribution."

Is PMC vulnerable to Differential cryptanalysis?

PMC consists of a compiler that places a number of pseudo-random number
generator primitives one after the other. The compiler modifies each random
number generator primitive in its functionality. By doing this, the compiled code
is more than just a number of concatenated uniform primitives. Input to the
compiler is solely the key. The number of equally probable compiled PRNGs is
2^keysize. The stage that turns plaintext into ciphertext is ideally another RNG
that is as flexible as possible and that is biased by the first PRNG.

Differential cryptanalysis tries to find a relationship between different ciphertext
pairs from different plaintext pairs and link them with a probable key or a set of
probable keys. Each compiled PRNG with the corresponding key has its own
characteristic and its weakness. Assuming that the compiled PRNG generates a
pseudorandom bit stream that is XORed with plaintext bits, differences in ciphertext
pairs map directly onto differences in plaintext bits. So there's already one decisive
premise missing: plaintext and ciphertext map directly onto each other through a very
simple function (XOR).

Can the key be revealed through other means?

2^keysize different compiled PRNGs exist. With a codebook, it would be easy to link the
confusion bit stream with the corresponding key. Codebook attacks are effectively prevented
by choosing long (effective) keys. Breaking PMC would be possible if there exists a method that
relates a key to the confusion bit stream that is generated by the (selected) compiled PRNG.
Possibly there exist certain groups of keys that generate similar compiled PRNGs with similarities
in their confusion bit streams. This could be exploited.

JustinT wrote:

2. ".. The purpose is to elaborate on the algorithm itself, which I do not see enough of, in the paper."


Right.

Adaptive-chosen-plaintext attack: In an adaptive chosen plaintext attack the attacker can choose the next plaintext to be encrypted after analyzing the results of previous chosen plaintext encryptions. In this attack the cryptanalyst must determine the key. This type of attack is similar to the chosen plaintext attack. With the compiled PRNG used in a basic stream cipher configuration (like above) it's easy to get the confusion sequence. There's not even a need to change the plaintext, as there is no interaction with the key (unlike with Feistel structures).
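(To spell out why "it's easy to get the confusion sequence" in that configuration, here is a tiny known-plaintext sketch; the keystream bytes are made up purely for the demonstration.)

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* For any XOR-based stream cipher, known plaintext yields the keystream:
   keystream = ciphertext XOR plaintext.  All byte values here are arbitrary. */
int main(void)
{
    const uint8_t plaintext[8] = { 'K','N','O','W','N',' ','P','T' };
    const uint8_t keystream[8] = { 0x3A,0x91,0x5C,0x07,0xE2,0x44,0xB8,0x6D };
    uint8_t ciphertext[8], recovered[8];

    for (int i = 0; i < 8; i++)
        ciphertext[i] = plaintext[i] ^ keystream[i];   /* what the attacker intercepts     */

    for (int i = 0; i < 8; i++)
        recovered[i] = ciphertext[i] ^ plaintext[i];   /* the confusion sequence falls out */

    printf("keystream recovered intact: %s\n",
           memcmp(recovered, keystream, sizeof keystream) == 0 ? "yes" : "no");
    return 0;
}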

Rubber-hose cryptanalysis:
[sci.crypt newsgroup] The technique of breaking a code or cipher by finding someone who has the
key and applying a rubber hose vigorously and repeatedly to the soles of that luckless person's feet
until the key is discovered. Shorthand for any method of coercion: the originator of the term drily
noted that it “can take a surprisingly short time and is quite computationally inexpensive”
relative to other cryptanalysis methods.

Are you kidding? Only people with Alzheimer's disease have a chance...

JustinT wrote:

3. ".. This alone negates the entire purpose of Kerckhoff's assumption,
therefore negating the reason to even mention any immunity to
cryptanalysis."


Kerckhoffs' assumption: the algorithms are never secret -- only the key is secret.

The same as for Bruce Schneier applies: ".. I would send you source code, as well as a number of
papers which describe the source code in detail. But as not all aspects of the cipher are patented so
far, we would kindly ask you to sign an NDA. If you are still negative after having another look at the
concept, you can publish this of course, as long as the information which is sent out cannot be used
by competitors to create a similar cipher or ideas can be “borrowed”. .."

So far Mr. Schneier has not been interested in more information than is readily available on our web site.


To "Various Other Worries":

- DPA immunity: Correct, detailed information about the actual implementation is required.

- The Approach of An Amateur: The last big leap in cryptography seems to have taken place in 1976 and 1977 when Diffie-Hellman and RSA were published. If a clever housewife had written these papers, we would probably not even know about these great algorithms.

- The Link on "Cracked Ciphers": The link lacks quality, but nevertheless most of it seems to be true.
I don't want to know how many banking systems e.g. still rely on DES. One bank guy told me in 2000:
"We are only allowed to use certified algorithms. DES is certified."

OK. Since 1999, the average time to crack a DES-encrypted message has been 22 1/4 hours. Only in 2001 did AES become an official standard. Consequently the industry had to rely for two years on something that is insecure. Oops!

- About LCGs: Correct. The paper lacks design specifics.

Quote:

".. If so, as you claim to not be able to provide a defined, static mathematical overview of PMC, can you
be sure that PMC will render the same level of security upon each unique manifestation? .."


It's not appropriate to describe a light bulb with "wick", "wax" and "fire". The nature of PMC is that it's anything but static, except for the crypto compiler. Consequently it's natural that it can only be described using different techniques. I tried to use entropy.

- Peer-review: JustinT, you're invited.


Most people here place some kind of slogan at the end.

-------------

128 bits are definitely secure, but the algorithm runs faster at 512 bits. It's nice to live with 512 bits, but not with a slow cipher.

C.B. Roellgen

Formatted by JustinT, for clarity's sake.
JustinT
Trusted SF Member

Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil

PostPosted: Thu Dec 04, 2003 2:19 am    Post subject: Further thoughts on Polymorphic Encryption.

I value the reply.

As for Schneier's demeanor in this situation - I can relate. In fact, I can agree with his view. Your presentation does emit a snake-oil odor, as well as being pseudo-cryptographic in what little detail there is. To a cryptanalyst, this doesn't make for an appealing read. There should be more signs of reference, to signify thorough research into the field which represents the basis for your design methodology. This isn't taken lightly by the community. Your intentions may be sound and well, but you certainly need to brush up on the proposal aspect of it.

I'm not trying to step on your toes here. I'm just being honest. I would be lying if I didn't say that your know-how and design choices are fundamentally flawed. This is nothing that can't be corrected, fortunately.

I don't view the core of your methodology as snake-oil, but here is what I do see. I see an amateur who is working with pre-existing theory, yet is unaware of the structure's security and the proper etiquette for proposing an algorithm. Algorithm proposal is not based upon "what if <insert_algorithm_here> is secure against this attack or that attack." It is based upon the analytical criteria in which an algorithm is designed to resist. You should never begin the proposal until you have finished the preliminary analysis. If you must rely on the public for the basic analysis in which defines what the algorithm is truly resilient to and where it falls short, then you should not be proposing an algorithm at all. More so, to do this commercially is a rather selfish action towards the cryptographic community.

There are really no ifs ands or buts about it. To exhibit the:

- Lack of cryptanalytic knowledge portrayal
- Lack of mathematical proofs and concepts
- Lack of rationale

Again, to exhibit the above is to reek of amateurishness. There is nothing wrong with being an amateur in algorithm design, but on the other hand, one should always be professional in their presentation. I'm not quite sure what Diffie-Hellman, RSA, or a clever housewife has to do with this. ;)

In light of adaptive-chosen-plaintext attacks and rubber-hose cryptanalysis, my only reason for mentioning those two is because you felt it necessary to list possible attacks on ciphers. If you are going to list 75% of them, why not list the other 25%? Either way, without analysis and mathematical proof, to claim immunity to any of them is invalid.

As for my statement on Kerckhoffs' assumption: I am well-versed in this principle, which leads to my implication that you do not follow the assumption, as your algorithm is nearly as obscure as the key. This alone negates the entire proposal. Not only is it a contradiction, it is the hindrance of any analysis you could receive, otherwise.

The summary of Biham and Shamir's article that you have provided, in regards to differential cryptanalysis, isn't relevant to my point. My point is that you do not seem to be aware that this analysis doesn't just apply to block ciphers. It has been used numerous times, while performing analysis on stream ciphers. Because you aren't aware of this, there is a high probability that you have failed to notice an attack of this nature.

Now, allow me to do a short breakdown of the first component of the PMC primitive - the LCG, or linear congruential generator. As you should already be familiar with, a generator of this type has a form of:

X_n = (a*X_{n-1} + b) mod m

Now, to define each of the variables, a, b, and m, as constants:

- a is a multiplier
- b is an increment
- m is a modulus

The seed value is simply X0.

The generator's period can be less than or equal to m. To ensure that you achieve a period of m, you must choose the constants carefully. If this is done successfully, the result will be a maximal period. One instance is the requirement that b be relatively prime to m. Knuth and L'Ecuyer have both written extensively on proper techniques to ensure maximal length. Choosing good constants is a vital aspect of deploying a linear congruential generator. It's important to pay attention to certain properties, including how well the constants hold up in certain randomness tests.
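For what it's worth, a tiny self-check of the full-period (Hull-Dobell) conditions for the common case m = 2^32 looks like this; the constants are again the Numerical Recipes pair, used only as an example:

#include <stdio.h>
#include <stdint.h>

/* Hull-Dobell full-period test, specialised to m = 2^32:
   (1) the increment b must be relatively prime to m, i.e. odd, and
   (2) a - 1 must be divisible by every prime factor of m (here only 2),
       and additionally by 4 because 4 divides m.                         */
static int full_period_mod_2_32(uint32_t a, uint32_t b)
{
    return (b & 1u) && ((a - 1u) % 4u == 0u);
}

int main(void)
{
    printf("a=1664525,  b=1013904223 : %s\n",
           full_period_mod_2_32(1664525u, 1013904223u) ? "period 2^32" : "short period");
    printf("a=1664526,  b=1013904224 : %s\n",
           full_period_mod_2_32(1664526u, 1013904224u) ? "period 2^32" : "short period");
    return 0;
}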

Sure, these generators are very fast, but on the other hand, very broken. For that matter, any polynomial-based congruential generator is broken. If you study any of Boyar's works, you will find information on the bulk of this.

You state that PMC is anything but static, therefore making it reasonable to assume that PMC is dynamic. Because of this, the constants are likely to be dynamic as well. How do you ensure that upon each dynamic use of PMC, every instance of the linear congruential generator will contain a maximal period? If you can't, then you could end up with much shorter, repeating periods, as opposed to one that is much longer and not as repetitive. As I have mentioned, there is no cryptographic strength in using these generators. If you can spare the decrease in software performance, a well-thought-out LFSR approach would render much more security.

You have admitted that each component has its weakness, yet I have seen no elaborated analysis on these weaknesses. All in all, using a cascade of components doesn't guarantee increased security. Using components that are weak individually, such as linear congruential generators, is not always a smart idea - be it combined with other elements or not. In fact, interactions between multiple entities can actually cause a decrease in security.

Just to jolt your knowledge a bit, can you tell me why this simple additive generator is maximal length? Or maybe it isn't. You tell me. Just be honest. There is no shame in learning something new. There is only shame in pretending to understand something you don't. That will greatly reduce your chances of producing anything useful within your cryptographic reach.

X_i = (X_{i-57} + X_{i-7}) mod 2^n

Recognizing this property (or lack thereof) of an additive generator (often referred to as a lagged Fibonacci generator) is vital in understanding the level of security you have when deploying it as a block in the overall construction of a secure generator.
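For concreteness, here is one minimal, trivially seeded rendering of that additive generator in C (the seeding and output loop are for illustration only; a real deployment would seed the 57 words from key material):

#include <stdio.h>
#include <stdint.h>

/* Additive lagged Fibonacci generator X_i = (X_{i-57} + X_{i-7}) mod 2^32,
   held in a 57-word circular buffer. */
#define LAG_LONG  57
#define LAG_SHORT  7

static uint32_t state[LAG_LONG];
static int pos = 0;

static void lfg_seed(uint32_t s)
{
    for (int i = 0; i < LAG_LONG; i++)
        state[i] = s = 1664525u * s + 1013904223u;  /* quick LCG fill, demo only */
    state[0] |= 1u;        /* at least one odd word is needed for a long period */
}

static uint32_t lfg_next(void)
{
    int j = (pos + LAG_LONG - LAG_SHORT) % LAG_LONG;  /* position of X_{i-7} */
    state[pos] += state[j];   /* X_{i-57} + X_{i-7}, reduced mod 2^32 implicitly */
    uint32_t out = state[pos];
    pos = (pos + 1) % LAG_LONG;
    return out;
}

int main(void)
{
    lfg_seed(0x12345678u);
    for (int i = 0; i < 4; i++)
        printf("0x%08X\n", (unsigned)lfg_next());
    return 0;
}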

Here's an excerpt from my similar post, regarding multiple block cipher usage. This is a bit like your "one after another" approach.

JustinT wrote:

In terms of cascading encryption, there are pros and cons. Security can be decreased as easily as it can be increased. If all the keys are independent of one another, then the cascade is at least as difficult to break as the first algorithm. Suppose the second algorithm is subject to a chosen-plaintext attack. The first algorithm could initiate this attack, making the second algorithm subject to a known-plaintext attack, now, when used in this cascading scheme. With a chosen-plaintext attack, the cascade is shown to be at least as strong as any of the component ciphers. It is also assumed by some that it is at least as strong as the strongest cipher. With stream ciphers and block ciphers in OFB mode, the algorithms commute, therefore, the cascade is, in fact, as strong as the strongest algorithm.


For the entire post, refer to this link.

To return to the question of PMC being static, consider the security issue at hand. My questioning of the level of security upon each unique manifestation is important. Because it is dynamic, and because each generator requires carefully chosen elements in order to achieve maximal periods, if this aspect isn't carefully monitored and controlled, then one instance of the algorithm may be significantly weaker than another, and possibly fatal to data that was encrypted within that particular instance.

I said it before and I'll say it again - there isn't enough information to go on, for worthwhile analysis to take place. If you wish to receive detailed analysis and supporting mathematical theory, please take the time to prepare an extensive white paper on the matter, accompanied by legible source code and such. If you feel comfortable with providing me with this detailed information, via NDA, feel free to contact me, as I am capable of providing consultation and a fresh insight into relative aspects of number theory.

That sums it up for now. I look forward to learning more of this scheme, if at all possible. Until that time, or your next reply, good luck, Godspeed, and adios.


Last edited by JustinT on Sun Sep 05, 2004 3:54 am; edited 2 times in total
broellgen
Just Arrived

Joined: 02 Dec 2003
Posts: 0
Location: Germany

PostPosted: Thu Dec 04, 2003 8:54 pm    Post subject: Polymorphic Cipher

Thank you for not viewing the core of the polymorphic encryption methodology as snake oil.
Maybe I can even show one of the most decisive advantages of compiled cryptographic machine code. For this, your instructive example is great:

For the Lagged Fibonacci Generator, the period should be (2^57-1)*2^31. I don't know if F(57,7,+) has been thoroughly tested to pass each and every randomness test, but that's probably possible to find out.

Thank you for giving me such a brilliant example to outline one of the most important design principles of the compiled PRNG. The expression X_i = (X_{i-p} + X_{i-q}) mod 2^n is much nicer when it is compiled using selected or even arbitrary values of p and q, and with the binary arithmetic operator chosen from one of these: addition, subtraction, xor and multiplication. Modern microprocessors have 'MOV dest,source' instructions with fixed offsets (e.g. MOV eax,[edi+7]), and the crypto compiler is free to choose that offset. A 256-bit internal state aligned at 32-bit boundaries has a depth of 8 DWORDs. The expressions X_i = (X_{i-1} + X_{i-2}) mod 2^n and X_i = (X_{i-3} * X_{i-8}) mod 2^n are just two out of 3*7*6 = 126 possible, but not equally good, Lagged Fibonacci Generators. Some will have a short period. Limiting p and q to values greater than 2 probably makes F(p,q,o) very suitable as a primitive PRNG for a polymorphic cipher.

All that is needed is some divergent thinking, to break out of "stuff that has been chewed over again and again for decades". A crypto compiler could for example compile Xp[u] = (Xp[r] + Xp[s] - Xp[t]) mod 2^n, with all Xp being variable themselves.
This probably makes things a bit too difficult to analyse, but that's a challenge for a cryptanalyst, isn't it?
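(A purely illustrative sketch of that idea, not PMC's actual crypto compiler: the lags and the binary operator of a small lagged generator are picked from key bits; the ranges, the seeding and the output loop are arbitrary demonstration choices.)

#include <stdio.h>
#include <stdint.h>

/* Generalized lagged generator X_i = (X_{i-p} op X_{i-q}) mod 2^32 whose
   lags p, q and binary operator are selected from key bits (demo only).  */
#define DEPTH 8                       /* 256-bit internal state = 8 DWORDs */

static uint32_t X[DEPTH];

static uint32_t step(unsigned p, unsigned q, unsigned op, unsigned i)
{
    uint32_t a = X[(i - p) & (DEPTH - 1)];
    uint32_t b = X[(i - q) & (DEPTH - 1)];
    uint32_t r;
    switch (op & 3u) {
        case 0:  r = a + b;        break;
        case 1:  r = a - b;        break;
        case 2:  r = a ^ b;        break;
        default: r = (a * b) | 1u; break;   /* force the product odd so the
                                               state cannot collapse to zero */
    }
    return X[i & (DEPTH - 1)] = r;
}

int main(void)
{
    uint32_t key = 0xB7E15163u;               /* stand-in for key material   */
    unsigned p  = 3u + (key & 3u);            /* lag p in 3..6               */
    unsigned q  = 1u + ((key >> 2) & 1u);     /* lag q in 1..2, so q != p    */
    unsigned op = (key >> 3) & 3u;            /* one of +, -, xor, *         */

    for (unsigned i = 0; i < DEPTH; i++)      /* trivial seeding, demo only  */
        X[i] = key += 0x9E3779B9u;

    for (unsigned i = 0; i < 6; i++)
        printf("0x%08X\n", (unsigned)step(p, q, op, i));
    return 0;
}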

But things can become even more complex:
Again a little "breakout" from standard theory could be of use:
LCGs are weak. You have already outlined this repeatedly. They contain no internal state at all. That's the primary reason for their weakness.

X_n = (a*X_{n-1} + b) mod m

a and b may be chosen so as to have integer overflow on nearly every step. The sequence will be less predictable. a and b must be relatively prime numbers. For fans of "static" implementations, a = 16807, b = 0, m = 2147483647 seem to be good choices.

Touching up the LCG may yield this: Xp[t] = (a*Xp[r] + b) mod m.
Even better: Xp[t] = (a[s]*Xp[r] + b[t]) mod m, where a[] and b[] are tables with more or less carefully selected prime numbers.

It's not easy to recognize an LCG in these expressions. In fact there is not much that resembles the commonly known LCG.

A compiled PRNG consisting of just 3 primitives may be the concatenation of two modified LCGs and a modified ACG (I only have to copy-paste my examples):

Xp[s] = (a[u]*Xp[r] + b[s]) mod m;
s = (Xp[s] + Xp[t]) & 7;
Xp[r] = (a[s]*Xp[r] + b[t]) mod m;
r = (Xp[u] + Xp[t]) & 7;
Xp[u] = (Xp[r] + Xp[s] - Xp[t]) mod 2^n;
u = (Xp[u] - Xp[r]) & 7;


That has not much in common with a simple "cipher cascade". It's more complex. The compiler can choose the basic functions and all function parameters from a set of parameters. I found it difficult to find a mathematical expression for this. My capabilities are clearly limited here. Help me out, please!
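(For reference, one possible runnable rendering of the six-line example above; the modulus, the tables a[] and b[], the seeds and the initial indices are arbitrary demonstration values, not taken from any PMC implementation.)

#include <stdio.h>
#include <stdint.h>

#define M 2147483647u                          /* modulus for the LCG-like lines   */

static uint64_t Xp[8] = {1,2,3,4,5,6,7,8};     /* 8-DWORD internal state, toy seed */
static uint64_t a[8]  = {16807,48271,69621,40692,40014,45991,207707,65539};
static uint64_t b[8]  = {7,11,13,17,19,23,29,31};

int main(void)
{
    unsigned r = 1, s = 0, t = 2, u = 3;       /* initial indices, demo values */

    for (int round = 0; round < 6; round++) {
        Xp[s] = (a[u] * Xp[r] + b[s]) % M;              /* line 1: modified LCG        */
        s = (Xp[s] + Xp[t]) & 7u;                       /* line 2: new index s         */
        Xp[r] = (a[s] * Xp[r] + b[t]) % M;              /* line 3: modified LCG        */
        r = (Xp[u] + Xp[t]) & 7u;                       /* line 4: new index r         */
        Xp[u] = (Xp[r] + Xp[s] - Xp[t]) & 0xFFFFFFFFu;  /* line 5: ACG-like, mod 2^32  */
        uint64_t out = Xp[u];                           /* one confusion word per pass */
        u = (Xp[u] - Xp[r]) & 7u;                       /* line 6: new index u         */
        printf("0x%08llX\n", (unsigned long long)out);
    }
    return 0;
}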

There's something else that is appealing: The compiler can generate a compiled PRNG that is huge. Makes sense with a huge internal state only, of course. What about these MLFGs (multiplicative Lagged Fibonacci RNGs) F(521,168,+) or even F(9689,4187,+)?



Although I'm convinced that the principle of cascading ciphers does not in the least apply to PMC, here's an expert's view on it:

Applied Cryptography, Second Edition: Protocols, Algorithms, and Source Code in C by Bruce Schneier, Chapter 15.7 Cascading Multiple Block Algorithms.

Your comment sounds pretty similar, but Mr. Schneier further points out that ".. it can't hurt and could very well increase security. Remember that the keys for each algorithm in the cascade must be independent. If Algorithm A has a 64-bit key and Algorithm B has a 128-bit key, then the resultant cascade must have a 192-bit key. If you don't use independent keys, then the pessimists are much more likely to be right."

Here I tend to trust Mr. Schneier. Pure respect, although never returned.


"If you feel comfortable with providing me with this detailed information, via NDA, feel free to contact me, as I am capable of providing consultation and a fresh insight into relative aspects of number theory."

Thank you for your offer! How can I contact you? You can easily find my e-mail address through www.pmc-ciphers.com. Your experience in cryptanalysis is definitely valuable.

C.B. Roellgen
JustinT
Trusted SF Member

Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil

PostPosted: Tue Dec 09, 2003 9:55 am    Post subject: Analysis.

The example of an additive generator that I provided is as follows:

X_i = (X_{i-57} + X_{i-7}) mod 2^n

My question of it being maximal length was as simple as identifying the polynomial as primitive, with a total of three coefficients. Therefore, it is maximal length. Keep in mind that if the primitive polynomial has more than three coefficients, extra measures must be taken to ensure that you achieve maximal length.

One important question may be: how do you cope with the issue of primitive polynomials? In other words, how do you know that each generator in use is providing maximal-length sequences? Supposing a generator doesn't provide such a sequence, is there a possibility that the output of the compiled generator would be the result of poorly constructed generators with short, repeating sequences? From what I can see, the entity must be static at all times, in this respect. You must prove that each unique state is as secure as any other given unique state.

I am familiar with Schneier's comments in Applied Cryptography. If you were to read further, you would also see that he agrees that multiple encryption schemes do not inherently increase the security margin. My analogy between the similarities of both cascading multiple block algorithms and compiled PRNGs is logical. I wasn't implying that they are alike in structure. I wasn't implying anything other than the fact that both scenarios involve the utilization of multiple algorithms/generators in a "one after another" approach. Be it cascading or compiled, either way, it's using multiple elements in succession. This is just a joining of concepts, on the surface. It has no relevance to the actual components of either scheme. (In this case, block ciphers and generators.)

In regards to analysis, we can discuss these facets of design, if you'd like. "These facets" being the concepts discussed here and any other aspects yet to be discussed. My schedule is tight, but I would be interested in performing some analysis, if at all possible. You are quite welcome for the offer. This is what I enjoy most. If in any event I can do it to the benefit of others, I'm all for it.

We can discuss consultation/analysis matters from that point. Request my phone number, if you prefer telephony as your choice of conversation medium.

Cheers, Mr. Roellgen.


Last edited by JustinT on Sun Sep 05, 2004 4:03 am; edited 3 times in total
broellgen
Just Arrived

Joined: 02 Dec 2003
Posts: 0
Location: Germany

PostPosted: Tue Dec 09, 2003 8:46 pm    Post subject: Polymorphic encryption

It is probably my mistake not to have provided sufficient motivation for the thesis that a maximum-length period of a primitive PRNG in a compiled PRNG is of minor importance, or is even irrelevant.

In a compiled PRNG the basic building blocks lose their inherent characteristics.

My example last time was:

Line 1: Xp[s] = (a[u]*Xp[r] + b[s]) mod m;
Line 2: s = (Xp[s] + Xp[t]) & 7;
Line 3: Xp[r] = (a[s]*Xp[r] + b[t]) mod m;
Line 4: r = (Xp[u] + Xp[t]) & 7;
Line 5: Xp[u] = (Xp[r] + Xp[s] - Xp[t]) mod 2^n;
Line 6: u = (Xp[u] - Xp[r]) & 7;

The primitive PRNGs in lines 1 and 3 resemble an LCG, while line 5 looks somewhat like an ACG. But none of the primitives operate independently of each other, and they are also not static. In line 1, the s-th element of the internal state is assigned a value that depends on a variable, a constant and the r-th element of the internal state. It is NOT as static as
X_n = (a*X_{n-1} + b) mod m
because r is not always "1" and s is not always "0".

In lines 2, 4 and 6 a new index is always computed to replace the one that was just used. The compiled PRNG is intended to continue with line 1 after line 6 has executed; the loop continues forever. The three primitive PRNGs are interdependent.
For the entity (lines 1 to 6) it is very interesting to know the period. Matters are aggravated by the crypto compiler: it is free to choose constants, binary operations, indices, primitive PRNG building blocks, and even the number of primitive PRNG building blocks.

We've even proposed a new operating mode for this new kind of PRNG: "Cipher Recompilation Mode". The idea behind this is to change the PRNG algorithm itself. Feeding back data to a static algorithm is certainly clever, but modifying or recompiling the whole algorithm may be cooler. Take it as a proposition from an amateur, if you like.
I'm more interested in finding solutions to describe and to analyze compiled PRNGs rather than finding ways to prove that the periods of static building blocks are maximum. As these building blocks interact, the theories behind each of them lose their relevance for long-term predictions like periodicity. It is obvious that this is not just a joining of concepts, on the surface.

It is an open discussion from my side. It would be great if everybody in the discussion about polymorphic encryption had a similar attitude.

I'm looking forward to an analysis from your side, JustinT!

C.B. Roellgen
JustinT
Trusted SF Member

Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil

PostPosted: Tue Dec 09, 2003 10:57 pm    Post subject: More information required.

That explains things a bit. Gracias. This being defined, what was your motivation for choosing these particular generators to operate in this particular succession?

I believe you did miss the entire point of me saying that "This is just a joining of concepts, on the surface. It has no relevance to the actual components of either scheme." You must read the context. I was only drawing a comparison between the act of utilizing multiple elements in succession.

Considering that comparison, in either case, there is no proof of inherent security margin increase.

It would have been just as relevant for me to say that shopping carts, wheelchairs, skateboards, and Aston Martins are similar because they all roll on wheels. All four are separate in structure, essentially, yet they share a like concept - wheels. Just the same, cascading block algorithms and compiled PRNGs share a like concept - multiple elements in succession. See what I'm saying?

As for your proposed mode of operation, can you provide any detailed outline in regards to this? Also, how does this new mode provide confidentiality, exactly? Any conclusion, in terms of security, will only be possible if you provide the guts of the compiled PRNG - how the "building blocks" interact - why they interact in the manner they do - the rationale for choosing this particular setup - how much more security is achieved by departing from the existing theories of the original generators, as opposed to your altered design. This is important stuff.

If you need tips on how to prepare and present this information, simply pick up a few books or read a few white papers. Any work of Schneier's is a good reference. Be it his white paper on Blowfish or his writings throughout Practical Cryptography, it is useful to take notes from his style.

Are you capable of performing any analysis of your own?

Adios for now.


Last edited by JustinT on Sun Sep 05, 2004 4:06 am; edited 1 time in total
broellgen
Just Arrived

Joined: 02 Dec 2003
Posts: 0
Location: Germany

PostPosted: Thu Dec 11, 2003 12:19 am    Post subject: Polymorphic encryption

Thank you for providing the links. Looks as if it's not so hard to propose new methodologies, but I guess it's a matter of who the author is.

On proving that a compiled PRNG offers higher security than primitive generators operating in succession: it's all too easy to imagine that a number of possible combinations get caught in a short cycle pretty quickly. There is absolutely no guarantee of a maximal period, and it is highly likely that the average period is far shorter than 2^size_of_internal_state.

But what's the minimum sequence length? Showing that a minimum exists could be of importance.
As the employed primitive PRNGs all feature their own characteristic period, which is rarely maximal, it is unlikely that there's a linear dependence that reduces the compiled PRNG to something very simple. If there is one, the worst that can happen is that only a fraction of the internal state is effectively used:
A "static" example: Xn = (Xn-32 - Xn-32 + aXn-1 + b) mod m

Functions using dynamic indices, i.e. indices that are taken from the internal state, have an advantage here: a non-linear function that (e.g.) permutes a (large) number of bits in the internal state changes the indices and inevitably disrupts the linear dependence that might have reduced the compiled PRNG before.

Compiled PRNGs with 16 or more primitives showed good results in randomness tests. Even with good test results, up to now all implementations break up linear dependencies with a permutation function.

Still, that doesn't sound like a proof of an inherent security margin increase. On the other hand, it should be noted that it is still possible to talk about something that is abstract and variable here. It's not easy to describe such a construct, and that could be advantageous against attempts to break it.
So far I have preferred to do analysis to the extent that I could handle on my own, because I have rarely met people who could do more than just repeat what is already known. Using entropy to describe this highly variable structure has yielded the best results so far.

To the proposed mode of operation: It can be used at any time, e.g. every second, when encrypting a new MPEG sequence, or simply whenever you want.
It's a (probably) new way to use history to some extent. How much more security can be achieved with it? That depends on the underlying theory, which is surely of primary interest.

C.B. Roellgen
JustinT
Trusted SF Member

Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil

PostPosted: Thu Dec 11, 2003 1:07 pm    Post subject: Following up.

You are correct. It truly is not hard to propose a new methodology. As I have mentioned numerous times, it is all in the suit that you wear - in one way or another. Keep in mind, even though it is dependent upon the author, there is a difference between etiquette and style.

Most good cryptographers have their own style, which includes a given approach to analysis, algorithm structure design preferences, and so on and so forth. Etiquette is something they all follow, regardless of their style. This is the same etiquette which encompasses successful tactics for reputable presentation and the criteria for achieving just that. By all means, have your own style, but remember to flex that style within the confinement of good etiquette.

From the get-go, I see the mention of the inclusion of non-linearity, which is a crucial move. It will prove to be rather vital when considering many secure components.

Stay on this track.

There's nothing wrong with tossing around ideas for a stream cipher-induced primitive. The only problem I have is with the attempt to market such a methodology, when the author's knowledge and confidence seem to be unstable at this point. It's not so much of a negative outlook, as one of genuine concern.

I just find it rather ignorant and mindless to market a system as secure, and charge a hefty price to boot, when there is no concrete, professional analysis or means of publicly available integrity. This is what stands out as being an amateur-ish attempt. Nothing personal; just an observation of poor etiquette.

If you want to establish trust, you must analyze, analyze, and analyze. Even if you are capable of performing analysis on your own, peer-review is the only way to gain the public's trust. It seems the majority will always confide in the opinion of themselves. If you are truly serious, as it appears by your marketing of this, that's when you hire a cryptanalyst, or a group of analysts, to inspect your work. With that in mind, we can continue to converse via e-mail, in regards to this matter, and discuss what may be discussed here, on the forum.

Cheers.


Last edited by JustinT on Sun Sep 05, 2004 4:09 am; edited 1 time in total
Page 1 of 2