Nothing new to comment on in most of the logic blogs I've been following, so I'm going to reference the Stanford Encyclopedia of Philosophy article on Many-valued logic.

I may also from time to time discuss other articles.

As I've already noted, the prevailing opinion among logicians is that the development of modal logic on the basis of 3-valued logic never succeeded, but I cannot find where the attempt is documented. I've examined the standard literature on many-valued logic, including the works of Ackermann, Bolc and Borowik, Malinowski, Rescher, and Rosser and Turquette, and none of them discuss it in any detail. I've also checked the references on modal logic, e.g. Hughes and Cresswell, and they don't discuss it either. There is a proof that the Lewis systems can't be reduced to three values, but my system isn't exactly one of them, so the conditions of the proof don't apply.

There is some connection between the strict conditional I use and the issues related to relevance logic. Some of the paradoxes of material and strict implication can be addressed.

There is also a connection with paraconsistent logic and issues related to that subject.

## Tuesday, May 31, 2005

## Saturday, May 28, 2005

### 3VL - Fuzzy logic and intuitionism

I can also connect this with fuzzy logic. While fuzzy logic treats individual truth values in the interval between true and false, three-valued logic distinguishes the endpoints True and False and treats the entire interval between them as the third value U. It thus acts as a bridge between classical logic and fuzzy logic.
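The bridging idea can be put in a few lines of code. This is my own toy illustration (the function name and the idea of collapsing the open interval are mine, not standard terminology):

```python
# A sketch of 3VL as a bridge between classical and fuzzy logic: a fuzzy
# degree of truth in [0, 1] collapses onto three values by keeping only
# the endpoints distinct.
T, U, F = "T", "U", "F"

def fuzzy_to_3vl(degree):
    """Map a fuzzy truth degree in [0, 1] to a three-valued truth value."""
    if degree == 1.0:
        return T          # definitely true
    if degree == 0.0:
        return F          # definitely false
    return U              # the whole open interval becomes the middle value

print(fuzzy_to_3vl(1.0), fuzzy_to_3vl(0.3), fuzzy_to_3vl(0.0))  # T U F
```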

I have also observed connections with intuitionistic logic, partly as a result of the comparison I did with other 3-valued systems. When I was at the University of Utah, I wrote a paper with my results and tried (unsuccessfully) to get one of the professors of logic there to critique it. However, one person there, John Halleck, took a look at it and included a 3-valued logic evaluator on his web site. Borrowing from his list of axioms of Heyting's intuitionistic PC, I have:

HA1: p=>(p&p) True

HA2: (p&q)=>(q&p) True

HA3: (p=>q)=>((p&r)=>(q&r)) True

HA4: ((p=>q)=>(q=>r))=>(p=>r) True

HA5: q=>(p=>q) contingent

HA6: (p&(p=>q))=>q True

HA7: p=>(p+q) True

HA8: (p+q)=>(q+p) True

HA9: ((p=>r)&(q=>r))=>((p+q)=>r) True

HA10: ~p=>(p=>q) contingent

HA11: ((p=>q)&(p=>~q))=>~p contingent

However, if we use ~<> instead of ~, the last two evaluate as true; and if in HA5 we use []q => (p => q) instead, the expression is true.

As in the case of modal logic, a slight adjustment makes this largely compatible with intuitionistic logic: the two systems are not identical, but they are quite similar.
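For anyone who wants to check these evaluations, here is a sketch of how I'd verify a few of them mechanically. The encoding T=1, U=1/2, F=0 and the function names are my assumptions (this is not Halleck's evaluator), with => taken as the strict conditional [](p -> q):

```python
from itertools import product

# Lukasiewicz tables, as I read them: T=1, U=1/2, F=0.
T, U, F = 1, 0.5, 0
VALS = (T, U, F)
NOT = lambda p: 1 - p
IMP = lambda p, q: min(1, 1 - p + q)   # Lukasiewicz conditional ->
NEC = lambda p: T if p == T else F     # [] (certain)
POS = lambda p: T if p != F else F     # <> (not impossible)
S   = lambda p, q: NEC(IMP(p, q))      # strict conditional => = [](p -> q)

def status(formula):
    vals = {formula(p, q) for p, q in product(VALS, repeat=2)}
    return "True" if vals == {T} else "contingent"

print("HA5  :", status(lambda p, q: S(q, S(p, q))))             # contingent
print("HA10 :", status(lambda p, q: S(NOT(p), S(p, q))))        # contingent
print("HA10':", status(lambda p, q: S(NOT(POS(p)), S(p, q))))   # True
print("HA11 :", status(lambda p, q:
      S(min(S(p, q), S(p, NOT(q))), NOT(p))))                   # contingent
print("HA11':", status(lambda p, q:
      S(min(S(p, q), S(p, NOT(POS(q)))), NOT(POS(p)))))         # True
```

The last two lines show the adjustment in action: replacing ~ with ~<> turns HA10 and HA11 from contingent into true.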

## Friday, May 27, 2005

### 3VL - Miscellaneous comments.

I've been following a number of other logic related blogs. Logblog mentioned a conference on exact philosophy. I'm inclined to think it an oxymoron.

On logicandlanguage, there is a reference to Kant's law: that you can't get necessity-style claims from contingent claims. This seems to correspond to my rules that []P => P and P => <>P, but not their converses. With caution, it's possible to interpret ?P as "P is contingent", but it's not possible with only 3 values to distinguish "contingently true" from "contingently false", which points to the need for a 4-valued logic. I've worked out the basic truth tables based on my interpretation of 3-valued logic, but I haven't otherwise done much with it. But it isn't the Lukasiewicz 4-valued logic. I've also been discussing the subject on Mathematics and Computation.

## Thursday, May 26, 2005

### 3VL - Modal logic

One of the reasons I like dealing with logic is that it provides a richer and more consistent formal language for discussing the truth values of propositions: certain (true), not necessary, possible, not possible (false), doubtful or equivocal (U), and two-valued or dichotomous.

I had begun with the idea of trying to reproduce a modal logic which would correspond to the similarities, and with the introduction of the strict Lukasiewicz conditional, I managed to reproduce versions of all the axioms of S5.

Later I found a term, "formal similarity", to describe the relationship, because there are substantial differences between this version and Lewis-type systems.

Lewis used a different notion of possibility than I do. His corresponds to the existence of a possibility, where mine takes it in a somewhat different sense of "not impossible". He associated "possible" with "non-self-contradictory" and would probably not include doubtful or equivocal statements as possible, where I would include them. I believe he would argue that "It is possible that I will eat a turkey on Thanksgiving and it is possible that I will not eat a turkey on Thanksgiving, but it is not possible that I will and won't eat a turkey on Thanksgiving." I would inquire about whether he is using "and" in a truth-functional sense or an additive sense and whether a half-eaten turkey would make "will and will not" plausible.

He also bases his logic strictly on two-valued classical logic, and assumes that all its laws hold without restriction (including the law of the excluded middle). I don't; I have to modify some of them to account for the existence and effect of doubtful propositions, which he doesn't acknowledge or account for.

He also used a different notion for his strict conditional. He defines "strictly implies" as "it is necessary that if P then Q", and "it is not possible for P and not Q". While I do find a use for "it is certain that if P then Q", the idea that "it is not possible for P and not Q" is too strict for my purposes. My version is weaker but still adequate.

The effect of these differences is a significant difference in substance between the two systems, but since I can reproduce analogues of all the Lewis axioms as theorems, I can reproduce corresponding analogues of any theorem of S5, and I can decide whether a conjecture is or is not a theorem, and if not, why not, far more easily. "Anything you can do, I can do better?"

## Wednesday, May 25, 2005

### 3VL versus 2VL

It's also characteristic of this 3VL that it reduces to the two valued case when all the propositions involved are definitely true or false, and the middle value excluded. But a logic that has all the theorems of 2VL, and only those, would be equivalent to it. Not all the theorems of 2VL hold, just as not all the theorems of ordinary arithmetic hold for the integers, and not all those of real numbers apply to the complex numbers. Many theorems of 2VL hold in 3VL as well; typically those involving algebraic-type manipulations of expressions. However, some of them must be modified or restricted; typically those involving rules of inference.

As I've mentioned, direct proof via modus ponens and transitive chains of inference doesn't work in standard Lukasiewicz logic; these rules need to be restricted to avoid dubious conditionals. Similarly, indirect proofs that rely on some form of reductio ad absurdum also need to be restricted. It is not sufficient to prove P by assuming ~P and then deriving a contradiction Q and ~Q, because this isn't necessarily a contradiction in 3VL. Indirect proof is still possible, but it requires stronger contradictions of the forms "possible and impossible" (<>P & ~<>P), "certain and not necessary" ([]P & ~[]P), or even "certain and impossible" ([]P & ~<>P).
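A quick mechanical check of the point, under my assumed encoding T=1, U=1/2, F=0 (the function names are my own sketch):

```python
# Q & ~Q is not a genuine contradiction here: at Q=U it takes the value U.
# The modal forms, by contrast, are genuinely contradictory (always F).
T, U, F = 1, 0.5, 0
VALS = (T, U, F)
NOT = lambda p: 1 - p
AND = lambda p, q: min(p, q)
NEC = lambda p: T if p == T else F   # [] (certain)
POS = lambda p: T if p != F else F   # <> (not impossible)

print(AND(U, NOT(U)))                              # 0.5 -- merely doubtful
print({AND(POS(p), NOT(POS(p))) for p in VALS})    # {0} -- always false
print({AND(NEC(p), NOT(NEC(p))) for p in VALS})    # {0} -- always false
print({AND(NEC(p), NOT(POS(p))) for p in VALS})    # {0} -- always false
```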

There are also rules that express ideas that aren't available in 2VL. "If certainly P, then P" ([]P => P) and "if P, then possibly P" (P => <>P) are both valid rules, but their converses are not.
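These two rules and their failed converses can be checked by brute force. A sketch under my assumed encoding T=1, U=1/2, F=0, with => as the strict conditional [](P -> Q):

```python
# []P => P and P => <>P hold for every value; the converses fail at P=U.
T, U, F = 1, 0.5, 0
VALS = (T, U, F)
IMP = lambda p, q: min(1, 1 - p + q)   # Lukasiewicz ->
NEC = lambda p: T if p == T else F     # []
POS = lambda p: T if p != F else F     # <>
S   = lambda p, q: NEC(IMP(p, q))      # strict =>

print(all(S(NEC(p), p) == T for p in VALS))   # True:  []P => P
print(all(S(p, POS(p)) == T for p in VALS))   # True:  P => <>P
print(all(S(p, NEC(p)) == T for p in VALS))   # False: converse fails at P=U
print(all(S(POS(p), p) == T for p in VALS))   # False: converse fails at P=U
```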

## Tuesday, May 24, 2005

### 3VL - doubtful inference

The impression I have gathered is that Lukasiewicz 3-valued logic hasn't had a great deal of respect in the logical community. This is entirely understandable. As it has been presented so far, there are chronic difficulties with interpretation, and you can't do the same kinds of things with it that you can with classical logic. Most of the serious work that has been done with it has used Lukasiewicz' "Polish notation", which is unfamiliar to most people who work with logic, and has been done in Europe rather than the United States.

But the small step of defining a strict conditional for it makes an incredible difference. It's like giving it the power pill that turns the lowly shoeshine boy into Underdog; like turning Bruce Banner into the Incredible Hulk; like turning a lightning bug into lightning, or moonlight into sunlight; like climbing over a mountain peak and seeing the Pacific Ocean on the other side. The difference in effectiveness is so huge that it's amazing no one has seen it before. But if it has been seen, I haven't found it in the literature.

It was like a dazzling flash of hindsight. Revelation followed revelation so swiftly, and in such interconnected fashion, that I no longer recall their exact sequence. But I can describe some of them.

One of the early ones is that I realized the reason why Lukasiewicz logic hadn't been workable before.

I was already aware that, using the original Lukasiewicz conditional, modus ponens fails as a tautology. But it fails in only one case, namely, when P is doubtful and Q is false. The truth table labels this case as doubtful.

But of course!! The Lukasiewicz conditional allows the expression of doubtful conditionals, and if modus ponens held without restriction, it would be possible to start with a doubtful premise and a doubtful conditional and advance to a false conclusion. But by forbidding dubious conditionals and assuring that it is definitely the case that if P, then Q, we repair the deficiency. The original truth table quite correctly labels a case where modus ponens can and should fail.

A few more examples, such as transitivity, yielded similar results, and the basis for a whole theory of doubtful inference falls out, naturally and easily.

Of course this must be so!! One of the purposes of logic, after all, is to assure that our rules of reasoning are correct and that we do not start from true premises and reason to false conclusions. And the strict conditional has just the kind of ordering property, on three values, that the ordinary material conditional has for two values: when P => Q is true, the conclusion Q is at least as true as the premise P.
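Here is a sketch of all three points in code, under my assumed encoding T=1, U=1/2, F=0: the lone failure of unrestricted modus ponens, its repair by the strict conditional, and the ordering property:

```python
from itertools import product

T, U, F = 1, 0.5, 0
VALS = (T, U, F)
AND = lambda p, q: min(p, q)
IMP = lambda p, q: min(1, 1 - p + q)   # plain Lukasiewicz ->
NEC = lambda p: T if p == T else F     # []
S   = lambda p, q: NEC(IMP(p, q))      # strict => = [](p -> q)

print(IMP(AND(U, IMP(U, F)), F))            # 0.5: MP is doubtful at P=U, Q=F
print(all(S(AND(p, S(p, q)), q) == T        # strict MP holds everywhere
          for p, q in product(VALS, repeat=2)))
print(all((S(p, q) == T) == (p <= q)        # ordering property of =>
          for p, q in product(VALS, repeat=2)))
```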

## Monday, May 23, 2005

### 3VL - Success

I need to go back to the Principle of the Excluded Middle. I say principle, because it's not a law here: it's a contingent statement which applies to some propositions but not others. It comes in two forms: bivalence (P v ~P) and noncontradiction ~(P & ~P). In classical logic, these are equivalent. Both of these and their equivalence have been frequently challenged by various logicians or philosophers, but few people take these challenges seriously. In this logic, the two formulations are equivalent, but the necessity of the excluded middle may be either asserted or denied. The assertion, "It is necessarily (certainly) the case that either P or not P", or equivalently "It is not possible for both P and not P", marks a dichotomously uncertain statement, one which must be either true or false: not neither and not both, although it may not be known which is actually the case. [](P v ~P) = ~<>(P & ~P) = !P. The denial, "It is not necessarily the case that either P or not P", or equivalently "It is possible for both P and not P", marks an equivocally uncertain statement, one with the middle truth value. ~[](P v ~P) = <>(P & ~P) = ?P.
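The identity chains [](P v ~P) = ~<>(P & ~P) = !P and ~[](P v ~P) = ?P can be verified over all three values. A sketch under my assumed encoding T=1, U=1/2, F=0 (the function names are mine):

```python
T, U, F = 1, 0.5, 0
VALS = (T, U, F)
NOT = lambda p: 1 - p
AND = lambda p, q: min(p, q)
OR  = lambda p, q: max(p, q)
NEC = lambda p: T if p == T else F         # []
POS = lambda p: T if p != F else F         # <>
BANG = lambda p: T if p in (T, F) else F   # !P: dichotomous
QUES = lambda p: T if p == U else F        # ?P: equivocal

# Each row shows P, [](P v ~P), ~<>(P & ~P), and !P agreeing.
for p in VALS:
    print(p, NEC(OR(p, NOT(p))), NOT(POS(AND(p, NOT(p)))), BANG(p))
print(all(NOT(NEC(OR(p, NOT(p)))) == QUES(p) for p in VALS))   # True
```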

When arguments are symbolized, one finds that the middle may consistently be either included or excluded, but the logic strictly enforces consistency once the choice is made. It is obviously inconsistent, and in fact results in a genuine contradiction, to allow the use of the middle truth value on the one hand and then reassert the excluded middle in one of its forms on the other. Yet the temptation to do so is both insidious and ubiquitous. More than one of the arguments that have been advanced against three-valued logic employs just such a move.

Perhaps more importantly, when I examined the various versions of 3VL, I found that several of them could be expressed in terms I had defined, which made mine a more general system. And then, while I was looking through their connectives searching for such a definition, I noticed a certain definition of equivalence and said, "Hey, wait a minute! That's not an equivalence, that's only a biconditional!" Mathematically, an equivalence relation is reflexive, symmetric, and transitive, and these "logical equivalences" were none of those. An equivalence should also express the idea that two formulas have the same truth value, and they didn't do that, either. Only one of them did (I believe it was Kleene's system, the one that had the conditional I had long ago discarded as inadequate). I had a use for that definition, and so I appropriated it.

At this point, I had a partially functional logic. I could establish commutativity, associativity, and the distributive laws for conjunction (&) and disjunction (v); I had double negation, De Morgan's laws, the interconversion of the modal functions, and the law of the contrapositive. I had my two types of uncertainty and their behavior with respect to the other operators. Now I could add substitution properties of equality (if P = Q, then ~P = ~Q) and rules of inference for equality (if P and P = Q, then Q; if P = Q and Q = R, then P = R), which was an advance.

It also gave me an interpretation for the Lukasiewicz conditional: I could define it as (~P v Q v (P = Q)), which was curious, but didn't strike me as particularly useful or profound. And then, after I don't remember how long, I noticed that I didn't need a separate definition for equivalence. I could get it by applying necessity (or certainty) to the Lukasiewicz biconditional I was already using: (P = Q) = [](P <-> Q).

And then, on the basis that what was good for the biconditional was good for the conditional, I decided to define a strict Lukasiewicz conditional, P => Q as [](P -> Q), removing the uncertainty.

Duh. Of course. Obviously. And the light came on, and suddenly I understood more than I had ever dreamed of, or than anyone will believe.
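Both definitions can be checked exhaustively. A sketch under my usual assumed encoding (T=1, U=1/2, F=0; function names mine):

```python
from itertools import product

T, U, F = 1, 0.5, 0
VALS = (T, U, F)
NOT = lambda p: 1 - p
IMP = lambda p, q: min(1, 1 - p + q)           # Lukasiewicz ->
BIC = lambda p, q: min(IMP(p, q), IMP(q, p))   # <->
NEC = lambda p: T if p == T else F             # []
EQ  = lambda p, q: T if p == q else F          # same truth value

# Equivalence as the necessitated biconditional: (P = Q) = [](P <-> Q).
print(all(EQ(p, q) == NEC(BIC(p, q))
          for p, q in product(VALS, repeat=2)))
# The curious reading of the plain conditional: P -> Q = ~P v Q v (P = Q).
print(all(IMP(p, q) == max(NOT(p), q, EQ(p, q))
          for p, q in product(VALS, repeat=2)))
```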

## Sunday, May 22, 2005

### 3VL types of uncertainty

I haven't given up on the other aspects of self-directed education (in case anyone is actually reading this), but in the last few weeks, I've found some logic-oriented blogs on the net, starting with Logblog. I'll start referring to those once I get done with my "confessions" here.

My attitude toward professional logicians is: if you can't join 'em, fight 'em. No one competent to understand what I am talking about has shown any interest. So, since I read that one of the components of a successful blog is to be controversial, and since I am clearly suffering delusions of enlightenment, I'm going to turn guerrilla and snipe at the Establishment.

At the point I had reached in my studies, I was not aware that I had almost independently reconstructed the 3-valued logic of Lukasiewicz, and I didn't understand the reasoning or philosophy behind the Lewis systems S1-S5 beyond what I could determine from the axioms.

I ventured onto Compuserve (the internet was just then beginning to grow) and asked there whether anyone had any comment. I was referred to Joe Celko, who was described as working on a 3-valued logic that dealt with missing values in databases, and who referred me in turn to a discussion going on in the pages of Database Programming & Design. I read the articles with great interest, and found myself sympathizing with both sides of the debate. The 3-valued approach the proponents were using was similar to what I was doing, but I agreed that a sound theory was lacking. I didn't have the answers either, but again, there were unanswered questions.

I went back to school to try to get my BS in Mathematics, and took a course in classical logic. The approach used in that course was natural deduction, and it basically covered propositional logic without going into predicate logic, but I took the opportunity to study that on my own. I took note of the fact that theorems of logic corresponded to truth-functionally true statements (that is, statements that evaluated as true on every assignment of truth values), their negations were truth-functionally false, and others were contingent, and I wondered whether the middle value I was using could be used to describe these.

I transferred to ASU for a semester, and later lived next to the University for a year, and took the opportunity to examine the literature a little more closely. I was dismayed to discover that my discoveries had indeed been anticipated, and I almost gave up. However, there were still unsettled questions. One of the comments I encountered was that "In spite of the promising combination of trivalence and modality, modal logic on this basis was never fully developed." I wanted to know why, and there was no further discussion, no references, no reasons why it didn't work. The other was the objection to interpretation. Lukasiewicz intended his truth value to represent the uncertainty of the future contingent, but an objector (no reference given) pointed to the "law of the excluded middle" and, apparently, there was no answer. When I worked on this, I decided that there were two different kinds of uncertainty involved. Using constants instead of variables or tables, !P expresses the idea "true or false, but it's not certain which" (!T=T, !U=F, !F=T), while ?P expresses the doubtfulness associated with the middle truth value (?T=F, ?U=T, ?F=F).

It's trivial to show that ~!P = ?P and ~?P = !P, but !~P = !P and ?~P = ?P. This meant, to me, that "uncertainty" is an ambiguous concept, with two formally similar but contradictory interpretations. I didn't fully work out the details of how these were associated with "and" and "or" at this point.
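The interaction of negation with ! and ? is easy to confirm by brute force, using the tables given above (encoding T=1, U=1/2, F=0 and function names are my assumptions):

```python
T, U, F = 1, 0.5, 0
VALS = (T, U, F)
NOT  = lambda p: 1 - p
BANG = lambda p: T if p in (T, F) else F   # !T=T, !U=F, !F=T
QUES = lambda p: T if p == U else F        # ?T=F, ?U=T, ?F=F

print(all(NOT(BANG(p)) == QUES(p) for p in VALS))   # ~!P = ?P
print(all(NOT(QUES(p)) == BANG(p) for p in VALS))   # ~?P = !P
print(all(BANG(NOT(p)) == BANG(p) for p in VALS))   # !~P = !P
print(all(QUES(NOT(p)) == QUES(p) for p in VALS))   # ?~P = ?P
```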

After this, I moved back to Utah, close enough to BYU that I could consult the literature there, and ventured onto the internet, this time on the newsgroup math.logic. One person noted that according to my tables, [](P v Q) = []P v []Q and <>P & <>Q = <>(P & Q), which aren't accepted in traditional (e.g. Lewis-type) modal logic, while someone else referred me to Bolc & Borowik's work on multi-valued logic. I labored over these for some time, trying to figure out how I could get [](P v Q) & ~([]P v []Q) and (<>P & <>Q) & ~<>(P & Q), but no matter how I transformed and tortured these statements, I got contradictions. Eventually, I decided that they were genuine contradictions. To simplify the problem, suppose that P and Q are mutually exclusive, so that Q = ~P; then, applying the various transformation rules, these boil down to trying to assert the excluded middle on one side and deny it on the other. No wonder there's a contradiction!
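The two identities noted on math.logic do in fact hold in these tables, as a short check shows (assumed encoding T=1, U=1/2, F=0; function names mine):

```python
from itertools import product

T, U, F = 1, 0.5, 0
VALS = (T, U, F)
NEC = lambda p: T if p == T else F   # []
POS = lambda p: T if p != F else F   # <>

# [](P v Q) = []P v []Q over all nine value pairs.
print(all(NEC(max(p, q)) == max(NEC(p), NEC(q))
          for p, q in product(VALS, repeat=2)))
# <>P & <>Q = <>(P & Q) over all nine value pairs.
print(all(min(POS(p), POS(q)) == POS(min(p, q))
          for p, q in product(VALS, repeat=2)))
```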

## Saturday, May 21, 2005

### 3VL. Not quite.

I recall doing this sometime between 1984 and 1988, after I moved from Utah to Phoenix. When I was working out my version of 3-valued logic, the truth tables for "And" (&) and "or" (v) were satisfactory, but in order to express relationships between propositions, I needed a conditional and biconditional. My first attempt gave (using the constants) T -> T = T; T -> U = U; T -> F = F; U -> T = T; U -> U = U; U -> F = U; F -> T = T; F -> U = T; F -> F = T.

I then began exploring truth tables for the tautologies of classical logic, for instance de Morgan's law ~(P & Q) <-> ~P v ~Q. I quickly found that there was a gaping hole in every truth table I could construct. For the most familiar logical laws, most entries turned out T, but when P and Q both had the value U, the proposition in question also had the value U. This was hardly tolerable. It should be obvious that "if P then P" should be a tautology, but even for something this simple, I got U. At this point, I think I decided to see what else had been done on the subject. I went out to the ASU library to find either Rosser and Turquette's "Many valued logic" or Restall's "Three valued logic" (I'm not sure which), and found that this had been done. It's still not clear to me whether this was Kleene's "weak" system or his "strong" system, but in either case, it didn't work, for the very reason I had already discovered. I found that there was another alternative, Lukasiewicz 3-valued logic, which had the same truth tables I had already worked out, but differed in only one place: U -> U = T (instead of U). I didn't have much time to study it then, but I took this one idea to work with later.
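That one-entry difference can be seen numerically. With T=2, U=1, F=0 (an encoding assumed here for convenience), the strong Kleene conditional can be written as max(2-p, q) and the Łukasiewicz conditional as min(2, 2-p+q). A short Python sketch shows why "if P then P" fails in one and holds in the other:

```python
T, U, F = 2, 1, 0
VALUES = (T, U, F)

def kleene_if(p, q): return max(2 - p, q)      # strong Kleene conditional
def luk_if(p, q):    return min(2, 2 - p + q)  # Lukasiewicz conditional

# The two tables agree everywhere except at U -> U.
diffs = [(p, q) for p in VALUES for q in VALUES if kleene_if(p, q) != luk_if(p, q)]
assert diffs == [(U, U)]

assert kleene_if(U, U) == U                    # "if P then P" comes out U under Kleene
assert all(luk_if(p, p) == T for p in VALUES)  # but is a tautology under Lukasiewicz
```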

Maybe a year or so later, I saw a friend working on truth tables for a different 3VL, and picked up my own studies again. This time, it occurred to me that I could use a function to distinguish "T or U" from "F", which I called "possible", and one to distinguish "T" from "U or F", which I called "certain" or "necessary". In combination with negation, one of the first and easiest results was the pair of formulas "Certainly not" = "not possible" and "Not necessarily" = "Possibly not".
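These two equivalences can be read straight off the tables. A minimal Python check, again assuming the encoding T=2, U=1, F=0:

```python
T, U, F = 2, 1, 0
VALUES = (T, U, F)

def neg(p): return 2 - p
def box(p): return T if p == T else F   # "certain"/"necessary": T vs. {U, F}
def dia(p): return T if p != F else F   # "possible": {T, U} vs. F

for p in VALUES:
    assert box(neg(p)) == neg(dia(p))   # "certainly not" = "not possible"
    assert neg(box(p)) == dia(neg(p))   # "not necessarily" = "possibly not"
```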

When I found this, I considered it too simple and elegant to ignore. Also, using the Lukasiewicz conditional and biconditional, I could prove a number of significant theorems of elementary propositional logic. However, one of the most important ones, modus ponens, didn't work. I consulted the source most easily available to me, the Encyclopedia Britannica's article on "History and Kinds of logic", and learned a little bit more about the Lewis systems of modal logic. I particularly noted that there were several varieties, that these systems were not truth-functional and could not be expressed with truth tables but had to be constructed on an axiomatic basis, and that the decision problem (deciding whether a given proposition was or was not a theorem) was particularly difficult. I then tried evaluating a number of the axioms according to the three-valued tables I had developed, and found that some of them worked, and some of them didn't. At this point, I wondered "Why do some of these work, but not others?".
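The failure of modus ponens as a tautology is easy to exhibit. Taking the Łukasiewicz conditional as min(2, 2-p+q) with T=2, U=1, F=0 (an assumed encoding), the formula ((P -> Q) & P) -> Q takes the value U when P=U and Q=F, though it never comes out outright false:

```python
T, U, F = 2, 1, 0
VALUES = (T, U, F)

def and_(p, q):   return min(p, q)
def luk_if(p, q): return min(2, 2 - p + q)   # Lukasiewicz conditional

def mp(p, q):
    # ((P -> Q) & P) -> Q
    return luk_if(and_(luk_if(p, q), p), q)

assert mp(U, F) == U   # the single non-T case: modus ponens is not a tautology
assert all(mp(p, q) >= U for p in VALUES for q in VALUES)   # but it is never F
```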

## Friday, May 20, 2005

### Three valued logic truth tables

I started with the interpretation of my third logical value as "true or false, but it's not certain which". For negation, I could extend the normal table for true and false, and reasoned: the negation of "true or false" would be "false or true". If I don't know whether or not a statement is true, I also don't know whether or not its negation is true. So, if P = U, then ~P = U.

For the table for "or", I decided that "if P is true, and Q is either true or false, the value of P or Q doesn't depend on the truth of Q", so if P = T and Q = U, P v Q = T, and likewise with P and Q interchanged. U or U should be U, and U or F should be U.

Similarly with the table for "and".
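These tables can be summarized compactly by treating the values as numbers: with T=2, U=1, F=0 (an encoding assumed here for convenience), negation is 2-p, "or" is max, and "and" is min. A quick Python check of the cases just discussed:

```python
T, U, F = 2, 1, 0

def neg(p):     return 2 - p      # ~U = U: uncertainty is preserved under negation
def or_(p, q):  return max(p, q)
def and_(p, q): return min(p, q)

assert neg(U) == U       # if P is uncertain, so is ~P
assert or_(T, U) == T    # a true disjunct settles "or", whatever Q turns out to be
assert or_(U, U) == U
assert or_(U, F) == U
assert and_(F, U) == F   # a false conjunct settles "and" the same way
assert and_(T, U) == U
```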

Actually, as I found out later, this turns out to be an unfortunate interpretation, but the truth tables still work. There is another interpretation that works better, though.

## Thursday, May 19, 2005

### Three valued logic beginnings

I've recently begun following some logic blogs in an attempt to find an outlet for my work in three-valued logic. I've been told that my results are probably not publishable, but I want to discuss them. I mentioned earlier that I don't clearly recall when I became interested in logic, but I can describe some of how I developed this topic.

Between 1981 and 1984, I spent many hours in the mathematics section of the BYU library, and in the process kept my knowledge of logic from dissolving into rust, as well as picked up a smattering of predicate logic. Prompted by reading about the early-20th-century "crisis" in mathematical foundations, I came to suspect that classical two-valued logic, although good for mathematics, might not be sufficient, especially when dealing with infinite sets. Mathematicians became concerned about the difference between "true" and "provable". I read about various paradoxes in set theory, and did a little bit of playing with them. Russell's paradox attracted my special attention, as did the discussion that even going to a three-valued logic wouldn't necessarily resolve it. I also encountered an article which attempted to analyze Anselm's proof of the existence of God using modal logic, which attracted my attention to that subject.

As I recall now, it was about this time that I started tinkering with what I called a "logic of indecision", using three values. Among my early attempts were truth tables for negation, disjunction (OR), and conjunction (AND), and I used the symbols T, U (or I), and F for the truth values.

### Electromagnetism

As part of my discussion on physics, it's useful to go into a little more detail on electromagnetism.

This includes three or four areas, at least from an elementary point of view. These are:

1) Electrostatics. This refers to the behavior of electricity and electric charge and related quantities, at rest.

2) Electric current. This refers to the behavior of electric charge in motion, and includes various kinds of electric circuits. Electrostatics and electric current are often combined under electricity.

3) Magnetism. This refers to magnets and magnetic fields, both from magnetic materials and from electric currents.

4) Optics. This refers to light and electromagnetic radiation in general.

These subjects depend heavily on mechanics. Many of the concepts of electromagnetism are most easily introduced from classical mechanics. There is a certain analogy with gravitation, and relativity was developed as an attempt to reconcile certain findings in electromagnetic theory with mechanics. From an advanced point of view, the two are closely connected. At small scales, electromagnetism is also closely tied to quantum mechanics. The role of thermodynamics is not greatly important except for thermal radiation. There are close connections with the structure of matter.

The other sciences of chemistry, astronomy, earth science, and biology mostly furnish examples of electromagnetism. Biographies of prominent and pioneering scientists, and various associations can be examined. There are numerous textbooks that include discussions of electromagnetism, most of them discussing mathematical relationships. The techniques of working with it vary according to the specific subject. Various instruments are required, since the quantities involved are mostly invisible. Practical application, the importance in various societies, and the history are also useful studies.

I'll save any particular recommendations for the study of this subject for some other time.

## Thursday, May 12, 2005

### Particle mechanics

I mentioned a while ago that I wanted to start discussing some of my thoughts on logic. I'm going to have to hold off on that for a little, while I discuss mechanics and the like.

One of the textbooks I've been looking at lately said something about mechanics being at the foundation of physics. To a large extent, I agree with this. Although by itself, particle mechanics doesn't seem to be a particularly large or useful area of knowledge, it's important to understand it in order to understand many other subjects.

I have five divisions of particle mechanics:

1) Description. Many introductions tend to skip over this, but there are significant differences between particles and other extended bodies.

2) Kinematics. This subject includes the description of motion, without regard to its cause.

3) Force and momentum. This includes Newton's laws of motion.

4) Work and energy.

5) Systems of particles.

## Monday, May 09, 2005

### Classical mechanics

I'm trying to tie this more directly into subjects I'm working on. I've done analysis of classical mechanics before, but my notes are buried and left behind in my last move, and it's been a while since I had convenient access to multiple sources. In a typical university library that uses the Library of Congress classification, classical mechanics can be found in several places. There are a few general works on the sciences that mention some basic principles. There are also more specialized texts on physics, and there are also more specialized texts on mechanics, each containing more general ideas. I was trying to reconcile a couple of these, and ran into problems in the order and style of presentation.

In the physics text, the approach started with the description of motion, (kinematics), and then went to Newton's laws, then to rigid bodies, and then other subjects. The mechanics text started with statics and equilibrium of forces, then went to the description of motion. The physical laws are the same with either approach, but I wanted to find one that would unify them. After much head-scratching and rearranging of topics, I decided upon a scheme that worked for me.

1) Particle mechanics deals with the laws of motion as they apply to particles with no (or negligible) parts, rotation or internal motion.

2) Rigid bodies deal with these same laws as applied to bodies that have shape and parts, and adds the topic of rotation.

3) Deformable body mechanics deals with bodies that can be deformed, and includes elastic bodies, fluids, and wave mechanics.

I noticed in one of the science texts a discussion of religion and science, and appreciated the comment that they take different approaches. Science deals with the "how" of nature, while religion deals with the "why" and with the building of communities. The text mentioned that certain religious fundamentalists attempt to use scripture to pronounce on questions of science. The comment would have been more balanced if it had also mentioned that there are zealots of a scientific turn of mind, with no appreciation of religion, who go the other way and try to use the known laws of science to "disprove" religion.

## Saturday, May 07, 2005

### Astronomy

Most of astronomy does not seem to be as practically useful as other areas of knowledge, but it still belongs. I've been drawn to it because of my interest in science fiction.

I have four major divisions of it:

1) Solar system astronomy. This includes studies of the sun, planets, asteroids, comets, and other things that exist and occur in our solar system.

2) Stellar astronomy. This includes studies of stars and star systems, the life cycles of stars, and the Milky Way Galaxy.

3) Galactic astronomy. This includes studies of other galaxies and their composition, classification, and history.

4) Cosmology. This includes theoretical analysis and speculations on the structure, origin, and fate of the universe.

This depends heavily on the various areas of physics, to the point that astrophysics is an important combined field. Mechanics, including not only classical mechanics but gravitation, relativity, and quantum mechanics are all used in astronomy. Electromagnetism, thermodynamics, and the structure of matter are also used heavily.

Chemistry does not seem to be used as heavily, but chemical substances, changes, and systems are discussed in various areas. Areas of earth science are also useful, but there do not seem to be many applications of biology.

The application of other areas, such as psychology and biography; the role of astronomical societies; language, literature, and philosophy; astronomy as a profession; instrumentation; the role of education and religion; national traditions; and the history of astronomy can all be traced using the same patterns as for science in general and the other sciences mentioned.

This is an area in which we are collectively less informed than earlier generations. The majority of people today, particularly in the US, live in cities which are lit at night by electricity, which also lights up the sky to the extent that all but the brightest stars are invisible. Venus is regularly taken for a UFO. If I were to suggest anything, it would be to find a rural area on some clear night, and spend a few hours just looking. For those who are more ambitious and would like to find their way around the heavens, there are numerous guides available.

## Friday, May 06, 2005

### Mechanics

For some reason, I keep going back to mechanics as a starting point for my various studies. At this point, I am talking about mechanics as a division of physics.

There are several subjects that can be included in mechanics. Originally, this term had to do with machines, as in levers and screws, but with the addition and emphasis of other subjects, the meaning began to change. I recognize four principal areas.

1) Classical mechanics. This includes studies of particles, rigid bodies, and deformable bodies, including gases and fluids. This still is the most common and useful area of study. Before the 20th century, nearly all mechanics was classical mechanics.

2) Gravitation. This is usually included in classical mechanics, but it doesn't neatly fit within another organization of the subject, so I have separated it out.

3) Relativity. This deals with corrections which are needed to classical mechanics in the cases of very high speeds and strong gravitational fields. Since these are outside the realm of everyday experience and require some advanced mathematics to fully comprehend, I will not develop this in great detail.

4) Quantum mechanics. This deals with the corrections that are needed to classical mechanics in the case of atomic-sized and smaller particles. This is more applicable, especially in chemistry, but involves even more advanced mathematics, and I will set aside discussion of this subject as well.

There are some connections to electromagnetism and the structure of matter, and a few connections to thermodynamics, but for the most part, mechanics is considered more fundamental than these other subjects. Chemistry is useful when the particular properties of specific substances are important. Astronomy, earth science, and biology are useful for examples and illustrations. There are comparatively few people who specialize in mechanics as an area of theoretical study. It is somewhat difficult to find internet resources at an intermediate level of study, and the subject is most easily approached by examining general physics textbooks. It requires a fairly high level of mathematics, and is more closely associated with education than any of the other social institutions. The communities and peoples involved and the history of mechanics are also useful, though it can be difficult to find an introductory account of that history.

My interest in the subject is more mathematical and theoretical than experimental, although what resources to suggest depends heavily on how much you already know.

## Thursday, May 05, 2005

### Chemistry

I have a chronic problem with selecting topics to study and develop. If I have a scheme, it becomes too rigid, but if I don't, I get lost among the possibilities. For now, I'm trying a scheme, but if it isn't satisfactory, I'll try something else.

I don't much like the traditional subdivisions of chemistry. It used to be that they were: General chemistry, analytic chemistry, inorganic chemistry, organic chemistry, and physical chemistry, and were studied in about that order. I've been trying a different approach:

1) Substances. The physical and chemical properties of elements, compounds, and mixtures.

2) Chemical change. Chemical equations and relationships of substances, thermodynamics and energy, rates and mechanisms, and types of change.

3) Chemical systems. One-phase, two-phase, and multiphase systems. In this context, a phase refers to one of the states of matter: solid, liquid, or gas.

I'm not sure whether this approach really works yet or not; I'm still investigating it.

This depends heavily on physics. Mechanics, especially parts of classical mechanics, is often useful. Electricity, magnetism, and optics are also important in chemistry. I've noticed some difference between the chemical approach to thermodynamics and the physical approach. Much of the structure of matter belongs as much to chemistry as to physics. There are various uses for astronomy, earth science, and biology in the study of chemistry. Discussions of the human body and psychology are somewhat useful, and I can sketch out how other areas are related to chemistry by identifying particular chemists. Like other areas of science, chemistry is largely a social endeavor. The chemical literature, mathematics, and measurement; chemistry as an occupation; chemical education; national approaches to chemistry; and the history of chemistry are also subjects of interest.

I'm not mentioning any experiments or activities in chemistry: there are others more qualified to do so. What I can do is discuss some of the theoretical aspects of the subject.

## Tuesday, May 03, 2005

### Physics

Since I am personally more interested in science than in many other subjects, I'm going to shift from history to the opposite end of the range of subjects.

I consider physics to be the study of the laws or regularities of nature in general. As part of my studies in the past, I've devoted some hours to finding a uniform approach to it, and I have several categories:

1) Mechanics. This includes laws of motion, force, and energy. Subtopics include gravitation, relativistic mechanics, and quantum mechanics. The first two are most applicable to people other than scientists.

2) Electromagnetism. This includes electricity, magnetism, and optics, which covers light. These phenomena are less visible, but are regularly employed in our society.

3) Thermodynamics. This includes the study of temperature, heat and related quantities.

4) Structure of matter. This includes the study of subatomic particles, atomic and nuclear physics, molecular physics, and the forms of matter we usually deal with: solids, liquids, and gases.

These are fundamental and more basic than the other sciences, although there is some overlap with chemistry. One of my own projects involves identifying prominent physicists and societies of physicists.

I've recently taken a look at the classification schedules in the Library of Congress to reorganize the subjects of books according to my own preferences, and I've noticed that there is a great deal of popular literature that attempts to discuss very advanced concepts of physics without the use of mathematics. I find this situation very unsatisfactory: I like the mathematics, and there are some areas I would very much like to know about but still lack enough mathematical background for. Physics depends heavily on measurement, as well as on other areas of applied science.

I have something of a distaste for the philosophy of physics, although there may be a few useful ideas in it. One of my pet peeves is the way that physics has become such a specialized occupation, requiring so much advanced training even to understand the ideas being discussed. Physics is generally considered difficult and serious, and there isn't very much about it that is recreational or fun, although a few creative educators have found some. I don't have any of the apparatus and equipment needed to do research in physics, although some of it is accessible.

I am interested in physics education and improving it, more than in commercial research, and I would prefer that the government not be the primary source of funding. Physics and religion deal with different subjects. To oversimplify, physics concentrates on what can be seen and observed by anyone, while religion deals with what is unseen, at least by most.

Physics is primarily a product of western civilization, but it is hard to get more specific without dealing more with its history. Although it has roots in ancient history, its recognized ancestry comes largely from the Greek philosophers of the early classical period, and most of its development has come since the 16th century.
